The Peter Attia Drive - March 25, 2024


#295 ‒ Roadway death and injury: why everyone should care and what you can do to reduce risk | Mark Rosekind, Ph.D.


Episode Stats

Length

2 hours and 28 minutes

Words per Minute

191.67763

Word Count

28,507

Sentence Count

1,642

Misogynist Sentences

11

Hate Speech Sentences

12
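The words-per-minute figure above can be sanity-checked against the word count and run time. A minimal sketch, assuming the stated "2 hours and 28 minutes" is a rounded duration:

```python
# Cross-check the episode stats: words per minute = word count / duration.
word_count = 28_507
duration_min = 2 * 60 + 28            # 148 min, from the rounded episode length

wpm = word_count / duration_min       # ~192.6 with the rounded duration
implied_min = word_count / 191.67763  # duration implied by the stated WPM

print(f"{wpm:.1f} WPM; stated WPM implies a {implied_min:.1f}-minute episode")
```

The stated figure implies a run time of about 148.7 minutes, i.e. the listed length is rounded down by roughly 43 seconds.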


Summary

Mark Rosekind is a safety, sleep, fatigue, and policy leader with more than 30 years of experience enacting strategic, practical, and effective data-based solutions that enhance safety and health in complex environments. He was the Chief Safety Innovation Officer at Zoox, an Amazon-owned autonomous mobility company, from 2017 to 2022, and was also appointed distinguished policy scholar in the Department of Health Policy and Management at the Johns Hopkins Bloomberg School of Public Health between 2020 and 2022. He was appointed by President Obama as the 15th administrator of the National Highway Traffic Safety Administration; before that, he served as the 40th member of the National Transportation Safety Board. He earned his bachelor's degree from Stanford, earned his master's and doctorate from Yale University, and completed postdoctoral training at Brown University Medical School.


Transcript

00:00:00.000 Hey, everyone. Welcome to the Drive podcast. I'm your host, Peter Attia. This podcast,
00:00:16.540 my website, and my weekly newsletter all focus on the goal of translating the science of longevity
00:00:21.520 into something accessible for everyone. Our goal is to provide the best content in health and
00:00:26.740 wellness, and we've established a great team of analysts to make this happen. It is extremely
00:00:31.660 important to me to provide all of this content without relying on paid ads. To do this, our work
00:00:36.960 is made entirely possible by our members, and in return, we offer exclusive member-only content
00:00:42.700 and benefits above and beyond what is available for free. If you want to take your knowledge of
00:00:47.940 this space to the next level, it's our goal to ensure members get back much more than the price
00:00:53.200 of a subscription. If you want to learn more about the benefits of our premium membership,
00:00:58.040 head over to peterattiamd.com forward slash subscribe. My guest this week is Mark Rosekind.
00:01:06.500 Mark is a safety, sleep, fatigue, and policy leader with more than 30 years of experience
00:01:11.320 enacting strategic, practical, and effective data-based solutions that enhance safety and
00:01:16.320 health in complex environments. He was the chief safety innovation officer at Zoox,
00:01:20.940 an Amazon-owned autonomous mobility company from 2017 to 2022. He was also appointed the distinguished
00:01:27.920 policy scholar in the Department of Health Policy and Management at the Johns Hopkins Bloomberg School
00:01:32.860 of Public Health between 2020 and 2022. Previously, he was appointed by President Obama as the 15th
00:01:39.540 administrator of the National Highway Traffic Safety Administration. Before becoming the NHTSA
00:01:45.560 administrator, he was appointed by President Obama and served as the 40th member
00:01:49.720 of the National Transportation Safety Board, the NTSB, which you no doubt will recognize is the
00:01:56.120 organization that is always investigating plane crashes, train crashes, and other disasters.
00:02:02.240 Additionally, Mark previously directed the Fatigue Countermeasures Program at NASA Ames Research Center
00:02:08.140 and was the chief of aviation operations branch in the flight management and human factors division.
00:02:13.640 He earned his bachelor's degree from Stanford and his master's and doctorate from Yale University and
00:02:20.380 completed postdoctoral training at Brown University Medical School. In looking at what we internally call
00:02:27.740 the death bars, which you've likely heard me talk about, which we use to identify what are the threats
00:02:35.260 to our lifespan, you may recall that while the four horsemen generally get the lion's share of our
00:02:42.420 attention, there's always this pesky fifth cause of death, which is deaths due to accidents. And while we
00:02:50.340 typically speak about one subset of those, which are the accidents that are due to falls because they
00:02:56.500 disproportionately increase later in life, there's one cause that seems relatively consistent throughout
00:03:04.420 life. And that is accidental deaths due to transport. And so I wanted to do a deep dive into this topic
00:03:12.900 because frankly, when I consider my own mortality over the next decade, this occupies a disproportionate
00:03:20.820 share of what might account for my relative risk of death. And I know that for many of you listening,
00:03:26.520 that is also true. In this conversation, we obviously talk about Mark's background, which is quite unique
00:03:32.400 and how it led him to be an expert in this. We look at the statistics of car crashes and how that's
00:03:36.640 changed over time. We talk about the groups that are most at risk and the locations where most of
00:03:42.460 these incidents take place. We then look at various things that can increase the risk, such as obviously
00:03:48.180 being on your phone and being distracted, the role of alcohol and cannabis, sleep deprivation and
00:03:52.700 drowsiness, speed and weather. We talk about autonomous vehicles and new safety technology. And we talk about
00:03:59.720 what pedestrians need to be aware of and what resources are available for people to learn more,
00:04:04.600 especially parents. And this is something I'm thinking a lot about as my daughter is on the cusp
00:04:09.300 of beginning to drive. So without further delay, please enjoy my conversation with Mark Rosekind.
00:04:20.180 Well, Mark, thank you so much for joining me here today. This is certainly a topic that I don't think
00:04:26.060 gets enough attention given the consequences of it. It's also a topic that I think maybe people don't
00:04:33.320 necessarily appreciate the frequency of such interactions. And I guess one of the things I'm
00:04:38.140 hoping to understand today is how much of the nature of what we're going to discuss today is under our
00:04:44.480 control. I know that as I think about the things that are a threat to our lifespan, a number of these
00:04:50.560 things are kind of out of our control, but many of them are actually in our control. Certain diseases,
00:04:55.800 for example, like cardiovascular disease are almost entirely within our control and genes play a
00:05:00.420 role, but your ability to sort of go above and beyond the genetic hand you're dealt is significant.
00:05:06.780 But here, when it comes to accidental deaths, and as a subset of that, the role that fatigue plays in
00:05:12.120 that, I want to really explore this. But I think before we do, I think it's important that people really
00:05:17.000 get a sense of you and your background. When we decided we wanted to spend time on this topic,
00:05:23.520 it felt almost too good to be true that we discovered you and your work. The way things
00:05:28.980 sort of work, Mark, is basically, we sit around and brainstorm ideas that we want to cover. And
00:05:34.160 then we go out and look for an expert. That's 99% of our podcasts work in that way. I remember when
00:05:40.280 Nick came to me and said, okay, you know, this is what we found. And I was like, wow, that seems
00:05:45.120 amazing. We're going to really answer a lot of questions I've always had. So tell us a little bit
00:05:49.600 about your background. Because the first thing that stood out to me was how at a very young age,
00:05:54.420 you lost your father. Yeah, let's start there, right? Long time ago, but it still is challenging
00:06:00.040 to talk about, actually. Let me just begin by thanking you for making this topic part of your
00:06:05.960 discussion in your podcast. You already said it. This is so prevalent in everybody's life,
00:06:11.440 just being on the road. Everyone's a pedestrian at some point. We're all in vehicles moving around.
00:06:16.060 And yet, we have come to accept the carnage in ways that should just be unacceptable in our society.
00:06:22.800 So without bringing more focus, including what you can control, like, what should I be doing to be
00:06:27.900 safer versus things that I can't? Just a critical conversation. I really appreciate you taking the
00:06:33.560 time to do this. And I paused for a moment because it actually was not until my Senate confirmation
00:06:39.040 hearing to be administrator of NHTSA that I really talked publicly about this very much.
00:06:43.140 But my father was a San Francisco motorcycle policeman. He was chasing a traffic violator and
00:06:49.200 somebody ran a red, hit him, and he was killed in the line of duty. Actually, just over my shoulder,
00:06:54.700 that's a shadow box with his badge and purple heart. He was 30 years old. My brother and I,
00:06:59.760 I was three and a half. My brother was two, raised by a single parent. I'm okay talking about this to
00:07:04.840 start with because it just points out this is personal for me. And I've told people, it's like,
00:07:09.660 I don't wear that as a badge on my shoulder every time we have the conversation, but it is clearly
00:07:14.840 foundational to sort of what has put me on a lifelong path of pursuing safety and how to make
00:07:20.300 people's lives safer. And one of the things I often talk about, I'm sure we'll get to it, is
00:07:25.040 if you want to know how many lives were lost on our roads, make sure you know the exact number
00:07:28.980 because every one of those numbers is a father or mother or sister or brother or some relative or
00:07:35.920 one of your neighbors, etc. Those are real people and we cannot bring them back. And I think that
00:07:41.960 too often when we start talking about all the statistics, we walk right by the human part of
00:07:47.320 this and that gets lost. And really, that's where we should always start. Yeah, there's a quote that I'm
00:07:53.280 not remembering exactly, but it speaks to the idea that a million people is a statistic, but a small
00:07:59.380 number of people is a story. And I agree with that completely. I mean, I think we can get very numb to
00:08:04.760 what those numbers mean. I don't know if you've seen it, but there's a series on Netflix right now
00:08:09.580 that takes a lot of old footage from World War II and basically does some remarkable technology
00:08:17.840 application where it puts it back into color and makes it really remarkable. And so it's a six-part
00:08:23.540 series on World War II that is, like I think anybody who's watched it has shared my reaction to it,
00:08:29.680 which is it's stunning. But what's hard to fathom as you go through this is the loss of life. You
00:08:36.740 hear about 60 million or 80 million, I don't even remember, people lost their lives globally.
00:08:42.240 Obviously, we're aware of 6 million people being exterminated in concentration camps. And yet I
00:08:48.580 realized as I came to the end of that, I don't even know what that means because I've never seen
00:08:52.640 6 million people with my eyes. I've never seen 1 million people with my eyes. So it's very difficult
00:09:00.720 to explain those things. And what I think this series does very well is what you've done, which is
00:09:07.700 you get a few stories. You get a few stories that are very representative of the horrific nature of
00:09:15.960 what happened. And then at least you have some semblance of saying, wow, now imagine multiplying that
00:09:21.000 by a thousand or a million. And the numbers numb you. As you just said, it's like, I don't know what
00:09:26.720 that really means. So it numbs you. Very often in talks, I would actually start with what we call in
00:09:32.320 the business bent metal. And so, you know, having been at the NTSB, I'll literally take photos from
00:09:37.640 investigations minus the humans. But it's like, this was Dawn at 20 years old. She not only lost her life,
00:09:45.540 but the people in this minivan you see, four out of the five died. And the only one who didn't was a
00:09:50.620 child in a car seat. I mean, that makes it real for people. And what you hope is people translate
00:09:55.940 that to this could be you or a partner or your kids or your neighbor, you know, people that matter.
00:10:03.080 And there's this huge gap between those numbers you're talking about, which most people have no
00:10:07.240 concept. And the fact that when somebody actually in your circle loses their life in some kind of
00:10:12.820 crash, that makes it very personal. And again, you can't bring them back. There's no coming back from
00:10:17.820 that. Well, I definitely want to get into these statistics, both in terms of people in cars,
00:10:24.360 people as pedestrians, cyclists, et cetera. But before we do, I do want to come back a little
00:10:27.700 bit to your story, because I want people to kind of understand your training and what took you to
00:10:32.820 Yale. You did your PhD there, what you studied and how that ultimately kind of led to what you're
00:10:36.180 doing now. So tell us as much or as little as you think is necessary for listeners to kind of get a
00:10:41.120 sense of the trajectory you've taken to where you are today.
00:10:44.160 I'll give you the arc and it's more of a zigzag. Whatever you want to go in depth, let me know,
00:10:49.460 because as always, there are interesting stories along the way. I'm trained as a scientist. I was
00:10:54.840 actually a pre-med at Stanford and had the incredibly good fortune to take a course called
00:10:59.480 Sleep and Dreams when I was a sophomore. And it was taught by William Dement, MD, PhD, a professor in
00:11:06.420 the medical school. And Dr. Dement was part of the team that discovered REM sleep. And some believe he
00:11:12.160 was actually the guy who coined the term REM sleep. And as you know, even though we all as a human
00:11:17.060 race have been sleeping since crawling out of the primordial slime, we only knew about non-REM and
00:11:22.300 REM in the mid-50s. And so Dr. Dement came to Stanford and started really one of the first sleep
00:11:27.720 centers and was teaching this undergraduate course. And it was just fascinating. He was this passionate,
00:11:33.640 charismatic professor who was engaging. And here's this medical school professor teaching an
00:11:38.040 undergraduate course, which by the way, at the time, the two most popular courses were Sleep and
00:11:42.200 Dreams and Human Sex, another medical school professor, you know, it's kind of like for
00:11:46.100 undergraduates, what else is there? It was just fascinating. But what was really brilliant about
00:11:50.460 what Bill did is he actually offered a couple of courses. You could take a course and become a TA
00:11:55.300 for Sleep and Dreams, or you could take another course and learn where to put electrodes and how to
00:11:59.440 score sleep and actually get involved in research. So the summer of my sophomore year,
00:12:03.760 rising junior, I actually got signed up to be a research assistant, staying up all night in a
00:12:09.560 laboratory. Think about it, that as an undergraduate was an incredible experience. You'll decide if we
00:12:14.760 talk about this later, but everything got canceled that summer except one project, which was studying
00:12:18.920 the effects of the waterbed surface on sleep. It was the '70s, and I actually met my wife during that
00:12:24.320 study. It's worked out quite nicely. But it was a way to get involved in research as an undergraduate
00:12:28.920 that you don't usually get the opportunity. So that kind of changed everything. And frankly,
00:12:34.080 when I graduated, Dr. Dement made an offer for me to stay and run research projects. And the reason
00:12:39.340 I bring that up is because if you knew where the electrodes went, and this is a K-complex,
00:12:43.140 a sleep spindle, here's how you score sleep, you could run projects. And so I did that for a few
00:12:47.600 years, and I tell people that was like doing a postdoc before going to graduate school.
00:12:51.720 My direct supervisor was Dr. Dement. Okay, so I did three years of doing it. And that's why when I was
00:12:57.640 ready, I actually applied to medical school, but changed because at the time, there was no sleep
00:13:02.200 specialty. There was no sleep medicine fellowship. There was no training in any of that. Dr. Dement
00:13:07.920 was pushing to make that real. And so I decided if I wanted to spend a career looking at sleep,
00:13:13.060 it was go get a PhD. And so as you mentioned, I ended up at Yale, great academic research program,
00:13:18.820 but could also get clinical training. So if I wanted to deal with humans and projects and stuff,
00:13:23.680 I could get my training in clinical psychology, but still do research.
00:13:26.760 The interesting spin actually after that is, and I appreciate the chance to talk a little more
00:13:32.520 about this than just dates. But when I was finishing my PhD, somebody I knew from Stanford,
00:13:38.720 Mary Carskadon, who had been at Stanford, and she and Bill actually created what is now the gold
00:13:44.280 standard for objectively measuring sleepiness called the multiple sleep latency test.
00:13:48.860 Nobody ever thought Mary would leave Stanford. And she was about to start a new position at the
00:13:53.440 Brown Medical School, an assistant professor there. She had one technician ready to go. And she called
00:13:58.320 me up and said, Mark, would you like to come and do a postdoc? So there were three of us getting her
00:14:02.460 program started at Brown. It was an incredible experience. The plan was to stay, go on the faculty,
00:14:08.160 et cetera. And instead, I ended up going back and working for Bill, running a human research program
00:14:13.660 for a while. So, you know, I'm hardcore academic sleep. But part of my job was to get new projects
00:14:19.800 going. And so I got engaged with NASA Ames Group, which is one of the NASA facilities out here in
00:14:25.140 Mountain View, California. And they were doing fatigue jet lag research, but didn't have many
00:14:30.160 actual sleep people helping them do that. Some chronobiology circadian people. But it was fascinating
00:14:35.000 because they were doing a study that required recording EEG in a cockpit. And they weren't really sure how to do
00:14:40.400 that. So part of my job was to actually help them problem solve that so we could record pilots' brain
00:14:45.360 and eye movement activity in an ongoing way during flights. Fascinating. And I mentioned that because
00:14:51.040 that was a transition really out of the very specific academic kind of environment to NASA.
00:14:56.520 And so I ended up being recruited to work there and directed the program at the NASA Ames Fatigue
00:15:01.860 Countermeasures Program for seven years. And it was fantastic. And I think it emphasizes what has become
00:15:07.320 sort of for me, you know, again, not just personal, but a clear focus of my career. And that is
00:15:12.860 the application of the science into real world application. And so that's really been a force
00:15:18.640 for me throughout my career, hence the safety emphasis. Great sleep science, but how do you use that
00:15:24.220 to help people every day? Whether that's driving a car or flying a space shuttle, what do you do to make
00:15:29.220 that better? I did that for seven years. That was commercial and military pilots, astronauts,
00:15:35.040 controllers at Johnson Space Center, et cetera. Fantastic, incredible work. Love to talk about
00:15:40.460 that too, I hope. And then I started my own company, which broadened it from aerospace to basically
00:15:45.120 everybody. So when I had my own company, we worked with folks in all modes of transportation, basically
00:15:50.340 all over the world in healthcare, energy, military operations, you name it, was fantastic.
00:15:56.660 The next part that was so interesting, part of the zag, was I had the opportunity to become a board
00:16:02.460 member at the National Transportation Safety Board. So there are five members that are there.
00:16:06.440 These are the kind of positions where they call you, you don't actually submit a resume.
00:16:10.080 I had done some work at NASA helping them identify fatigue in a DC-8 crash in Guantanamo Bay,
00:16:15.140 before anybody knew that there was a naval air station there, a DC-8 crash. And the NASA group,
00:16:20.100 we helped them at the NTSB define the methodology to investigate fatigue. And they ended up identifying
00:16:25.820 fatigue as the probable cause in that particular crash. And that has become the methodology that they use.
00:16:32.460 So 10 years later, it was amazing to basically get a phone call and say,
00:16:36.700 would you like to be considered for a board member position? So I was a board member at the NTSB for
00:16:41.540 five years, launched on seven crashes, and sat through about 50 investigations that we voted on.
00:16:47.700 And I was ready to stay for a second five-year term when I got a call to become the head of NHTSA,
00:16:52.920 which is the National Highway Traffic Safety Administration. And that's the administrator
00:16:56.640 of the organization within the Department of Transportation that is responsible
00:17:01.500 for all car and vehicle safety, regulation, and enforcement. And I was there for just a couple
00:17:07.660 of years because that one is tied to the president, basically. Unbelievable experience. Again, if you
00:17:12.320 think about what we just talked about, which is how do you take the science and data and actually
00:17:16.320 apply it to make things safer, better, was an incredible experience there. And when I left Washington,
00:17:21.120 I actually came back and worked as the chief safety innovation officer at an autonomous vehicle
00:17:25.000 company, which is like a way at the other end of the continuum that's been fascinating as well.
00:17:30.700 Wow. It's interesting because there's this marriage between two things that you're talking about,
00:17:36.980 right? Which is one could have stayed entirely within the world of sleep and done obviously very
00:17:42.900 interesting work. And you are fortunate to have been at Stanford, which in many ways certainly was the
00:17:48.680 epicenter of sleep research, at least in the US and potentially in the world. And then you've also
00:17:54.940 got this interest in safety and crashes, right? Obviously the NTSB. So we're going to talk about
00:18:00.920 both of these things today. I think I'd like to just start probably on the automotive safety side of
00:18:08.540 things. And I think what will come out of that is the role that fatigue plays. And then we can
00:18:13.800 certainly talk about that. I said earlier, we're going to put some statistics out there.
00:18:18.040 Can you give me a sense of what is the risk for death or injury that somehow touches the road?
00:18:27.620 So that means, again, you're a passenger in a vehicle, you're driving a vehicle, you're a pedestrian
00:18:32.120 struck by a vehicle, you're a cyclist struck by a vehicle. However you organize that, Mark, can you give
00:18:37.280 me a sense of what that looked like in 1950-ish versus 1970-ish versus 2000-ish versus today? Or give us a
00:18:46.680 sense of what that looks like?
00:18:48.740 Let's start with the final numbers from 2021, which is the last year we actually have complete data.
00:18:56.460 42,929 people lost their lives on our roadways. And you were just saying it says drivers...
00:19:03.360 Gee, gee, say that again. How big?
00:19:05.500 42,929 people in 2021. That is 118 people every single day. And good for you. It's like,
00:19:14.740 give me that number again. This is what I was telling you. It's like, you should know the exact
00:19:17.780 number because those are people, individuals. We can't bring them back. That's 118 every day.
00:19:23.000 And a lot of times people often, it's like, so how come I don't hear more about this? Or why aren't we
00:19:27.140 fighting it? Like the pandemic meant it was all hands on deck. Let's go get this. Like, how can we put up
00:19:31.880 with this? When you think about it, these happen geographically separated. So these are happening all
00:19:37.880 over the country. And very often it could be an individual in one of those vehicles. So it may affect
00:19:43.560 your family or your community. But very often most of these go unreported in the general media or
00:19:49.220 visibility for our society. 118 lives every single day. Just to put it in context, along with that,
00:19:57.280 we have about two and a half million injuries, which are everything from slight to very serious
00:20:02.780 life-changing injuries. And those are in the context of six plus million crashes every single year.
00:20:10.320 And I won't go into this in any depth, but just to put it on a global scale, it's about 1.4 million
00:20:17.100 people globally every year. And that's about 3,700 people every single day. But we're going to focus
00:20:24.900 just on the United States. And I think your question is really important because we know
00:20:30.080 over the years, things have come down very significantly with all kinds of different things I'm sure that
00:20:35.700 we're going to talk about. But it used to be like literally 100 years ago in 1923, it used to be
00:20:41.980 about 18.5 deaths per 100 million miles. Now we're down to about 1.5. So we often talk about vehicle
00:20:50.620 miles traveled or VMT. And that's important just because of the number of miles driven and the number
00:20:56.440 of people that are out there driving. There's a lot of different ways to cut this. And by the way,
00:21:01.160 I'm sure we'll go into some detail, but at NHTSA.gov, NHTSA keeps all kinds of data on this. You can look
00:21:09.080 up any year you want. And it's segmented by ages and geography states. You know, I mean, it's like the
00:21:16.140 level of detail is unbelievable what's out there. I will just comment that for a lot of people, when they
00:21:21.040 look at the data, the reason I cite 21, it's the last year that we have actual final data. So we can talk
00:21:27.560 about what we know from 2023. But those are estimates just done on a quarterly basis. And I'm sure we'll
00:21:33.760 talk about it. But we get those and they adjust up and down a bit. But it's the big numbers that really
00:21:38.260 matter.
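Mark's figures here are easy to cross-check. A back-of-envelope sketch: the 2021 death count and the per-100-million-mile rates come from the conversation, while the annual vehicle-miles-traveled total (~3.1 trillion miles) is an assumed round figure, not from the episode:

```python
# Roadway fatality arithmetic from the figures cited above.
deaths_2021 = 42_929                  # US roadway deaths, 2021 (final data)
per_day = deaths_2021 / 365           # ~118 deaths every single day

vmt_2021 = 3.1e12                     # assumed annual US vehicle miles traveled
rate_per_100m = deaths_2021 / (vmt_2021 / 1e8)  # deaths per 100 million VMT, ~1.4

improvement = 18.5 / rate_per_100m    # vs. the ~18.5 per 100M VMT cited for 1923

print(f"{per_day:.0f}/day; {rate_per_100m:.2f} per 100M VMT; "
      f"~{improvement:.0f}x safer per mile than 1923")
```

Per mile driven, that is roughly a thirteen-fold improvement over a century, which is why fatality counts are usually normalized by vehicle miles traveled rather than reported raw.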
00:21:39.400 There are two things that just jump out at me when you rattle off those stats, Mark. The first is I'm just
00:21:45.040 doing really quick math in my head, both on the global count and on the U.S. count. And this is
00:21:50.600 approximately one tenth the mortality for cancer. In the U.S., it's a little bit better than one
00:21:57.660 tenth. But globally, it's actually slightly worse than one tenth. A, that's incredible. It might also
00:22:03.400 say that in the U.S., we are probably safer on average than globally on an individual basis. So
00:22:09.500 not sure if that's true, but that would be my first thought. The second thing that just kind of jumps out
00:22:15.400 at me when you talk about this is something I remember somebody saying once that I think is
00:22:20.780 absolutely true. If you read about a death in the newspaper, it's because the manner in which it
00:22:26.820 happened was so unexpected or is somehow so horrific to us. And that's why you don't read about people
00:22:32.680 having heart attacks. I mean, let's not lose sight of the fact that that's the leading cause of death.
00:22:37.400 And that occurs 20 times more frequently than what we're talking about. And I don't know the last time
00:22:42.680 I read about somebody having a heart attack for the sake of having a heart attack and dying,
00:22:46.540 if it's somebody famous or whatever. But common things just generally don't get reported on. And
00:22:52.200 I think sadly, that's probably why we are a little bit numb to what's going on here, even though I would
00:22:58.900 argue that there's a difference, which is those of us who are driving by and see this carnage are kind
00:23:07.120 of left a bit visibly shaken by it. And I find it difficult to get information. I'll give you one
00:23:12.660 example. I live in Texas and we have big roads out here that are big and fast. And I think a lot of
00:23:19.380 them are not necessarily set up as the safest roads. There's a particular road near where we live,
00:23:25.000 where I believe there's probably a fatal accident on this road three or four times a year. And yet,
00:23:32.100 even after it happens, I'll go and do a quick search to see if I can get more information. And it's
00:23:36.940 not readily available. It's not entirely obvious what just happened. So if that's happening,
00:23:42.660 under my nose where I'm seeing the accident or, you know, seeing the aftermath of it within 25
00:23:47.660 minutes or something, I can completely appreciate why most of these deaths go unnoticed by all those
00:23:55.300 except for the people directly impacted by the relationship. That's the numbness. They're off the
00:24:01.500 radar. They're not visible, even locally to many people. So that's why I often cite NTSB statistics or
00:24:07.560 experiences because those are usually not necessarily mass casualty events, but there's usually
00:24:12.240 a lot of people involved. There have to be fatalities and they're very visible. As you
00:24:16.680 mentioned, very often it's because they're visible to the entire country, you know, and they get
00:24:21.480 investigated. It takes a year, 18 months to investigate those, et cetera. But as you're pointing out, if it's
00:24:25.620 not in the local police blotter that it happened, there's almost never follow-up that actually say,
00:24:30.960 oh, by the way, you know, what we reported on last month, here's what actually happened there.
00:24:35.020 And I just mentioned that because I tell you from the NTSB, when you investigate these,
00:24:39.340 it's always a chain of events. It's never just one thing. And it takes a while to figure that out.
00:24:44.920 And just to be clear, a lot of that sometimes is just the local resources. Just to be clear,
00:24:50.220 you know, in the last few years, traffic enforcement resources and local police departments have gone
00:24:55.500 down. So as you were just saying, they may go and run to the scene when they've got to help people
00:25:01.120 in that emergency situation. But the ongoing investigation may be checking a box on a form.
00:25:07.220 So it stays below the radar for most people, unless you're in the circle that that person
00:25:12.560 touches your life. Yeah. And I struggle with this a lot, Mark, because I feel like there's this
00:25:17.920 enormous missed opportunity. So I look at an intersection that's no more than three miles
00:25:22.700 from our house. We haven't lived here for four years. I have already seen three fatal accidents
00:25:31.480 at that intersection in four years. And I can't find any really good information about the chain
00:25:37.320 of events and what happened. I have some sense a little bit. But to me, it's like, why aren't there
00:25:43.920 four minute videos being made that explain every one of these as a here's what not to do.
00:25:50.720 Here are the warning signs. At this time of day, when the light is this way, this is a very easy
00:25:57.640 mistake to make. And the other thing we don't really have a sense of, Mark, is the near misses.
00:26:01.480 If three accidents have resulted in fatalities, and by the way, I'm not saying three people have
00:26:06.660 died. I'm seeing three fatal accidents. It's probably six or seven people have died there in
00:26:11.140 three years. But I mean, you can tell me, Mark, but what would be your guess as to how many
00:26:16.840 accidents have occurred there or near misses occurred there that could have resulted in
00:26:22.440 fatalities? Do you have a sense of how you could even estimate that?
00:26:25.700 You could. And just two things about this. One, you're describing what we call, in human factors,
00:26:31.200 the safety pyramid. Crashes are at the top. Near misses are right underneath that, but a bigger
00:26:36.840 layer. And the layer underneath that are errors. And this is why a lot of people where you see
00:26:41.080 proactive safety in aviation, they do a lot of work trying to capture those errors, knowing that
00:26:46.420 they lead to near misses. And when near misses get more visible, you see the panic in aviation right
00:26:52.480 now because they're seeing more critical near miss incidents because those are the precursors to the
00:26:57.060 crashes that occur, to your point. So you could calculate that. And I got to tell you, make a list
00:27:02.400 because out of this conversation, there's some concrete things that either you or I need to pursue.
00:27:06.720 The one you just mentioned is we got to make these more visible rather than just the family and
00:27:12.260 community feeling it. It's your point, which is even in a four minute video, you could capture what
00:27:17.900 happened and what was learned from that. And that could be enough for people to say, okay, I got to pay
00:27:22.920 attention to that because I go through that intersection every day twice. Yeah, exactly. Four times a day, I am
00:27:27.900 driving through that place that is the Bermuda triangle of death. And once a year, there's going to be a
00:27:35.880 death or series of deaths, probably five times a day, there's a near miss, probably 60 times a day,
00:27:44.180 there's an error that could have even been a near miss. And yet I don't actually know the predisposing
00:27:49.860 factors. I'm kind of extrapolating and making up from the data out there that I know loosely that
00:27:56.060 we'll talk about today. But yes, this is very troubling. So I think we've kind of hopefully made
00:28:00.100 the case for why this matters. Just a couple of other questions before we get into some of the
00:28:03.540 specifics. And I won't expect you to rattle these off with the same precision. But can you give
00:28:10.040 some commentary on what these numbers would have looked like 20 years ago, 40 years ago, 60 years
00:28:16.500 ago? I mean, I have to think that with airbags and seatbelts and better cars, better car technology,
00:28:23.260 in terms of the collapsibility of cars, that things have gotten better. Like, is it easier to just start
00:28:29.400 in the 1950s? And presumably, it was just mass carnage? What's interesting is, as cars just
00:28:35.140 became more prevalent, that's when the numbers went up. And yes, in the 50s, you saw huge numbers,
00:28:40.360 which were actually part of what initiated efforts to establish the National Highway Traffic Safety
00:28:45.880 Administration, which was what had responsibility for creating federal motor vehicle safety standards,
00:28:51.400 like crash testing, all of those. So if you go back 100 years, we're probably over 90%
00:28:59.020 reduced to where we are today. The 50s and 60s are probably where you saw the peak of those
00:29:03.940 numbers, and they've been coming down. I would just say, because we're talking about this in the
00:29:08.520 statistical sense, what's interesting is we have really population level statistics we talk about,
00:29:14.020 42,929. What's interesting is we get these estimates every quarter, and then it takes a while
00:29:19.860 to finalize it. Again, we don't need to get into it, but literally NHTSA collects reports from every
00:29:25.740 police department in the country to come up with these numbers. So it takes a long time to collect
00:29:30.340 all that data, review it, finalize it, et cetera. But I say that because the estimates often prompt
00:29:36.340 people to say, oh, we're down 3% this quarter from last quarter. I think we should celebrate those lives
00:29:42.780 that have been saved, that are still with us. At the same time, that 2%, 3%, 4% up or down,
00:29:49.820 within the context of the overall population numbers, we haven't budged much.
00:29:54.040 And I'm sure we're going to talk about it, but I got to give you one example.
00:29:57.540 You know, a lot of people think, you know, we've cured drunk driving. It's been around forever.
00:30:01.840 We know about it. We've got all these things we can do. When you look at the 42,929, the top three
00:30:08.380 causes in there, impaired driving related to drunk driving, that has stayed about 30% for 20 years.
00:30:17.700 So 30% of the lives lost are due to that for about 20 years. So the absolute number has come down,
00:30:23.340 but the percentage has stayed about the same for 20 years. And then again, on that list right below
00:30:29.220 that is speeding. And now we've got number three, that's distraction. And I'm sure we'll be talking
00:30:34.240 about all of those. But again, I think to your point, it's both the large population numbers,
00:30:39.380 50s, we started seeing that peak. Now, again, it's down to about 1.5 per 100 million miles that we see.
00:30:45.560 But I do warn people, again: when you start looking at just the quarter-to-quarter estimate,
00:30:51.140 up or down by 1% or 2%, that can belie the larger population-level things that aren't
00:30:57.140 changing dramatically enough if we're ever going to actually, say, get to zero deaths on our roadways.
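As a rough sanity check on the population-level figures discussed here, the arithmetic is simple. Note the annual vehicle-miles-traveled figure below is an illustrative assumption (roughly 3.2 trillion US miles per year), not a number from this episode:

```python
# Back-of-the-envelope check on the fatality statistics discussed.
deaths = 42_929          # annual roadway deaths, the figure cited in the episode
vmt = 3.2e12             # ASSUMED annual US vehicle miles traveled (~3.2 trillion)

# Deaths per 100 million vehicle miles, the rate metric Mark references
rate_per_100m_miles = deaths / vmt * 1e8
print(f"~{rate_per_100m_miles:.2f} deaths per 100 million miles")

# Impaired driving has held at roughly 30% of deaths for about 20 years
impaired_share = 0.30
print(f"~{deaths * impaired_share:,.0f} impaired-driving deaths per year")
```

With these assumed inputs the rate lands in the same ballpark as the "about 1.5 per 100 million miles" figure mentioned, and it shows why a flat 30% share still means well over ten thousand deaths a year even as absolute totals drift.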
00:31:02.760 And Mark, you mentioned that alcohol contributed to about 30% of those fatalities. What is the
00:31:08.120 contribution of speed and distraction approximately?
00:31:10.340 So speed's over 20%. I want to say somewhere between 20% and 25%. Distraction is very difficult,
00:31:18.420 right? I just actually called a friend of mine. There's this guy, Larry Blinko, who's a great
00:31:24.040 statistician at NHTSA. He's starting to think it could actually be up to 30%. And part of it,
00:31:29.800 you got to realize these are not individual numbers. So somebody drinking could be on their phone,
00:31:34.140 they could be going too fast. Somebody who's on their phone could be going, right? It's like
00:31:37.740 they all mix and match, but they're pretty big numbers. So it's at least a third, which again,
00:31:43.520 takes you into like 33%, 35% range for the alcohol-impaired region. Speeding's in the, again,
00:31:49.300 20% to 25% probably. And the distraction used to be considered somewhere in the 12%. But I think
00:31:55.060 Larry's new data is going to suggest it's probably much higher, maybe close to a third with some of
00:31:58.940 them. And especially if we talk about it, distraction is more than just your phone that people do in
00:32:04.320 their vehicle. So when you look at all of it, just like impairment is more than alcohol now.
00:32:08.400 Yeah, that's a great question. So to be clear, I said alcohol, but really impairment is that broader
00:32:13.500 category of which alcohol is probably still the most prevalent. In fact, at NHTSA, it's so appropriate
00:32:20.000 for this conversation, but impairment was always three Ds, drunk, drugged, distracted. And when I was
00:32:27.780 there, it's like, and I credit my son actually, it's like, where's the fourth D dad? Drowsy. Because really
00:32:35.400 any one of those would be enough to impair your driving ability. So now impairment, you do have
00:32:40.380 to think broadly that it could be any one of those. To your point, alcohol is still the number one.
00:32:45.360 Yeah. Well, there's so much I want to talk about. I'm just trying to think about how to,
00:32:47.980 how to structure this so I don't miss anything. Let's talk a little bit about, let's go deeper into this
00:32:53.700 contributing factor thing. Okay. So, um, the most obvious thing in the distraction realm is the
00:32:59.700 advent of the phone. So prior to, I don't know, the early 2000s, phones were not readily used.
00:33:08.440 Can we appreciate, or is there an appreciable signal in the data that suggests that the downward
00:33:15.940 trajectory of mortality has been slowed or in any way altered with the introduction of mobile phones
00:33:24.260 20 to 25 years ago? I think the way we would be able to characterize it is that we stopped seeing
00:33:29.820 the decrease that we had been seeing. And I say that because as you know, two things, one is it's
00:33:35.640 really hard to measure the distraction numbers. I mean, it's just so difficult. Same thing with
00:33:40.740 drowsy sleepiness and stuff. And the other part is just the evolution of the phones where first it was
00:33:45.680 just, you know, on a phone call, if you will, but now it's texting and looking up stuff on the,
00:33:49.760 I mean, it's just so diverse, the activities that you could be engaged in. And so again, I think that
00:33:54.520 maybe the way to think about it is more that we were making progress in a bunch of ways that
00:33:57.980 flattened or maybe got a little bit worse. And we may not be able to attribute all of our, quote, variance
00:34:02.920 appropriately to specifically just phones. But I would also say it took to a new level,
00:34:09.180 the distractions you could get in your car. Because some of them have been around for a long
00:34:13.400 time playing with the radio. We can talk about buttons versus touchscreens, et cetera. Kids in
00:34:18.740 the backseat, those kinds of distractions have been around forever. But again, I think as those were
00:34:23.200 getting more controlled, you saw the numbers come down. Phones, I think, again, we could look at it more
00:34:28.920 as a flattening out, probably. Has there been any significant change in the past 40-ish years with
00:34:37.440 respect to the demographics of the drivers at fault in these crashes? In other words, are we seeing a
00:34:46.520 shift to younger people, to older people, anything that you can point to that there's a causal explanation?
00:34:52.620 So there are two groups that seem to be most at risk. Those are the 16 to 17-year-olds
00:34:59.100 and about the 65 to 70-plus. And the conjecture around those two, clearly we're talking about
00:35:09.720 people who are just learning to drive and who are at an age that don't have that frontal cortex
00:35:16.020 fully developed. I have a friend of mine, Greg Belenke, always talked about it. It's like there's a
00:35:20.320 hole in their head right there. So we're letting those people behind a couple tons of metal who
00:35:25.000 would have very little experience in just learning stuff. And at the other end, you've got people
00:35:28.740 getting older who actually, some of it may be aging-specific effects. The other is more just how much
00:35:34.400 they're actually driving and experience and that sort of thing. But those are the two main age groups
00:35:39.820 that we see that are mostly affected. More men than women die in these crashes. So many questions with
00:35:46.720 that. Some of it's segmented. Some of it, to your point, the causal or contributory parts of that
00:35:51.140 would just be us with hypotheses about what's actually causing those differences and things.
00:35:56.700 What was the rationale for letting people drive at the age of 16 when, if you think about other
00:36:03.440 things that are mandated by age, you have an age at which you can join the military. You have an age
00:36:09.560 at which you can vote. You have an age at which you can drink alcohol, purchase firearms.
00:36:14.580 So there are various things that seem tethered to age, but driving is the youngest. I've always
00:36:21.620 wondered, I didn't get my driver's license until I was like almost 18 because I viewed it as a badge
00:36:28.500 of honor to ride my bike and take the bus everywhere. And I didn't want to be lazy. Like I was a weird
00:36:32.240 kid in that way. So I personally can't relate to what it's like to be a 15-year-old who's dying to get
00:36:37.140 his license. But tell me a little bit about that process and how that came to be. And for example,
00:36:41.780 given the stats you've just shared, why that age hasn't been pushed up?
00:36:46.760 I don't know you. I don't blow smoke. And so I don't mind telling you. Nobody's asked me that
00:36:51.580 question before. I'm going to go look that up now because nobody's really had a discussion around
00:36:56.100 that. The discussion is all about our education system, driver's ed. In so many other areas,
00:37:02.180 we've got recurrent training. I mean, you have that in medicine, you have that for professional
00:37:05.900 drivers, pilots, et cetera. How they actually came up with that age and came up with what are
00:37:11.420 we going to do around that age to actually prepare these people for lifelong driving experience? I'm
00:37:17.640 going to go look that one up. I don't know. That's the first time somebody's actually asked
00:37:21.040 me that. But it's fascinating because I would also say part of your point is we've actually not
00:37:26.340 gone back to question whether we need to change that or not, which, as you know, comes up all the time.
00:37:31.640 I can give you an example. I could pontificate and say, look, I mean, kids were working jobs.
00:37:36.740 They needed to be able to get there and dah, dah, dah. And it's like, okay, maybe all those
00:37:39.380 things are true. Is that true today? I don't know. The other thing that has always struck me,
00:37:44.720 one thing you should know about me, Mark, I do love driving and I love driving race cars.
00:37:49.220 So I love all things motorsport related. And I love, I love drifting, like doing all of these things.
00:37:55.460 And one of the things I've been trying to encourage my wife and daughter, my daughter's 15.
00:38:01.100 We're coming into this discussion and I've been trying to organize a course for my daughter and
00:38:07.060 some of her friends where we get a group of really good driving instructors on a 20 acre skid pad to
00:38:13.720 really teach them high end driving skills. The stuff I certainly didn't learn when I was young,
00:38:19.840 but the things that I've learned driving a race car, which is everything not to do,
00:38:25.700 like your natural inclination when this happens is going to be to do this and you will spin the car.
00:38:33.200 And if you're lucky, nothing else will happen. If you're unlucky, you'll hit something else. And
00:38:38.560 if you do this, you'll actually flip the car. And I don't believe that you can just academically
00:38:44.920 learn that. You have to just do the reps. You have to be on the track in the car doing it over and
00:38:51.480 over again. So again, I'm guessing that there has been some calculation that has said we can't
00:38:57.740 justify putting those resources into mandatory driver education, right? It's just people have
00:39:02.940 decided that we just can't request that kids learn that.
00:39:07.680 I don't think so. I think it's the first part of your question, which is, I don't think people
00:39:11.560 have questioned from the beginning. Why did we even start there? Are we preparing these kids well
00:39:17.120 enough for this life experience of driving? And how do we revisit that in our knowledgeable,
00:39:24.100 technology-driven society? How do we actually go and upgrade that to something that could actually
00:39:29.080 save their lives and the people around them better? I don't think those questions have been asked.
00:39:33.760 And I'm with you. I think the intellectual academic part of lessons, great. But unless you do the muscle
00:39:40.320 memory part of the behavioral piece of actually experiencing it, I don't think there's any question.
00:39:44.940 And I don't mind telling you, since you're familiar with F1, you know Jean Todt. And I've gotten to know
00:39:49.500 him over the years. He's got this high-level panel. And what's been fascinating, he is now the UN's
00:39:55.040 sort of ambassador for global road safety. Oh, I didn't know that. And he has taken that on from
00:40:00.800 not only his F1 days, but his Ferrari days, etc. But he's taken this on with a passion. How do we take
00:40:07.760 what we've learned there and applied it? And why am I mentioning that? Because you already have the one idea
00:40:11.520 about the videos. This is another one you should write down. Because this is one of those investments,
00:40:16.540 right? If we get these kids early, it's a lifelong investment of the ones who actually learn how to
00:40:21.700 do this and be better at driving than the ones who sit in the course or do it online and never get
00:40:27.560 behind the wheel to know where the signal is. That's not really much education. And again, I think
00:40:32.940 it's more not that the analysis is done and we should write it off. I think it's people haven't asked
00:40:37.300 those questions or taken the time. And I mentioned Jean just because, you know what? Let's send him
00:40:42.340 a letter. Let's come up with what you're doing. Formulate that and send a letter and say, so
00:40:46.480 somebody at the very least ought to do this and let's study it and see if you can't come up with
00:40:50.780 a course that would make sense. Because what we just talked about is that investment now could be
00:40:56.600 huge in saving lives and costs. Yeah. Let's talk a little bit about the locations. I always tell my
00:41:03.060 patients that there are three areas where you need to really have a heightened sense of awareness
00:41:07.600 to protect yourself. And feel free to just correct anything I'm saying that is not 100% correct.
00:41:14.160 Because again, my data is a little bit old, but the three things I say is one, you have to be
00:41:18.660 hypervigilant in intersections. And that's both the standard four-way intersection, but also a T
00:41:23.680 intersection. Like when you're coming out of a mall or something and you're coming out of a gas station
00:41:28.220 or something like that. The second place is the two-way traffic without a median. Devastating,
00:41:35.560 especially as I said, we have these roads here that are 60 mile an hour roads with no median.
00:41:41.500 So personally, those roads scare me, but they're unavoidable where I live.
00:41:45.860 And then I think the third place you have to be aware is on freeways and in particular around exits
00:41:51.760 and on ramps where people are acting sometimes irrationally merging, trying to get off at the
00:41:58.200 last second, trying to get on doing silly things like that. And I basically say, look, if you can
00:42:03.440 harness the power of your attention only selectively while driving, make it those three spots.
00:42:11.720 So what would you say to that, Mark? What would you add to that? And can you comment on what fraction
00:42:17.700 of fatalities are a result of crashes under those three scenarios?
00:42:23.240 Yeah, I think you've actually nailed it. I'm not sure I can give you the exact percentages for each
00:42:28.020 one of those. It's actually in the order you've got them. And all I would do is say, you've got the
00:42:32.000 intersection part, which is any intersection, as you pointed out. The second is beyond just the median
00:42:37.260 in the road, it's any road separation. So that's true, not only of vehicles, but also pedestrians and
00:42:43.100 bicyclists. You would think about the median in the middle, but it's also on the sides.
00:42:47.900 Where unfortunately, the last 10 years, we've seen literally a 50% increase in pedestrian deaths.
00:42:53.520 Again, I'm sure we'll talk more about that. But again, that has to do with what you're describing,
00:42:58.000 not just the median in the road, but let's expand that to separation of the vehicles from,
00:43:04.020 again, the sides where pedestrians and cyclists might be as well. So I think those cover it, and
00:43:09.660 your on- and off-ramps of any kind of system. Again, the issue there is the speed differential,
00:43:16.000 the challenge you have there is trying to figure out not just that I have to merge,
00:43:20.620 but the speed you need to merge with traffic going at whatever their flow is in that particular
00:43:25.260 situation. I think the other part, it's great for us to talk about the statistics. You know,
00:43:30.700 we love to hit the bell curves and like these are the highest. But I always point out, we also got to do
00:43:36.040 those edge cases because lives still get lost in those. One of my mantras is never again. You know,
00:43:42.940 when those kinds of especially edge cases happen, you need to share that data. Hence,
00:43:46.900 that four minute video is so interesting, whether it's common or edge case. So it doesn't happen
00:43:51.060 again. You shouldn't need every intersection in the world or everywhere, you know, to go through that
00:43:55.300 kind of loss of life to decide we should change something. We should do something different here in
00:44:00.340 some way. And unless you make it visible, understanding the causal and contributory
00:44:04.440 factors, you can't make those changes. And just so I understand, Mark, I've seen accidents where
00:44:10.420 a driver makes a slight mistake, but it becomes catastrophic. And I'm always taken back to
00:44:16.860 1994 when my hero, Ayrton Senna died May 1st at Imola. And Max Mosley, who was the head of the
00:44:25.780 FIA at the time? I knew Max. Oh, you did? Wow. Yeah. Max said something very, very insightful
00:44:32.860 at the press conference in the days following Senna's crash, because the press understandably
00:44:39.280 were completely fixated on why he crashed. How did Senna crash? And Mosley said, you're asking the
00:44:47.520 wrong question. He crashed because he is the best driver in the world driving a car at the physical
00:44:55.300 mechanical limit of what it is capable of doing. Crashes are going to occur. The question isn't
00:45:00.780 why did he crash? The question is why did he die? And he really made sure that the sport took a turn
00:45:08.640 at that moment in, we will not tolerate drivers dying. We might not be able to stop the crashes,
00:45:15.820 but there will be no more deaths. And knock on wood, there has only been one death in F1 in the 30
00:45:22.320 years since. And what I find troubling is I'm gathering from you that that analysis isn't being
00:45:31.160 done for the 42 plus thousand people who died in 2021, where someone is saying, what contributed to
00:45:40.620 the death? And what can be learned about making that the ultimate thing that we put a buffer between?
00:45:48.160 For example, when a person makes a mistake, you always want to have more of a buffer for that
00:45:55.580 mistake to not result in the nature of force that could kill them. So for example, in a racetrack,
00:46:01.940 that's the difference between having a bigger runoff than a smaller runoff. That's the difference
00:46:07.060 between having more impact absorbing things in areas where we expect people to potentially go off.
00:46:13.260 And yet my guess is that analysis on the individual basis isn't necessarily being done?
00:46:20.000 That is correct. And I think what you do see are the level of analysis that we can cite,
00:46:26.660 the segmentation, et cetera, that again, are at sort of more population level than into the specifics
00:46:32.300 of a particular crash at that particular site with those particular individuals. Again, I'll cite the
00:46:37.600 NTSB. This is why it takes a year plus to do an investigation, because you're going to look at all of the
00:46:42.500 factors that are involved there, identify both the probable cause as well as contributing factors,
00:46:48.640 and then make recommendations so it doesn't happen again, which again, I hope we talk about this some
00:46:52.640 more, but those investigations are reactive, but they're intended to take information so you can
00:46:57.420 prevent them from recurring in the future. And I think that, again, is what separates NTSB
00:47:02.560 investigations that are so thorough from, again, what we've been talking about at the local
00:47:06.700 police department level. They just don't have the resources or time and other kinds of things to go
00:47:11.620 and do those kinds of analyses. Does NTSB only investigate trains and airplanes and sort of
00:47:18.220 huge things? Like what's the mandate of the NTSB? Because it says transportation, which you would
00:47:22.360 think includes all forms of transportation, but obviously it can't do this type of analysis for
00:47:27.420 cars. Exactly right. The NTSB is required to investigate every aviation crash. So that's the
00:47:33.940 big commercial stuff, but that's also the local general aviation stuff that happens in every community
00:47:38.280 all over the place. But then it investigates specific crashes that occur that have national
00:47:45.320 importance. And so in every mode of transportation, mostly, for example, on the roadway, you see mostly
00:47:50.460 buses and trucks, because those are big. When there are fatalities, there's more involved usually,
00:47:56.360 but they can do single individual crashes as well. So when I was there at the NTSB, we did several
00:48:01.640 related to distraction with folks on their phones, for example. Now, when you go into it, you don't know
00:48:06.340 exactly what you're going to find, but you get enough information to say, we should go after and
00:48:10.000 see this because it would allow us to make recommendations that would have national
00:48:13.340 importance to do that. How did those come to you? Given the sheer volume of these crashes,
00:48:19.360 how would the NTSB make a decision or determination that this car accident is potentially one we're going
00:48:28.880 to put resources towards? I mean, do you have some inkling of what the cause is before? And you think this
00:48:33.860 is basically the illustration case? Let's pull the cover back on the NTSB a little bit to see how
00:48:40.040 this process goes. And it's very much what you're talking about. So we got to do all aviation. What
00:48:46.060 are the high opportunity investigations we could look for in marine, in roadway, in other areas? In fact,
00:48:55.200 people don't realize it, but pipeline and hazardous material also fits under there. So pipeline is a mode
00:49:00.120 of transportation that gets investigated as well. But to your point, what happens is it's like, so
00:49:05.340 what are the hot issues that we think are going on in this industry? And then you wait for calls. And
00:49:09.900 where do those calls come from? They come from everywhere. A local PD might call about one that's going on.
00:49:14.420 You could get it from officials. The FAA lets you know about a crash of something that's going on
00:49:18.680 somewhere. There's an op center, 24-7 op center at the NTSB. It gets all of those phone calls.
00:49:24.060 And again, to your point, you won't know ahead of time, but if somebody says, well,
00:49:27.240 we think this was involved, we think that was involved, you have to make a decision. Do we go
00:49:31.480 after that and look and then maybe find out it wasn't the case? But if it is, that's the one that
00:49:37.400 you want to be able to go in depth on so you can make recommendations to the entire industry or
00:49:41.800 individuals or whatever you need to do to try and get changes. Now, when we think of the NTSB or when
00:49:46.660 I think of the NTSB, I think of aviation and everybody thinks of the black box. You guys have this
00:49:52.800 amazing data recorder that almost without exception is recoverable and it contains not only what the
00:49:58.780 pilots said up until the moment of impact, but also the telemetry. You actually see this was the
00:50:05.260 position of this aileron. This was the thrust on this engine. This was the yaw in the stick. I mean,
00:50:10.380 you know every detail of it. What do cars have in them that allow the NTSB to do that kind of
00:50:18.500 thorough investigation? I mean, I'm a little embarrassed. I don't know the answer to this,
00:50:22.040 but I assume the car still contains some sort of a black box for telemetry. I mean, I know my
00:50:27.080 racing cars do, but I assume street cars have some of that. Yes. And just to start, those black boxes
00:50:32.120 are actually orange. Yeah. And that's partly so you can find them, right? Yeah. Yeah. That besides all
00:50:37.100 the other beeping that's going on, it's like sometimes you can just spot it that way. Vehicles do have an
00:50:41.960 EDR, event data recorder, and there's actually no reason you would even know that because you don't
00:50:47.200 really have access to it. And what's interesting is most states, another conversation if you choose,
00:50:53.380 but there are some of these things that are controlled at the federal level through NHTSA,
00:50:57.260 Federal Motor Vehicle Standards and other kinds of regulatory authorities that are there. And many
00:51:02.900 of these things though, they're actually controlled at the state level. Okay. So that's kind of important
00:51:07.660 about which things are. So EDRs are where there's a basic federal requirement, but to your point,
00:51:13.900 in aviation, there are hundreds, sometimes now the newer ones, thousands of variables that you can
00:51:18.660 get from those recorders. And the EDR in most vehicles, it's the basics. And there's always this
00:51:24.580 tension between the industry about how much data they want to leave out versus those on the
00:51:29.900 investigation side that no, more is better. So some of these very recently only recorded like
00:51:35.560 30 seconds or a minute worth of data, maybe three minutes back. And then, and this is a debate even in
00:51:41.400 the aviation ones in the old days, they used to record over. For those of us who want to say,
00:51:46.160 how long have they been braking hard like that? Or how long have they been at that speed for whatever
00:51:50.480 else? In a vehicle on the road, EDR, you may not have a long period of data to track that kind of
00:51:56.360 thing. So it's both the variables that are recorded and the amount of time that it records,
00:52:01.080 and that it's available to you post crash to be able to look at all that and reconstruct what
00:52:05.840 actually happened there. And is that one of the more important data sources you would rely on
00:52:11.180 when the NTSB did indeed come in for auto accidents?
00:52:14.980 Absolutely. That and of course, you talk to everybody. So witnesses not only in the scene,
00:52:20.100 but you talk to family and other people, you know, work, all that kind of thing. And within the fatigue
00:52:25.100 realm, when you look at that, it's literally if this is a trucker who was on the road, you're looking
00:52:29.580 at literally, where's their hotel motel key? And can you actually see that they got in the room? And
00:52:34.700 then we have to look to see if they were on their cell phone or not, where they're actually sleeping
00:52:37.980 when they, you know, it's like, you go with the electronic pieces you can, and then there's all
00:52:42.620 the other human elements of that that you use to try and piece at a minimum, the three days before
00:52:48.280 the crash. And again, you extend that based on what you're finding to go back, which is why I'll just
00:52:53.020 mention one thing. I have a few soapboxes I'm sure we'll touch on. One of the hardest things is people
00:52:58.160 speculating in the first 24, 48 hours of what happened. Having launched on seven different
00:53:03.380 investigations. The first one was the Reno air show crash, the air races out there. Huge. But what happens
00:53:10.480 is everybody, the media in particular, in the first 24, 48 hours is like, here's what caused
00:53:14.820 that. And then before you know it, it's already established. And I used to go back to DC and say,
00:53:20.520 okay, so how often do we actually see, you know, that first 24, 48 hours speculation end up in the
00:53:26.240 final report? People couldn't really think of many, maybe a little element of it shows up. The thorough
00:53:31.040 investigation uncovers things you would have never known if you stopped at that first 48 hours.
00:53:35.800 So my biggest thing from a safety standpoint is if you're not careful, you take action based on the
00:53:41.100 speculation. Now you've spent a year doing a bunch of changes and things that may actually not have had
00:53:45.860 any role in the crash that you were investigating. So the speculation can really bite you if you're not
00:53:50.900 careful. Hence the data sources like EDRs are critical. What can you glean about phone use in the car?
00:53:58.180 It's so common now. I sort of have this innate anger when I pass somebody and they're on their
00:54:05.240 phone. And I don't mean like they're talking and the phone is wherever it is. I mean, they're holding
00:54:10.260 their phone and you can see it as you drive past them. I sort of feel like, is this that much different
00:54:16.060 than if you were drinking a beer and I could see the bottle? What is your view on that? And how difficult
00:54:23.180 is it for, let's assume that a crash does not rise to the level of being one of the very, very few that
00:54:29.180 the NTSB looks at, but yet it still becomes a manner of like a criminal prosecution or something
00:54:34.860 like that. I mean, how much data are they able to infer if a person wasn't actually speaking on the
00:54:39.960 phone, which I assume is the easiest thing to figure out from the cell signal.
00:54:43.280 So let's actually start with, if you're going 55 miles per hour and you take five seconds to look
00:54:50.960 at your phone, your eyes off the road, you can travel the distance of a football field.
00:54:57.440 You know, then that's playing with the radio on your phone, whatever it is. Five seconds at 55 is
00:55:02.920 enough to cover a football field. And I'm sure we'll talk about this if we do more impairment stuff,
00:55:07.940 but there's a very straightforward, you know, when you're driving three things you need to be
00:55:12.520 taking care of your hands on the wheel, your eyes on the road and your head in the game.
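Rosekind's distance figure checks out with quick unit arithmetic (the football-field lengths in the comments are standard values, not from the episode):

```python
# At 55 mph, five seconds of eyes-off-the-road covers more than a
# football field, as stated above.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

speed_mph = 55
glance_seconds = 5

feet_per_second = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR  # ~80.7 ft/s
distance_ft = feet_per_second * glance_seconds

# ~403 ft, vs. 300 ft between goal lines (360 ft including end zones)
print(f"{distance_ft:.0f} feet")
```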
00:55:18.320 So when you talk about distraction, anything that pulls you away from one of those
00:55:21.980 is going to be a problem. And we now know that talking on the phone can degrade your performance
00:55:28.320 equivalent to 0.08 alcohol kind of performance decrements. Again, we don't have to get into all the
00:55:34.340 data, but there's really good data that even hands-free can do that. Because as you know,
00:55:39.520 there's no such thing as multitasking, right? It's switching, et cetera. And so people are like,
00:55:43.480 it's legal. You're like, hands-free, it's okay. It's like, no, it's not. Because if you're engaged
00:55:47.220 in that conversation, then your head's not in this game, whatever. And so again, I think that's one of
00:55:52.760 the challenges you have when you're looking at all this is how diverse the sources are. So to your point,
00:55:56.980 the NTSB will go and get all the kinds of data. Locally, they're probably never going to get that.
00:56:01.880 So what would you look for? And I often think about this when I do drive by those people,
00:56:06.200 usually because I'm going the speed limit here and they're going less than that in a faster lane
00:56:10.840 because they're sitting there staring at something, right? Is I'm always sitting there thinking,
00:56:14.540 I know enough to basically ask the police department, get those phone records. I want
00:56:18.740 to know if a signal was bouncing off a tower somewhere locally, was that going on? And of course,
00:56:24.020 nowadays we have more cameras everywhere. And so can I show that they actually were on that phone
00:56:29.120 at a certain place, whether it's their video or video from a vehicle you might be in, etc.
00:56:34.080 There are more sources for that kind of information. But to your point, if you didn't
00:56:38.120 know about an EDR or video or that, again, an investigation of the NTSB, you can get a subpoena
00:56:43.840 if the phone company won't give it to you and literally get the records to know if they were
00:56:47.500 on the phone during a certain time. That stuff's available. In most crashes, insurance companies
00:56:52.020 even aren't going to necessarily pursue all of that. But those things would be available depending
00:56:56.320 on what happened for you to, again, like any other investigation at NTSB level, you could go after
00:57:01.300 that stuff to determine what was really there. Just a couple of weeks ago, we're on El Camino here
00:57:07.740 in California. So it's three lanes, 35 miles per hour speed limit. And we're stopped. We had seen a
00:57:14.920 motorcycle officer who had stopped somebody a while back. All of a sudden now he's coming up on the
00:57:19.400 left. We're like in the middle. So there's another vehicle on the left. But the guy's coming up
00:57:23.020 splitting the lane because we're stopped. And he's literally looking in the cars because I can see
00:57:28.120 his helmet turning and looking in the cars. And the guy next to us was on his phone. I mean, I was
00:57:32.080 sitting there just watching him, you know, and the cop came up and looked in there and like start
00:57:36.140 shaking his head. And the guy put it down. And I'm thinking like, that's the difference when you
00:57:40.940 actually have someone looking at you, whether it's video or something else that says you shouldn't
00:57:45.280 be doing that. And you and I know that as soon as the cop went by, he's back on the phone again,
00:57:50.000 right? So Mark, have there been any technological solutions proposed to dramatically lessen the
00:57:58.680 burden of phone use while driving? And again, to be clear, I say this as someone who would
00:58:04.840 be inconvenienced by it myself. I'm constantly using long drives when I take them as a chance
00:58:11.680 to get caught up on phone calls and listen to podcasts and audio books. And even though my hands
00:58:17.840 are on the wheel and my eyes are on the road, there's no doubt, especially in a phone call,
00:58:23.740 it takes my head out of the game a little bit. So again, phone companies and consumers alike would
00:58:32.860 be, I'm sure, opposed to this. But is there no technology solution proposed to make it much
00:58:40.880 more difficult to be distracted while driving?
00:58:43.240 There are. It's interesting because companies did oppose it quite a bit, but there are now,
00:58:49.600 and you should look on your own phone, but there is usually now a button that says don't call. And
00:58:55.740 it has a message. It'll send a text or a voicemail basically and says, I'm driving now, call you later
00:59:01.820 or leave a message or whatever. So there are technological things.
00:59:06.080 Yes, but it still requires the user to take the action. And again, the question is like,
00:59:12.640 how can you make this so that you don't have to opt into it?
00:59:16.620 Right. And those could exist as well. They already do. Phones can now tell when your vehicle's moving.
00:59:22.880 So accelerometers and other sorts of things do that. And I think what you're getting to,
00:59:27.340 which is interesting, is that's a whole nother choice issue about you're moving above a certain
00:59:33.280 speed. So you're out of a parking lot or whatever else. We jam it or we just don't let it happen.
00:59:39.000 That would be an interesting battle, I bet, for some. And that's why it's so interesting about your
00:59:44.040 hypothesis there, because right now you could just make the decision. I'm going to shut it off
00:59:49.700 or I'm going to put it on airplane mode and I'm not going to get anything while I'm here.
00:59:53.560 So when I was at the NTSB, Debbie Hersman was the chairman and we were investigating a couple,
00:59:59.720 as I mentioned, crashes where clearly cell phones were an issue. And we made a recommendation that
01:00:04.740 cell phones should not be used in cars except for emergency situations. And I remember exactly
01:00:10.860 around that time because I used to call Debbie and we would be talking about stuff and she had a long
01:00:17.020 commute and she used to use that to catch up with all kinds of stuff. And then all of a sudden we were
01:00:22.120 doing these investigations and I would keep getting her voicemail. And then as we got closer to our
01:00:27.700 recommendation, et cetera, I said, I know what's going on now. She goes, yeah, I wanted to know
01:00:31.900 what it would be like to shut the phone off or put it in a bag in the back. And you know what?
01:00:37.120 It's really inconvenient, but I'm a better driver when I'm doing this than when I'm doing it
01:00:41.640 the convenient way. So to your point, there's the technological, that's sort of a societal,
01:00:47.600 back to our question, like, do we make that vote and say, no, we're not going to let you do that
01:00:51.120 if you're moving. But you always have the personal choice about if it's important enough to you,
01:00:55.700 you can decide. And the personal choice in some cases could also mean you can differentiate.
01:00:59.860 It's like, I'm on a strange road and I'm a little faster. I got more people in the car. I'm not
01:01:03.220 going to do that as opposed to open roads, speed limit. There's nobody around. Music's not even an
01:01:08.780 issue. Again, you could make more choices when you did that. Yeah. Or do I have the ability to override
01:01:14.180 it? And I mean, this is how I perhaps erroneously justified in my own mind, which is I'm now steeped in
01:01:21.480 the practice of identifying hotspots. And whatever it is I'm doing, if I'm listening to something or
01:01:28.620 if I'm on the phone, I at least convince myself, again, this might not be true, but I convinced
01:01:33.600 myself, okay, you're just going to pay more attention right now because this is the intersection
01:01:37.760 and you're looking both ways, even when you have the right of way or that kind of thing.
01:01:41.480 But yeah, the other example of this would be, again, it would be hideous to look at,
01:01:46.500 but you could easily install a breathalyzer in every vehicle that allows it to not start
01:01:52.780 without a blood alcohol below. You might even make it a more egregious setting than 0.08 and
01:01:58.020 make it 0.05 or something like that. And nobody would want those in their cars because they're
01:02:03.520 so unsightly. I think they do actually have devices like that for people who have been convicted
01:02:07.960 of drinking and driving. But when you just look at the numbers of deaths due to intoxication and
01:02:14.560 distraction, it begs the question, right? Like how much would we be willing to be inconvenienced
01:02:20.320 to save, call it 50 lives a day? Because that's about what you would do. You'd save about 50 lives
01:02:26.640 a day if you took those two things off the table. If people didn't have phones in their cars
01:02:33.100 and they couldn't have alcohol in their system when they drove, that's a pretty good estimate,
01:02:38.060 right? About 50 people per day would be alive.
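As an editorial sanity check on the per-day figures in this exchange: Peter's "nearly 50,000 a year" works out to about 137 per day, while Rosekind's reply uses roughly 118 per day, which corresponds to about 43,000 per year. The annual total and one-third alcohol share below are assumptions based on those cited numbers:

```python
# Reproducing the back-of-envelope arithmetic from the conversation.
annual_deaths = 43_000   # assumed annual U.S. roadway deaths (~118/day)
alcohol_share = 1 / 3    # assumed share involving alcohol impairment

deaths_per_day = annual_deaths / 365
alcohol_deaths_per_day = deaths_per_day * alcohol_share

print(f"{deaths_per_day:.0f} total deaths/day")              # ~118
print(f"{alcohol_deaths_per_day:.0f} alcohol-involved/day")  # ~39
```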
01:02:39.980 A third of 118, so maybe 40. So let's go back a moment. Realize you're different when you were
01:02:47.280 talking about being on the phone in certain places where you're going to pay more attention.
01:02:51.000 You're different in a good way because you're actually cognizant. You're thinking about those
01:02:55.620 things and what risk they would create for you. And again, in your framework, it's like,
01:03:00.700 what can I control? What's out of my control? Well, that's something you have control.
01:03:04.100 You're different than most people because you're even thinking those things.
01:03:07.160 So that actually does make you safer in those situations than someone who isn't even thinking
01:03:11.960 there's an issue with me talking on the phone and playing with my radio. And by the way, I'll do
01:03:16.900 this. It's like that puts you in a different place than those folks. But going to your other,
01:03:21.960 it's not hypothetical anymore. Congress just passed a law that new cars will have to have a technology
01:03:28.600 that can detect whether you're at the 0.08 level or higher. And what's interesting is that
01:03:34.400 technology has actually been in the works. So take it as a model for what you're talking about
01:03:38.060 with phones. But that technology has been in the works for a long time.
01:03:44.040 And it's another, it's called DADSS. It's one of them. And it has a free breath sort of analyzer
01:03:48.980 and also stuff that's in the steering wheel that literally is looking at molecules in the air.
01:03:54.700 And it's what you said is it won't start if it hits. It's for everybody.
01:03:58.680 Now, how will it differentiate between someone in the car who's incredibly inebriated and
01:04:06.840 presumably putting particulate matter of ethanol or whatever metabolite is being searched? But the
01:04:12.260 driver, let's assume, is not. The designated driver. I mean, that's why you do that. Right now,
01:04:17.300 it's specific enough with the, and that's why I think the latest has both sensors, the one that's
01:04:21.840 in the wheel as well as the other more general one. And they've done enough research to be able to
01:04:25.960 differentiate those. The last thing I'll just say is what's interesting, that's one form of it.
01:04:30.700 The legislation that just passed is that basically car companies are going to have to provide a
01:04:35.000 technology, though again, it's more of a performance target than it is the mechanism.
01:04:40.040 And so other people could come up with different ways to be able to do that.
01:04:43.360 But I mentioned this because I think you're onto something here, which is it took maybe decades
01:04:49.260 to get this legislation from Congress to say we should have this in every car because of the lives
01:04:55.160 it could save. And part of that was the ongoing debate, sometimes about technical things, more
01:05:00.840 often about what we've started this conversation with, the societal value if we think it's more
01:05:05.700 important to inconvenience you if you're drunk than it is to allow you to kill someone else on the
01:05:10.520 roadway. When did drunk driving fall out of favor? Like, I get the sense that there was a day in
01:05:18.640 which driving drunk was a normal thing to do. Like, nobody cared that you had a drink and got in your
01:05:23.440 car. Clearly, that's not the case today, despite the fact that the numbers are still as high as
01:05:28.100 they are. Was there a particular moment in time in which an accident changed all of that? Or was it
01:05:34.200 just a general sense of awareness?
01:05:37.760 I will apologize to them, but it really was about MADD, Mothers Against Drunk Drivers. And the woman who
01:05:44.900 started it lost a child in a drunk driving crash.
01:05:48.580 And this is what, in the 80s?
01:05:49.840 Yeah, and that's why I'm hesitating, because I think it was late 70s,
01:05:54.180 early 80s. But they really are the first ones to have a, quote, victim group that said this should
01:06:00.860 not be allowable anymore. And so I would say it wasn't so much an individual crash.
01:06:05.700 But they put faces to a set of crashes and a cause.
01:06:10.000 And took it to the statehouse, to Congress. It's like, we cannot let this happen anymore.
01:06:16.200 And people don't know this, but MADD actually provides counseling services and all kinds of
01:06:21.040 other things that they do. But they really are the model, again, a victim group that said,
01:06:26.540 we need this issue in front of our society. It's not okay. And so they're the ones to get 0.08 and
01:06:32.480 all. Again, see, that's one of the things. Every state has to vote for what the blood alcohol level
01:06:36.960 has to be. And so everyone's different. But they're the ones who pushed, it used to be higher,
01:06:42.680 and then finally got to 0.08. They're the ones who basically are the ones who pushed especially
01:06:48.060 hard. Others did as well, but they were right out there in the front for this new legislation
01:06:52.320 regarding the technology. And they're the ones who will make sure that the pressure stays on
01:06:56.780 to see the technology actually integrated within vehicles coming. So I think that's an example of
01:07:02.420 what you're talking about. It could be a crash, high visibility, maybe a celebrity. What do we do about
01:07:07.020 this? Or it can be this new model that was created with MADD, which we see in other arenas as well.
01:07:13.320 Yeah, I always find that the individual cases do more. You're probably familiar with the case of
01:07:17.840 Libby Zion, just for folks listening. So Libby Zion was a woman admitted to a hospital in New York.
01:07:23.880 I can't even remember if it was Presbyterian.
01:07:25.760 I think it was Mount Zion.
01:07:27.100 It was Mount Sinai. Yeah, you're right. It was Mount Sinai. She was admitted to the ER,
01:07:31.700 a resident who technically probably wasn't even qualified to make a decision that was being made about her
01:07:36.640 care, made a decision to give her, I don't even remember the details, gave her a medication while
01:07:41.500 she was already on an MAOI, and it resulted in her death. And even though I don't really think this
01:07:47.700 was just about fatigue, I think it was more about resident supervision, it became the linchpin case
01:07:53.200 that her father basically took against the medical community about resident work hours. And it might have
01:08:01.040 taken 15 years from her death until the changes that were imposed on resident work hours, which was
01:08:08.220 kind of at the end of my training period. But again, it all came back to this woman's death, wasn't about
01:08:14.820 the million other insane stories that resulted from medical residents being exhausted. So I guess I bring
01:08:23.060 all of this up to say, Mark, have we had that moment yet with distracted driving? Do we have the
01:08:28.820 equivalent of Mothers Against Drunk Driving? Or do we have the equivalent of Libby Zion's death?
01:08:34.620 Because I have my story. So my story is, I used to work with a guy named Nick Venuto. Nick and I sat
01:08:42.660 opposite each other in our offices. And Nick was a really, really amazing guy. We always say this,
01:08:50.160 but truly an amazing guy, a remarkable cyclist. But he was so committed to his family that he used his
01:08:58.480 commute, his long commute on his bike to do the bulk of his training so that he didn't have to do
01:09:04.780 any cycling during the week and he could just do his main training on the weekends. He had just won
01:09:10.680 one of the most remarkable time trials in Austin, which was the Mount Palomar climb, which is an epic
01:09:16.400 climb. One of the hardest climbs in the United States. Anyway, it was a Tuesday afternoon in May of,
01:09:23.340 I want to say 2010, 2011. We were both just leaving the office together. And I was going to drive home
01:09:31.120 to get on my bike to then ride to where I did my intervals that day. And he was riding home.
01:09:38.040 And I remember thinking there were two places I could go and ride that day. One was up Torrey Pines
01:09:43.640 and the other was on the bike path of 56, which was a bike path that ran along a freeway.
01:09:48.840 That was the road that he always took home. And I was like, maybe I go ride that way and I'll ride
01:09:53.920 out one way with him. I ultimately ended up deciding to go up Torrey Pines that day.
01:09:59.540 That day when he was driving home, a woman in an SUV in the right-hand lane, which is the lane
01:10:05.640 adjacent to the bike path, where there's a hill that you have to go up. It's about a 12-foot hill
01:10:11.340 at about 40 degrees with a fence, was on her phone. And I think what happened is the car in front of her
01:10:19.640 stopped and she drove up the hill through the fence, hit and killed Nick on a path that I rode on most days.
01:10:29.160 And I mean, I spent the next two years so goddamn angry that I wanted to start carrying a gun when
01:10:40.840 I rode my bike to shoot any motorist that got anywhere near me. I mean, that's how pissed I was.
01:10:49.840 And I thought to myself, how many times does this happen? Like this is one guy that I just happened
01:10:55.840 to know who died, but this is happening constantly. And why are people not more irate? Why is something
01:11:04.520 not being done about this? Forgive my long story, but is there a movement around these? It's not just
01:11:11.680 the cyclists and pedestrians, it's other drivers who are victims of the distracted driver. And by the way,
01:11:17.140 I want to say one more thing. I am a distracted driver sometimes. I could easily have been that woman.
01:11:22.980 There are times when cars in front of me are stopping and I haven't caught it until the last
01:11:28.060 second. And just because it hasn't resulted in an accident, I don't know that it makes me morally
01:11:32.160 any better than that woman. Well, it's personal for you too then, just as we started. Very much so.
01:11:39.240 Again, I paused at the beginning because I don't often talk about my father, but it's personal for a
01:11:44.020 lot of people. But I think what's interesting is the stories actually aren't told enough. That's what
01:11:50.320 we're talking about MADD. They told the stories. This should stop. We have not reached the societal
01:11:56.120 unacceptability, the societal outrage that you just portrayed at the level of we just got to stop it.
01:12:03.860 And that's why I say MADD helped bring the visibility, but the legislation for technology
01:12:07.820 in the vehicle just got passed. It's not even in the cars yet, right? I mean, it's still got to figure out
01:12:13.260 how to do that part. So you asked it earlier. We could do the same thing with distraction.
01:12:17.620 And I think good and bad. The good part is people are asking questions like what you just did.
01:12:23.920 How do we take this to the next step? We have models, alcohol impaired driving. We have ways to go
01:12:29.380 after this and even do better with technology and things, etc. How do we do that in the realm of
01:12:33.960 distraction? And you've also identified why it's so hard. The bad part is even us people who are trying
01:12:41.100 to be really good, trying to save lives and do the right thing. It's hard. There's still some things
01:12:46.620 where we make choices, even knowing choices like that's not really the best thing to do here.
01:12:51.840 And that's a challenge, which is even really good folks who know what's going on. I have friends of
01:12:56.440 mine in the police department. It's kind of like, I want to make sure I see every one of those police
01:12:59.860 cars stop at the stop sign. No rolling because you're the model for others. It's okay. Watching
01:13:05.680 that motorcycle cop nod his head at the guy who was on his phone, etc. It's like we need all the
01:13:11.600 available mechanisms and strategies we have to change many of these things. And what you're getting to
01:13:16.740 is there are so many of them. We can look at the top three, but that doesn't necessarily get rid of
01:13:22.540 all of them. But I think to your point, and a lot of what you focus on these conversations is,
01:13:27.320 so what are the things that are under your control, though? So for you, if that means watching a little
01:13:31.740 bit more about when you're on those calls, and how you're paying attention to them, hands on or
01:13:36.100 hands off, etc. And that's why I'm part, you're already different because you're already thinking about
01:13:40.340 where those danger zones are. Someone who doesn't even think about that is at a higher risk, okay?
01:13:45.480 And as we talked about it, this is risk management. What's under your control, what's not? It's risk
01:13:50.340 management. And what you choose to actually try and control and take care of others that will still
01:13:55.060 put you at risk in certain situations. Yeah, one of the things I've often wondered, Mark, is there any
01:14:00.180 research into the lives of those who kill other people on the road, and how their lives are forever
01:14:07.900 impacted? Because I have to be honest with you, I think until right now, as I'm telling the story,
01:14:13.040 I've never once given a thought to that woman who killed Nick. Truthfully, I've hated her guts.
01:14:18.920 But I don't know that that's fair. And I don't know anything about her life today. I don't know how
01:14:23.700 often she thinks about Nick. I don't know what she lives with. But my impression is that there are two
01:14:31.680 deaths at every death, and that a part of the person who killed this person is at risk. And this
01:14:38.220 is a very different type of homicide. This is not first-degree murder. This is involuntary
01:14:42.800 vehicular manslaughter. Nobody wakes up thinking they want to kill someone today in their car.
01:14:48.560 I know your training is in psychology, so I just wonder if there's any aspect of your training that
01:14:54.140 gives you a sense into what those people go through. And by the way, is that something we should be
01:14:57.860 tapping into to help create more of a zeitgeist around this? It's not just the lives of those
01:15:02.500 who die. It's those who live. 25% to 33% of people in a crash are going to have PTSD within 30 days or
01:15:09.880 longer. There is a literature on that. And I would just say, going from the statistics to the individual,
01:15:15.820 MADD is quite good at actually not only having victims' families talk about what's going on,
01:15:21.880 but they have drivers who have killed someone that are also part of who speaks to folks about what
01:15:28.460 this does to change their life. And sometimes that's jail time. Sometimes even if they don't
01:15:33.300 go to jail, it changes their life forever because they are waking up every morning. Survivor's guilt,
01:15:38.560 anybody who really cares about humanity is going to carry that with them for the rest of their lives.
01:15:43.760 And so, yeah, we see numbers. Actually, again, close to a third, basically, people are going to have
01:15:48.240 PTSD, some aspect of that. And then telling those stories can be very powerful.
01:15:54.180 Before we go into some other things I want to chat about, I want to round out some of the other
01:15:57.480 contributing factors. Where does weather rank on this list? Whether it be rain, snow, ice,
01:16:04.240 sudden darkness, or cloud, how much does that contribute?
01:16:07.340 So I'm going to just drop a small bomb here for a minute about something. And then to answer the
01:16:12.460 question, then we'll come back to it whenever you're ready. But NHTSA did a study, came out in 2015,
01:16:17.940 2018, that showed in the chain of events, the last event before a crash, that 94% of the time,
01:16:25.220 it's a human choice or error. Okay, and that's my language, human choice or error. So then the
01:16:30.980 question is, what are the other 6%? Well, 2% are vehicle defects, 2% are the environment, 2% other.
01:16:38.820 And so to your question, the environment is both the physical environment, potholes,
01:16:45.700 road conditions, etc. And then things like weather.
01:16:49.380 It's virtually nothing.
01:16:50.320 Not a lot. That's right. So that does happen, but it's going to be small compared to the other.
01:16:55.140 And again, we can talk a little more about the 94% because there's been some controversy about
01:16:59.020 that number. But that's a soapbox for me. You got to be ready to get into that one.
01:17:03.660 So let's say more about that then. So where's the controversy around that? Are there people who
01:17:07.880 are arguing that that number is too high, that that's overstating it?
01:17:11.440 What's interesting is nobody's actually addressed the number. And so you'll tell me where you want
01:17:16.980 me to, okay, that's enough of that, Mark. Let's go on to something else. But let me just start.
01:17:21.100 This is why a crash is not an accident. A crash is preventable. An accident implies it's inevitable.
01:17:29.020 And most people don't know this, but the word accident started getting used mostly in
01:17:33.840 occupational environments like the work setting in the 20s and 30s when companies didn't want any
01:17:39.440 responsibility of on-the-job injuries and deaths. Oh, that was an accident. I always say it's like
01:17:44.400 it was an act of God. It was unintentional. Nobody could do whatever. And part of the reason I bring
01:17:49.500 that up is because if you believe in safety, that by investigating, you can determine the causal and
01:17:56.340 contributory factors and then intervene in some way by changing, so you prevent it from
01:18:01.340 reoccurring, you prevent whatever, then that's what safety is about. It's a crash. And if you
01:18:06.940 didn't believe in that and you really think it's an accident that's inevitable, why are you trying to
01:18:11.280 make changes, investigating, et cetera? No. And that's the mission of the NTSB, investigate those
01:18:16.220 crashes to make recommendations so they don't reoccur. Okay? So what's interesting is there's two
01:18:22.000 elements to this. One is, as a NASA scientist, pretty much you could start any paper with 70,
01:18:27.680 because it was so well-established, 70 to 80% of incidents, you know, that pyramid we were talking
01:18:33.660 about, 70, 80% were human factors related. It was just a given after all this research and things
01:18:40.320 that were going on. You know, the Institute of Medicine report, right, To Err Is Human, 100,000
01:18:45.560 medical errors every year due to human, et cetera. And so it always surprised me. It's like 94%
01:18:52.080 shouldn't actually surprise anybody. But there is something which I call safety misdirection.
01:18:59.040 And that's where two things are going on. One, denial. Let's not deny what the causal or
01:19:04.680 contributory factor was. And the other is blame. And so the old version of safety misdirection is,
01:19:11.160 oh, the car companies are just blaming the drivers for stuff. Okay? And they want to sort of avoid,
01:19:17.000 it's kind of like the corporate accident language. Let's get away from that. And so I always point out
01:19:23.040 that what I learned at the NTSB is you can have safety or you can have blame. If you want to blame
01:19:27.760 people or da-da-da-da, that's not going to get you to the safety. The safety part is identifying the
01:19:33.080 causal and contributory factors and then intervene so those things don't happen again. Which, by the way,
01:19:38.760 even if it's the human, that very often, as we've already discussed, there's a different
01:19:43.180 intervention like technology, which our society happens to turn to a lot. When it's something
01:19:47.920 about our human behavior that's hard to change, well, what's the technology we might be able to
01:19:51.980 use that either supports, helps, or just eliminates the ability to do that? So that's one of the things,
01:19:57.680 the controversial part that's been interesting is people saying, well, if you talk about 94%,
01:20:01.240 you're just trying to blame the drivers, et cetera. Now, there are some car companies and some people
01:20:06.340 in the autonomous vehicle space that love to do that. So it's, again, how they use the data,
01:20:12.040 but scientifically, a huge part of this is human choice or error. Then we still have the challenge
01:22:18.540 of how we address those to make sure those things don't recur in another place?
01:20:23.460 So, again, it's a little bit of a soapbox for me when I hear about people like, oh, that 94% has
01:20:28.800 been, it's like, no, it's not. Nobody's actually arguing about the number. I think it's more the blame
01:20:33.780 and shame part, which again, doesn't get you safety no matter what. So you can't deny what
01:20:38.640 the sources are, and blaming people doesn't help. In fact, nowadays it's like, oh, you're just blaming
01:20:42.380 the driver. Then they point and they start blaming the car. They blame the software.
01:20:46.360 That's still not going to get things fixed so that you're going to save lives.
01:20:50.080 So there you go. I'm with the 94%, and that other 6%, the last event in the chain,
01:20:56.460 is a very small percentage. It's going to be weather or potentially defects.
01:20:59.480 I don't know if you will have this statistic, but while we're on the topic of cause and fault,
01:21:07.740 we talk about nearly 50,000 people a year died last year. Do we know how many of those people
01:21:16.480 were the people who were the cause of the mistake that led to the crash versus people who were
01:21:23.880 presumably not doing anything that would have led to a mistake?
01:21:29.280 We don't have that level of detail of those specific crashes that occurred to be able to tell
01:21:36.000 that. The reason I'm asking, by the way, is to sort of make the point that there are two elements to
01:21:41.500 this. There is what can I do as a driver to make sure I'm not the one who makes the mistake?
01:21:48.560 And then what can I do as a driver to make sure that when the other driver makes the mistake,
01:21:54.680 I'm in a position to react better or see their mistake before it happens.
01:22:00.220 And I think about this a lot as a dad who is starting to talk a lot about this to his daughter.
01:22:08.360 If I'm driving her to school (just ask her), it's a nonstop lecture. It's, hey, Olivia,
01:22:14.680 did you see how I did that? Did you notice why I did that? Why did I look there? Did you see what
01:22:19.900 that driver just did? What I'm assuming is that we have to be vigilant on both fronts. It's not
01:22:27.300 enough to just say, I'm going to drive at a responsible speed and I'm not going to hold my
01:22:33.800 phone and I'm not going to drink. That's a great first step. I don't think it's the complete step.
01:22:39.320 You have to assume this is the mantra I use, Mark. And my wife thinks this is a little grotesque,
01:22:46.460 but I don't care. I say it to her all the time. I say, I want you to imagine that somebody woke up
01:22:52.680 today with the stated purpose of killing you. So they've been handed an envelope. The envelope
01:22:59.640 contains your name and they're told to kill you today. But here's the catch. They're not allowed to
01:23:05.520 use a gun. They're not allowed to poison you. They have to do it with their car. So I said,
01:23:11.960 armed with that knowledge, how would you drive differently today? It's only one car. You're going
01:23:17.380 to see thousands of cars today. It's only one, but there is someone out here who's trying to kill you.
01:23:24.080 They're being paid to kill you. What will that knowledge do to your attention? And how will you
01:23:31.840 treat each intersection, each on-ramp, each off-ramp, each T-junction, all of those things that
01:23:38.040 we talk about? I don't know that that's a great way to go living your life. It's a bit morose,
01:23:42.380 but it's the only heuristic I've come up with to help with the other half of the equation.
01:23:48.080 That's why I asked the question, right? I don't know how to quantify the effect
01:23:51.740 size of each of these things. Is it worth the baggage and the overhead, the emotional overhead
01:23:58.080 of playing that game? Could be your life. So yeah, it's worth it. And I said this at the
01:24:05.020 beginning, which is thank you for bringing this topic up. Because at the core, like you say,
01:24:09.820 your mantra of what's under your control, what's out of your control, that's what we're talking about
01:24:15.260 here. But even the things that are out of your control, you can be more vigilant to some of the
01:24:19.900 risk factors or other things that you could do. At the higher conceptual level, I would say,
01:24:24.720 and maybe just a slight sort of reworking of exactly what you said. But part of this is
01:24:30.280 situational awareness, SA. I mean, you know this in racing. It's a big thing in aviation.
01:24:36.140 And I think it's what you just literally articulated with your daughter. Did you see how I looked at that?
01:24:41.740 Did you see that car over there doing whatever? So I think that is huge. And the second part is what
01:24:47.740 we would typically frame as defensive driving. It's a little different than what you portrayed,
01:24:53.320 but it is the same thing, actually, which is there are people out there that are not making
01:24:57.580 the good choices you are, who are on their phone or drunk or haven't had enough sleep. You
01:25:02.980 don't have any control over that. But with your situational awareness and driving defensively,
01:25:07.340 there are things you can do. So besides scanning here, looking there, you can come back again.
01:25:12.160 You can look at those intersections where you know the risks are higher so that if that raises your
01:25:16.020 vigilance, that could be enough for you just to pause at that stop sign longer.
01:25:20.120 And it becomes not even an error, let alone a near miss or worse. And so I think that's what
01:25:26.120 you're portraying. The situational awareness includes following the rules, doing the right things.
01:25:31.180 And then the other is the defensive driving part is don't think everyone else is doing the same thing.
01:25:36.340 They're not. And I would just say whether they're intending to get you or not,
01:25:40.920 those bad choices are going on. Those errors are going on. And so, yeah, I think that's actually a
01:25:47.020 great way to approach it. And I think, unfortunately, it's so easy to drive nowadays. A lot
01:25:52.260 of folks, especially with even newer technologies, some people have their heads so far out of the
01:25:57.300 game, they've lost the situational awareness, right? It's all about the music, the conversation,
01:26:01.900 et cetera. So all of that is lost. So, no, I think that's actually when you're doing your other
01:26:07.220 training with the muscle memory stuff, that needs to be in there, too, because part of this is the
01:26:11.160 mental game of making the right choices and knowing that other people won't, then what can you do?
01:26:16.520 Yeah. Let's now talk about some of these more contributing factors in a little bit more
01:26:20.060 detail. And I want to start with substance abuse or substance use or whatever you want to call it.
01:26:24.100 We've talked about alcohol, of course. Give people a sense of what it takes to reach 0.08.
01:26:28.820 Because I suspect it's a lot easier to get to 0.08 than people realize.
01:26:34.740 Yeah. In fact, most people in the safety arena with alcohol like to say that impairment starts with
01:26:40.800 the first drink. And I think that's right: as soon as you take in alcohol, that's going to
01:26:45.220 start changing your reaction time, your thinking; everything's going to start changing. I've got
01:26:49.560 a great model to demonstrate this. So I drive a racing simulator and I've done this game where I
01:26:55.740 will have a glass of wine and get in the simulator. And to be completely clear, with a glass of wine,
01:27:01.860 I do not perceive anything in my own level of awareness. I don't feel a buzz. There's nothing like
01:27:08.820 that. However, in the simulator, there's a noticeable difference. Even at a glass of wine,
01:27:14.320 where I'm sure I am below 0.08, because I weigh a lot, and I don't feel buzzed in any way, shape,
01:27:20.340 or form, I don't drive as well. I miss the apex more. I'm more likely to spin. It's a subtle difference,
01:27:28.920 but it's absolutely noticeable in that environment, which is much, much more demanding than just driving
01:27:36.120 home from the restaurant. But as a general rule for someone my size, I've never taken a
01:27:41.860 breathalyzer test. So I don't actually even know what 0.08 is. So for you, that's going to be two
01:27:47.320 to three beers and one good shot of hard liquor. Over what period of time? Within one to two hours.
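Rosekind's rule of thumb (two to three drinks within one to two hours for someone of Attia's size) can be sanity-checked with the Widmark formula that most BAC calculators implement. A rough sketch; the 14 g per US standard drink, the body-water distribution ratios, and the 0.015%/hour elimination rate are textbook averages, not figures from this conversation:

```python
def estimate_bac(drinks, weight_kg, hours, sex="male"):
    """Rough Widmark-style blood alcohol estimate, in percent."""
    alcohol_g = drinks * 14.0            # one US standard drink ~ 14 g pure alcohol
    r = 0.68 if sex == "male" else 0.55  # Widmark body-water distribution ratio
    bac = alcohol_g / (weight_kg * 1000 * r) * 100  # grams alcohol per gram body water, as a percent
    bac -= 0.015 * hours                 # average elimination rate, ~0.015% per hour
    return max(bac, 0.0)

# Three drinks in 1.5 hours for a 90 kg man stays under 0.08,
# but four drinks in an hour for a 70 kg woman does not.
```

As the conversation goes on to stress, a point estimate like this hides enormous individual variation (food, health, age, drink strength), which is exactly why impairment starts well before the legal limit.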
01:27:55.360 Okay. But you're bringing up a great point, which is why I always hesitate to give too much information
01:27:59.500 about that. Because you can go online with some pretty good BAC estimators, because your age,
01:28:06.400 your weight, your health. Exactly. All of those things, whether again, it's less alcohol content
01:28:13.160 in beers and wine, you know, versus again, hard liquor stuff. Over the time course, you were just
01:28:18.060 asking about: did you eat? And what did you have in your stomach when you drank? It's like, there's so much
01:28:22.020 there. I think actually even more important is what you were telling in your story,
01:28:26.020 which is there's very often a disconnect between the subjective experience and the objective one,
01:28:33.380 which is that very often people think they're doing a lot better. We see this parallel in alcohol
01:28:38.040 and sleep loss. I'm doing great. I'm really fine. When if you look at the objective performance
01:28:43.420 measures, they're just doing horribly or they're off, but don't realize how off they are. So that
01:28:49.120 disconnect is really important. That's why you say I'm good to go, whether it's not having enough sleep
01:28:53.820 or, you know, I had a couple of glasses of wine, but I'm okay. Had enough to eat and I'm good to go.
01:28:58.320 That disconnect, I think is actually bigger risk than sometimes the exact amount of alcohol you
01:29:03.180 have in your system. Yeah. And I would argue that the sleep one is just the same. If I don't have a
01:29:08.360 great night's sleep, it shows. And again, I think that's what's kind of cool about a simulator
01:29:11.880 is you're simulating driving around at 200 miles an hour. And at that speed, things are happening
01:29:17.720 three times faster than when you're in a car on the road, even on a freeway. And of course you're
01:29:22.960 turning constantly, right? So the stakes are so much higher and the forces and inputs are so much
01:29:28.320 more subtle. Any wrong input will lead to a significant change in output. And it is remarkable
01:29:34.780 how you can feel fine for the mundane task, but not for the drive. Add that to the course you're
01:29:42.300 creating, right? So besides the muscle memory stuff, you'd love to challenge
01:29:46.540 especially those adolescents with whatever's appropriate there. Sleep loss would be better than
01:29:51.920 alcohol, obviously. But think about that as a challenge just for them to say, I feel fine.
01:29:56.820 Give me the rating on the one to 10 scale and then show the objective measurements in driving. Like
01:30:01.520 you say, even the mundane ones. Yeah. Let's talk about other substances. And I think the most relevant
01:30:06.720 is cannabis. What do the data say about cannabis use and driving? It's not good, in that with
01:30:14.400 alcohol, there's a really well-established protocol with, you know, here's what
01:30:21.680 impairment means at 0.08. And by the way, we can use a breathalyzer. We can use blood. There are drug
01:30:27.420 recognition experts, DREs, that can use behavioral measures on scene, and we know they hold up in court.
01:30:33.660 With cannabis, there is still ongoing debate of what defines impairment. And so if you can't even get that
01:30:42.320 defined and then you have no roadside kinds of measures and it's harder, as you know, whether
01:30:49.200 you're using breath, hair, all these other things that people are trying to do, it ends up, well,
01:30:54.460 you know, it sticks around in your system longer or can be in the follicles for this long. It's like
01:30:58.740 there is no good measurement. So we don't have the impairment definition. We don't have good measures
01:31:04.640 to get us there to sort of figure out how we do that. And the other thing, which is really
01:31:09.580 dangerous, is the poly usage. It's rarely one of these on their own. With cannabis, you see alcohol
01:31:18.020 or other kinds of things that people are using at the same time. So that makes it even more difficult
01:31:22.780 to figure out, especially in the roadside kind of environment, of what people are actually using.
01:31:28.720 And so, I mean, it's now six, seven years since I was administrator, and all this research was
01:31:35.380 still going on to try and define what I was just describing. So as all these states now start
01:31:40.160 making it more legal, I always tell people when you're voting for that in your state, don't think
01:31:44.880 about your support of cannabis, but instead think about what that's going to mean if somebody next to you
01:31:49.500 on the roadway has been smoking and is impaired in some way, like the level of impairment with alcohol. People just
01:31:56.100 don't think of that part. And unfortunately, while the alcohol is pretty well defined, the cannabis part
01:32:00.480 is very poorly defined. Even in all that time, we still don't have it figured out.
01:32:05.360 Is there a manner in which we can at least extrapolate from the standpoint of understanding,
01:32:08.520 right? So alcohol is a CNS depressant. I guess at the risk of just talking through something that
01:32:14.620 seems self-evident, what is the mechanism by which you think ethanol is contributing to mistakes?
01:32:23.800 We're talking now about the subset of mistakes that lead to bad enough crashes and potential
01:32:28.160 fatalities. Like, is it that people end up driving too quickly when they're drunk? Is it that their
01:32:34.200 reaction times are so slowed when they're drunk? And not even drunk, let's not even use that word.
01:32:39.700 When they are under the influence of alcohol, what are this suite of mistakes that they're making that
01:32:45.200 are leading to these catastrophic errors? So I think the two big categories are any performance
01:32:52.120 related ones, which are degraded or impaired. So that would be sustained attention, reaction time,
01:32:59.080 decision making. And the other is the one we talked about, and that is the subjective perception
01:33:04.360 part being disconnected, where they have no sort of, you know, their decision making, like it's really
01:33:09.220 okay. So we know basically with alcohol, all those CNS things that you were talking about are going to
01:33:16.480 be, again, be degraded or impaired in some way. And again, I talk about this the same way with fatigue.
01:33:22.580 And I'll just use that as the example for the moment, but there's this like continuum. Everyone
01:33:26.800 is like, you're asleep here, you're awake over here, but the alertness in between is a continuum.
01:33:32.340 And so I think most people would absolutely agree being asleep at the wheel, not good. But that
01:33:37.640 continuum beforehand, we know that with sleep loss, circadian disruption, alcohol, that basically
01:33:43.200 your performance will degrade way before you actually, say, fall asleep or are drunk enough to go
01:33:48.520 unconscious. And that's why I call that degraded as opposed to the impairment where literally you
01:33:52.600 can't perform at all. But at least in the sleep realm, I mean, we can see 50% reduction in your
01:33:58.200 decision making, 30% reduction in your memory. We can see 50 to 75% increase in your reaction time.
01:34:04.840 These very specific things when you're in the driving mode, trying to deal with a situation are
01:34:10.260 going to affect you and your ability. And that's why I always tell people: way before you have
01:34:13.980 that fall-asleep, go-unconscious moment, you've got these performance degradations that put you at
01:34:19.160 risk. But in that sense, I mean, to me, sleep and cannabis seem very similar in that they're sedating
01:34:27.180 and there's clearly a spectrum from completely debilitated to mildly less functional. And there's
01:34:36.880 no real way to measure it directly the way you can use a breathalyzer for alcohol. Are there
01:34:43.320 differences between how we view drowsiness due to sleep deprivation or sleep interruption and
01:34:51.720 cannabis use, which again, I'm just bringing up cannabis, not because it's the only other substance
01:34:56.420 out there besides alcohol, but because as you said, it's becoming more and more ubiquitous. I think
01:35:01.220 there's a belief on some level, maybe somewhat warranted, that it could be less toxic than alcohol. I think there are
01:35:13.300 unintended consequences of that with regard to this domain.
01:35:18.260 Well, and I'm in complete agreement with you there, which is that I think it remains hypothesis,
01:35:22.980 but with everything we know about how physiology and the brain and things work, it's got to be.
01:35:27.480 We may not be able to quantify it quite yet. Here's a comment I often make about sleep loss
01:35:32.220 and circadian disruption. When you lose sleep, you disrupt the clock. All aspects of human capability
01:35:37.780 are degraded or impaired in some way. So I mentioned that because a new study comes out and says, oh,
01:35:43.280 we just link this now to cancer in a new way or immune function in a new way. And I'm kind of like,
01:35:48.860 okay, so now we have more data and we can be more exact in certain areas. Good thing.
01:35:53.660 But the overall comment holds, which is kind of what you're saying here too, which is everything we
01:35:58.920 know about how cannabis works in the brain, how it's going to affect us physiologically performance-wise,
01:36:03.560 we know that's not going to improve things. It's going to degrade them. And while we may get better
01:36:08.140 at quantifying how bad that is, let's be clear. It's not a good thing for driving under the influence
01:36:16.020 of that. I think that's what's going to happen here is that we will get there. There's some great
01:36:20.440 people at NIH and other places that are working on how to get that impairment level defined, to think
01:36:25.800 about what the test could be, et cetera. We'll get there. Took a decade or two, frankly, to get the
01:36:30.500 alcohol ones in the way they are now. We'll get there. But for the moment, let's be clear.
01:36:36.000 Everything we know is that's a negative. It's not a positive. And we may get better at it, but it's
01:36:41.260 still not going to improve things in some way. I'll just make another general comment, which is,
01:36:46.100 I think, the sleep part that's interesting. Everyone sleeps. They think they're an expert.
01:36:51.140 And it's kind of interesting that everyone is like, yeah, I've lost some sleep and, you know,
01:36:54.700 I'm still here. I'm fine, et cetera. Whereas, again, people will think about alcohol and cannabis.
01:36:59.320 Just, you know, that's more of a choice. And by the way, I don't drink or I don't do this,
01:37:03.100 et cetera. No one's ever going to say, yeah, I think that sleep thing, I just stopped it.
01:37:06.640 That's a requirement for just our existence. And so it's slightly different that way.
01:37:12.560 Are there any other prescription drugs that play a significant role in this? I mean,
01:37:17.860 the obvious ones that come to my mind would be benzodiazepines. But is there any other class
01:37:22.220 of drugs that frequently enough show up on the impairment side of the equation?
01:37:27.020 Thank you for bringing this up, because with drunk and drugged driving, people always think of cannabis,
01:37:33.120 et cetera. But that includes prescription medication and over-the-counters. So any
01:37:37.580 antihistamine that makes you drowsy, any sedating antidepressant, trazodone gets used,
01:37:42.360 any one of those that you're taking has the potential to affect your sleepiness alertness
01:37:48.180 level when you're actually driving. So I'm glad you brought that up because everyone thinks of the
01:37:51.800 big ones, alcohol for sure, cannabis, we got to start talking about. But these others,
01:37:56.600 you think about it, with a prescription, you don't want to just read the label. You want to
01:37:59.820 talk to your healthcare provider about that and see how much could that affect me? And the over-the-counters
01:38:03.860 as well. Those are all big issues. And we, again, it's the polypharmacy. You see that a lot,
01:38:08.740 that there's multiple things, but trying to pull out exactly what they were,
01:38:11.900 that's often after the fact. Post hoc, you go interview people and say, oh yeah, it's allergy season.
01:38:16.820 And are you taking the one that's sedating or are you taking the one that's the non-drowsy version
01:38:20.520 of it? Let's talk a little bit about speed, which still makes up a pretty sizable fraction of the
01:38:26.720 contributing factors. Is that changing? Is that less today than it was 30 years ago? It strikes me that
01:38:34.440 clearly cars are faster today than they used to be, but are people necessarily driving them faster
01:38:39.060 today? And is that not offset at all by the far greater impact of technology, which we haven't come to
01:38:45.540 talk about yet. So we should certainly spend some time talking about seatbelts and airbags, but
01:38:49.020 yeah, what's the general throughput of speed in this equation? It's a big factor. Force equals mass
01:38:55.280 times acceleration. It's all about the energy. So when you're going faster, you're talking about when
01:38:59.680 a crash happens, there's just more literally impact of what's going to come out of that particular
01:39:05.220 collision. And so it's interesting. Speed has actually gone up and that has more to do with the
01:39:09.280 capability of the vehicles, roadways, and the technologies. Think about, I mean, you're a racer.
01:39:14.100 There's always like, how do we get better, faster, et cetera.
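The "it's all about the energy" point can be made concrete: the quantity that matters in a crash, kinetic energy, grows with the square of speed, so even modest speed increases raise crash energy disproportionately. A minimal sketch (the masses and speeds are illustrative, not figures from the episode):

```python
def crash_energy_joules(mass_kg, speed_mph):
    """Kinetic energy of a moving vehicle: E = 1/2 * m * v^2."""
    v = speed_mph * 0.44704  # convert mph to meters per second
    return 0.5 * mass_kg * v ** 2

# Because energy scales with v squared, going from 30 to 40 mph
# increases crash energy by (40/30)^2, roughly 78%, for the same vehicle.
```

This squared relationship is also why the later discussion of road design and speed limiters treats even small reductions in travel speed as meaningful safety interventions.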
01:39:17.140 Although it seems that cars outside of SUVs, probably cars are getting a lot lighter.
01:39:22.160 So when you think about mass times deceleration, which I assume is a big part of what that force
01:39:28.140 is that people are experiencing, does that not offset it on the car side of the equation?
01:39:33.260 Not very much because generally cars are getting bigger. Now that we've added SUVs and others,
01:39:39.600 when we look at what's going on out there, people describe it as a weight bloat that's
01:39:43.740 occurred. And unfortunately, they do protect you. If you're in one of those energy related
01:39:49.740 kinds of crashes, the bigger the protective cage you have around you, the better off you are.
01:39:54.260 People have a tendency to say, I want to be in that vehicle, not in the little small
01:39:58.040 version where I could get killed doing that sort of thing.
01:40:01.700 I think what's interesting is we do see, and this is why it's so complicated, but we do see
01:40:06.420 during the last recession, during the pandemic, the deaths go down, but sometimes other particular
01:40:13.140 things like speeding goes up. And we can have all kinds of hypotheses about, are there fewer
01:40:18.580 people on the road? Is just the density different, et cetera? They're all hypotheses. We think about
01:40:24.020 those. We don't really know. But I bring it up because it's interesting that even the economic
01:40:29.380 environment of our society can end up having effects beyond the usual causal or contributing
01:40:34.640 factors, for example, that we're mostly focused on in this conversation. So even those things can
01:40:40.100 have that societal ripple. And I would just say for the speed side, two other things, which is,
01:40:45.940 I'm sure we'll talk about the safe system approach, and that has to do with safer roads. And one of the
01:40:51.100 things they look at is controlling speed through road design. So that's where roundabouts or what they
01:40:57.300 call road diets, where basically two lanes become one, and then instead we're going to have a more
01:41:02.180 dedicated pedestrian or cycling lane where you can use, et cetera. And then the last thing I'll just
01:41:06.760 say on the technology is there's a huge debate. It's not where the alcohol is, where the technologies
01:41:11.960 pass legislation, but there's a lot of discussion about speed limiters now and whether or not, as we've
01:41:18.300 been discussing, and I said this earlier, society very often when we have human choices or things that
01:41:23.780 we want to deal with, we go to technology. So all of a sudden now, speed cameras and speed limiters
01:41:29.500 and vehicles are in more discussion than they have been in a decade.
01:41:34.340 Yep. Makes sense. All of these things are, you'd have to pry them from our cold, dead hands if you
01:41:39.820 don't let me drive quicker and don't let me talk on the phone and whatnot. I want to go back to
01:41:44.980 another kind of deconstruct-the-accident question. Do you have a sense, or are these data knowable,
01:41:50.320 as to what fraction of those fatalities in the vehicle are a result of the integrity of the
01:41:57.600 vehicle being lost due to the collision itself versus a flip? In other words, how often is avoiding
01:42:04.060 the flip of the vehicle with a lower center of gravity a relevant part of this equation?
01:42:10.540 Yeah, that's actually a big deal. And when you look at how the crash tests are done, it's interesting,
01:42:15.560 they don't actually crash vehicle to vehicle. The crash worthiness of your vehicle is tested against
01:42:20.080 a big block of cement, basically. But now there are different versions of that. So there's a side
01:42:24.960 impact. And now over the last years, there's also an angled one as well, and then rear end,
01:42:31.240 and then you can add all kinds of other variations of that as well. And I would say that's the kind of
01:42:36.020 thing I don't keep in my head anymore. But there are some databases that get at least into things
01:42:41.820 like rollover. And that's where the side impact is critical, right? Because that's mostly going to
01:42:46.120 happen with the center of gravity when you're up on a curb or some other thing
01:42:50.760 that's going to tip you potentially, and again, the direction that you're hit, etc. But there is
01:42:54.840 some data on that to show sort of which are the versions. And that's why they added things like
01:42:59.320 the angled collision, basically, recently, just to figure out that, you know what, there's enough
01:43:03.820 of that going on, we should understand the crash worthiness of vehicles when they're hit at an angle
01:43:07.680 like that. Are electric vehicles indeed significantly safer than non-electric vehicles on the basis of a
01:43:13.780 lower center of mass? And on the basis of not having an engine, typically in the front during a
01:43:19.780 front collision, therefore it can have much more force absorptive capacity? Or is that more of a
01:43:24.940 marketing strategy? We don't have sufficient data yet to sort of make that factual comment per se.
01:43:31.700 But I think you've identified a range of the factors that actually differentiate electric
01:43:36.360 vehicles from the standard. And as you know, right now, one of the biggest things is just getting the
01:43:41.280 myths separated from the facts. There are still more fires in ICE vehicles than there
01:43:46.100 are in EVs. Some of that's about battery. Some of that's about just the population that's out there of
01:43:50.820 these different vehicles. I mean, it's like, you got to get into that level of data analysis. That's why I'm
01:43:55.800 talking about the segmentation and stuff. So some of these things, again, NHTSA.gov is one place, the
01:44:00.820 Insurance Institute for Highway Safety, they do a lot of great crash testing. And I'm sure we'll get into some
01:44:05.540 of that when we talk about some of the vehicle specific things. But you can get a little bit more of
01:44:09.780 that data. My caution is always that some of that is still in development, as far as
01:44:14.300 understanding the kind of differences. Hypothesis, like on what you were just identifying, but the
01:44:18.940 level of segmentation to actually make factual statements, I think we're still developing a lot
01:44:22.760 of that. Let's talk about where autonomous driving comes in. If you believe that north of 90%
01:44:30.980 of these crashes have at their root cause errors by humans, then you would think that an autonomous
01:44:41.960 vehicle, provided the entire network of vehicles is autonomous (that's always been the big if,
01:44:47.340 right? It's not enough to just have some autonomous vehicles; that could actually make
01:44:50.680 things worse; you have to kind of have everything being autonomous), would certainly have the potential
01:44:55.500 to do better. I don't think it goes from 94 to zero, but it seems that there's a glide path.
01:45:00.440 What is your view of autonomous vehicles and the hope that they bring to this problem? In many ways,
01:45:07.420 I wonder, is that the solution as opposed to built-in speed limiters, breathalyzers? I was also
01:45:13.840 going to half-jokingly suggest that if we can put all of those things in cars, we can also probably put
01:45:19.520 in eye flicker sensors and we can track micro-sleep because I think back to when I was in residency,
01:45:26.440 how tired I would be driving home and I could barely hold my eyes open. Clearly, we could sense that.
01:45:32.520 So there are clear things you can build in to make humans less likely to hurt themselves.
01:45:38.420 But at some point, you might just say, let's just put all of our energy into autonomous vehicles.
01:45:42.860 So this is a fun part of a conversation, which is the general comment we've already been discussing,
01:45:49.840 which is when society has issues that are around human behavior and choices and errors we make,
01:45:55.460 then do we often look to technology as a way to help us do better, save lives, improve situations?
01:46:02.520 So there's another study that NHTSA did where they actually looked at 14 different technologies as
01:46:09.280 straightforward as seatbelts, airbags, electronic stability control (14 of them). And over 52 years,
01:46:16.220 how many lives did those technologies save? 613,501. And those are just the 14 they looked at,
01:46:23.620 which were related to the Federal Motor Vehicle Safety Standards and other things that are
01:46:26.960 in vehicles. You can't quite do the math this way, but I point out that at 40,000 lives a year, we could have gone 15 years
01:46:35.920 with no lives lost on our roadways. Kind of our conversation, not necessarily injuries or crashes,
01:46:41.620 but technology works. And I'll say this as the caveat to start with is the potential is unbelievable,
01:46:48.540 but we still got to prove it's going to work, fully acknowledging that these new systems will also
01:46:54.920 introduce new risks. Okay. I think, again, as I told you, people like to point their blame,
01:47:00.100 and the technology still has to prove itself. And we also have to acknowledge that now we're going to
01:47:04.320 have software risks and other kinds of machine learning and technological risks that are introduced.
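The 15-year figure above is simple division of the two numbers just cited; a quick sketch, assuming only those figures:

```python
# Back-of-the-envelope check of the figures cited above:
# 613,501 lives saved by 14 technologies over 52 years,
# against roughly 40,000 U.S. roadway deaths per year.
lives_saved = 613_501
annual_deaths = 40_000

equivalent_years = lives_saved / annual_deaths
print(f"{equivalent_years:.1f} years")  # about 15.3 years of zero roadway deaths
```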
01:47:10.320 We got to make sure we address those as well. Your point that we may get to zero in some areas,
01:47:14.920 should we even expect that in others? Having said that, what you just described actually is the way I
01:47:20.340 think we need to be using technology. So those are really in two big areas. The advanced driver
01:47:28.920 assistance systems, ADAS, are the things that are already in cars. I'm sure you've got them in yours.
01:47:35.540 Automatic emergency braking, lane keeping assist, cruise control, rear camera, all these different
01:47:41.820 systems that are independent. And so when you think about it, those are a part of, again,
01:47:46.680 advanced driver assistance is how do we support the driving task by giving you these things that
01:47:52.040 we know when you look this way, you can't be looking that way, but we could give you visibility
01:47:56.240 to that situational awareness again that could actually help with that. The higher level is a
01:48:01.580 fully autonomous vehicle. And let's just say, you know what? We're going to take out the steering wheel,
01:48:06.700 the pedals and everything else. It's going to do the entire task. And the way sometimes this is
01:48:11.420 separated is at the lower levels. SAE actually has these five levels, and I'm not going to get into all of
01:48:15.980 that. But you can think about it basically as at the lower levels, the human is responsible for
01:48:21.620 monitoring the environment and the vehicle. There's an interim level, level three, where the vehicle
01:48:29.120 will do some of those things. But if it needs you back in the loop, it's going to let you know,
01:48:33.980 take the wheel. And at level four and higher, the vehicle is going to do it for you. It's going to
01:48:39.060 monitor the environment and the vehicle and handle the driving task.
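The split of monitoring responsibility described above can be sketched as a small lookup; the level descriptions below are paraphrases of this conversation, not SAE J3016's official wording:

```python
# Simplified sketch of the SAE driving-automation levels as paraphrased
# in the conversation above; not SAE J3016's official definitions.
SAE_LEVELS = {
    0: "no automation: human does everything",
    1: "driver assistance: human monitors environment and vehicle",
    2: "partial automation: human still monitors (e.g. integrated ADAS)",
    3: "conditional automation: vehicle monitors, may hand control back",
    4: "high automation: vehicle handles the task within its domain",
    5: "full automation: vehicle handles everything, everywhere",
}

def who_monitors(level: int) -> str:
    """Who is responsible for monitoring the environment and vehicle."""
    if level <= 2:
        return "human"
    if level == 3:
        return "vehicle, with the human ready to take the wheel"
    return "vehicle"

print(who_monitors(2))  # human
print(who_monitors(4))  # vehicle
```

This is why Tesla's Autopilot, discussed next, still counts as level two: the integrated assistance systems act, but the human remains the monitor.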
01:48:42.740 Just so I understand, Mark, is Tesla the best example of, is that the furthest along technology
01:48:49.200 in the driver assist commercially? I'm not sure it's further along, but I would say it has the
01:48:55.060 most integrated systems of pulling those separate systems in. But what you just said that's so
01:49:00.320 important is the label that often gets used, autopilot, gives a sense that it is autonomous
01:49:06.300 driving, when in fact, it's what you just said. It's actually a collection of driver assistance
01:49:12.180 systems, which even they publicly acknowledge, because of regulatory requirements, is level two.
01:49:17.640 I was just about to say, is that level two or three? But it's level two.
01:49:20.780 Yeah, you got it. You got it. And probably that's one of the more integrated systems that uses
01:49:25.340 all these different things. But you're seeing even the traditional automakers basically offering all
01:49:30.680 these different systems that are in there. So one of the ways, take a step back, is to realize if
01:49:37.340 we're going to take the 280, 300 million cars that are on the road today, it can take anywhere from 10
01:49:44.300 to 12 years to actually penetrate the fleet with new technology.
01:49:50.680 That seems optimistic. I was going to say probably longer. I mean, it would seem to me at least 20
01:49:56.120 years to turn over that fleet. And that would also include, like, this is going to be one of those
01:50:00.680 moments where I imagine a real regulatory challenge. So I love internal combustion engines. I am an
01:50:08.120 internal combustion engine junkie. And not only that, I love naturally aspirated internal combustion
01:50:12.720 engines. And I'm not going to embarrass myself by saying how many of those I have. But I have a belief
01:50:19.180 that those things aren't going to be made for very long. And that's why I love to have them. I like the
01:50:23.940 engineering. I love the sound, et cetera, et cetera. So for example, like a 1967 Corvette, you can still
01:50:29.560 drive it today. Even though it doesn't meet a single safety or emissions standard that exists
01:50:35.360 today, it gets grandfathered in as you go along. But when we start to think about autonomous vehicles
01:50:41.540 and you get to this point where you say, well, look, for the system to really work, every vehicle
01:50:46.580 must be a level four or level five. That is a totally different regulation. That's no longer just
01:50:52.460 saying to the car producer, you can't make a car that doesn't meet these requirements. It's saying to
01:50:57.920 the consumer, oh, by the way, you can't have that car anymore. Is that what's necessary here?
01:51:03.020 My one thing for you is horses. When people say, I'm never going to be able to buy my ICE vehicle
01:51:09.020 again or one in the future, it's like, you know what? People still breed them, raise them, race them,
01:51:16.380 own them. They're always going to be ICE vehicles. You're going to still have it around and do what you
01:51:21.540 need to do. But I think it also brings up the enormity of the problem.
01:51:25.600 Yeah, this is the biggest infrastructure problem I can imagine.
01:51:30.820 Yes. And that's why when I was administrator, we actually put out the first federal automated
01:51:36.360 vehicles policy and people were screaming, well, that's a nice policy, but like, what are you going
01:51:41.240 to regulate? And I used to say, what are we going to regulate? It's just a huge issue. The autonomous
01:51:46.580 vehicles aren't there yet. There's some great programs going on demonstrating their potential,
01:51:50.740 but they're not there yet for full deployment throughout our entire society. They're not
01:51:56.060 there yet. They're great demonstration, pilot programs, larger ones. And there are a lot of
01:52:01.680 problems that are going on as well with some of the companies. So that is still, again, to be proven
01:52:06.900 and understanding where the risks are with that model, we still need to get there. That's why I think
01:52:12.280 one of the ways to conceptualize this is we're probably on a path. There used to be a debate,
01:52:18.020 and I think we know what the answer is now. Do we just go along the levels, one, two, three, four,
01:52:22.880 you know, until we get up to four and five fully autonomous, or do we jump right to four and five
01:52:26.740 and let's go for the full thing? I think we're seeing it now just by what we're talking about.
01:52:31.320 ADAS is that stepwise, get people familiar with the technology, see where the advantages are,
01:52:37.120 see where it helps you, see where it might actually save your life, et cetera. That'll move us toward
01:52:41.740 it to where eventually it's like, I'd like to just get my hands off the wheel, period,
01:52:45.740 at least for my commute or something else, let the vehicle do it. Again, if you want to race or do
01:52:50.540 something else, that's a different environment, different task you want to do. But I think now
01:52:55.800 that question's been answered that we're probably going to be going through these steps
01:52:58.980 where technology is introduced. We need to get enough data to show it's really going to make
01:53:04.100 a difference. And then when we have that, I think people are like, okay, it's built in,
01:53:07.980 it's working, et cetera. And then there'll be more comfort at some point. The two examples I'll give you,
01:53:12.720 one is when I was administrator, one of the things we did was actually challenge the auto industry
01:53:18.820 to make automatic emergency braking standard on every new vehicle by 2022, or at least 95,
01:53:25.780 99%, like of all the new vehicles. We originally had 10 manufacturers. And a couple of months later,
01:53:32.040 when we launched this program to try and get this going, we had 20 manufacturers basically covering
01:53:37.840 about 95 to 99% of all new vehicles. And the whole idea was they would get in the room and spend some
01:53:43.360 time saying, what are the criteria that we want to see everybody? It's called democratizing safety.
01:53:48.420 So it's not just if you buy a higher end car, or it's an option that I can, it's like, no,
01:53:53.060 democratize means every new vehicle, AEB. And the Insurance Institute for Highway Safety says,
01:53:57.860 if every car had AEB, we could probably reduce 50% of the rear end crashes with that.
01:54:03.260 Wow. Yeah. And just to be clear, Mark, this is not the warning that comes on when you're about to
02:05:09.700 hit somebody, but it actually will brake if the distance and rate of speed between you and the
01:54:16.320 car in front of you triggers an algorithm that says, you're going to hit this thing.
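A toy sketch of the distance-and-closing-speed trigger being described, using a simple time-to-collision threshold; the thresholds and decision structure here are illustrative assumptions, not any manufacturer's actual AEB algorithm:

```python
def aeb_action(gap_m: float, closing_speed_mps: float,
               driver_braking: bool, warn_ttc: float = 2.5,
               brake_ttc: float = 1.2) -> str:
    """Illustrative AEB decision based on time-to-collision (TTC).

    gap_m: distance to the vehicle ahead, in meters.
    closing_speed_mps: how fast that gap is shrinking, in m/s.
    The warn/brake thresholds are made-up illustrative values.
    """
    if closing_speed_mps <= 0:
        return "no action"           # gap is steady or growing
    ttc = gap_m / closing_speed_mps  # seconds until impact at current rates
    if ttc > warn_ttc:
        return "no action"
    if ttc > brake_ttc:
        return "forward collision warning"
    # Imminent collision: brake for the driver, or add braking power
    # if the driver is braking but not hard enough.
    return "boost driver braking" if driver_braking else "automatic braking"

print(aeb_action(gap_m=60, closing_speed_mps=10, driver_braking=False))  # no action
print(aeb_action(gap_m=20, closing_speed_mps=10, driver_braking=False))  # forward collision warning
print(aeb_action(gap_m=10, closing_speed_mps=10, driver_braking=True))   # boost driver braking
```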
01:54:20.180 Peter, this is why it's so great to have some time to discuss this stuff. AEB actually has multiple
01:54:25.300 elements to it. The warning is one. Another is if you don't brake, it'll brake for you.
01:54:31.100 Another one is if you brake, but you don't brake hard enough because of the distance,
01:54:35.060 it will actually add braking power to what's going on. And so that was what this working group
01:54:40.840 was, is let's figure out what the performance criteria will be for this AEB mandate that we
01:54:47.440 want to see. Now, why I'm bringing this up is because it was not a regulation. And pretty much
01:54:51.840 that's because every regulation that had come through recently, like when I was there, was 10 or 12
01:54:57.380 years in the regulatory pipeline. And so we said, you know what, let's just challenge them
01:55:02.420 and see who would agree. And like I say, we ended up having 20 manufacturers come together.
01:55:07.320 And I mentioned that because somebody just put a report out that basically they met that requirement.
01:55:12.640 Now, why is that relevant? Besides the fact that everybody's got AEB and all those new vehicles,
01:55:16.900 again, to our point, that's not retrofitting everything else. So there's still plenty of cars that
01:55:21.600 don't have this yet. It's going to take a while for all of that to change. The other thing that most
01:55:26.820 people, it's not a secret, but most people don't understand this, but to get a regulation through,
01:55:31.640 you need to have the Office of Management and Budget basically do a cost-benefit ratio, right? They
01:55:36.540 got to do a calculation that says it's worth it. It usually takes a penetration of at least 10% of
01:55:41.500 the vehicle population to get enough data to do that kind of analysis. OMB needs to do an analysis
01:55:52.200 of something because it's the government that's mandating it. That's why they have to be able to
01:55:59.320 say, yeah, if we mandate this, the cost to the industry to do this is X, but then how do they
01:56:05.280 assign a cost to life? Oh, the numbers are there. Always debated, but do a search on it. Yes, but the
01:56:11.060 number of the cost of life doesn't include loss or pain and suffering to the people who are left
01:56:16.220 behind. It's economic loss of life, meaning you lost 10 years of your life working where you would
01:56:21.940 have made this amount of money. So it's probably an underestimate of the true value of life by far.
01:56:26.780 Absolutely. And part of what I'm bringing up is you typically need at least 10% penetration in
01:56:31.940 the vehicle population even to collect the most basic data to send to them. Otherwise OMB doesn't
01:56:38.580 even have the data for that cost-benefit analysis, their straightforward trade-off: we have
01:56:43.560 this technology, how many lives do you think you're going to save? And what data do you have to
01:56:46.480 demonstrate it and substantiate it enough for the cost to the car, the manufacturer, the society?
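The lives-versus-cost calculation being described can be sketched with invented numbers; every figure below is a made-up illustration, not NHTSA or OMB data:

```python
# Toy version of the OMB-style cost-benefit comparison described above.
# All numbers are invented for illustration only.
cost_per_vehicle = 300                  # added cost of the technology, dollars
new_vehicles_per_year = 15_000_000      # hypothetical annual new-vehicle sales
lives_saved_per_year = 1_000            # hypothetical estimate from field data
value_of_statistical_life = 10_000_000  # dollars; the real figure is debated

annual_cost = cost_per_vehicle * new_vehicles_per_year             # $4.5 billion
annual_benefit = lives_saved_per_year * value_of_statistical_life  # $10 billion
print(annual_benefit > annual_cost)  # True: the mandate pencils out in this toy case
```

As Peter notes above, the value-of-life figure used in practice excludes pain and suffering to those left behind, so the benefit side of the real calculation is likely understated.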
01:56:53.580 So again, I think that's one of the challenges you see going on to what we were saying before is that
01:56:58.440 seeing these stepwise additions to these driver assist systems are getting added. Things like AEB that
01:57:05.480 are now standard allowed that. So the agency just came out recently with, there's something called an
01:57:11.100 advance notice of proposed rulemaking and a notice of proposed rulemaking. I don't know
01:57:15.180 which one it is, I forget, but basically they're now going to regulate AEB. And I like to point out
01:57:20.240 as part of the reason that they'll be able to do that at an even better level, like maybe extending
01:57:24.300 it to trucks, heavier vehicles, et cetera, is because of this other work that was done that
01:57:29.300 democratized it as something that needs to be out there, that everyone should be able to have in
01:57:33.460 their vehicles. So again, I think we will see this stepwise going, but I also point out that when we want
01:57:39.900 to get to the full autonomous vehicle, you said this earlier, I want to bring us back to it.
01:57:45.260 The mixed fleet is going to be a huge problem for everybody. And if you think about it, it's going
01:57:51.080 to be at least three groups, fully autonomous vehicles, vehicles that have a lot of the ADAS
01:57:56.880 support systems, and people that are just driving cars that don't have any of that stuff in it.
01:58:01.780 And they'll all be on those streets. You won't know which is which necessarily.
01:58:05.560 Mercedes just came up with a different color light for when it's in autonomy for some of the
01:58:10.400 things that they're doing. But earlier, our conversation about defensive driving is like,
01:58:15.080 just think about how more challenging that kind of environment is going to be.
01:58:18.960 Well, let's touch briefly on drowsiness. I realize this is kind of your life's work.
01:58:24.120 It's also the one that I think people who listen to my podcast regularly are very familiar with the
01:58:29.120 implications of sleep deprivation. What are the most important things you think you want people to
01:58:33.340 understand with respect to how compromised sleep impacts driving?
01:58:39.600 I'm going to repeat what I said earlier, which is lose sleep, disrupt your clock, the circadian
01:58:44.580 clock in our brains. Lose sleep, disrupt the clock, and you will pay for it by having human
01:58:51.280 capabilities degraded or impaired. That is across the board. So I don't care what your job is or what
01:58:59.100 task you're on. But if you're not getting the right sleep at the right time, you're going to
01:59:04.040 pay for it. This is one of those you can't fool human nature, just nature, period. You can't fool
01:59:08.880 it. There will be a price to pay. And is there any regulation about when a truck driver can drive?
01:59:14.380 I understand there's clear regulations about how many hours he or she can drive. But are there
01:59:18.980 regulations that say, we'd like you to stay within your circadian rhythm and do your driving?
01:59:23.780 We'd like you to see a sunrise and sunset and drive accordingly, even though I know that that's not
01:59:28.820 necessarily convenient and it might be far more convenient to drive through the night.
01:59:32.240 So what's really interesting is if you understand sleep in the clock, some of what we would do as a
01:59:38.360 society is exactly opposite the way the world actually runs. And so there are regulations in
01:59:45.540 aviation and trucking and rail and a lot of other places, as you already mentioned, the resident
01:59:52.100 training hours, et cetera. To your point, I always point out there's the two physiological elements
01:59:58.580 you've got to deal with, the sleep part and the clock. Usually it's easier to deal with the sleep
02:00:02.580 part. And people still don't do that well. But as you know, just to be clear, if you don't have 10
02:00:08.160 hours off, between your job and commuting, you can't get eight hours of actual supine sleep opportunity. It doesn't mean
02:00:13.980 you're going to get the sleep, but at least it's an eight-hour opportunity. Maybe nine hours if, like residents
02:00:18.460 used to do, you live in the hospital, maybe. But so yeah, you can do a little bit better on the
02:00:23.080 sleep part. But to your question, the circadian piece is a lot harder. And keeping that stability
02:00:29.380 in a 24-7 society, military operations, air traffic control, moving goods, operating airports,
02:00:36.140 it's like these are 24-7.
02:00:38.120 This speaks to, again, I think the general theme here that I'm hearing, Mark, is we're going to
02:00:42.520 quickly approach the limits of what humans are willing to do in terms of the sacrifices they're
02:00:48.160 willing to make. And therefore, we have to come up with technology to work around that. We're not
02:00:54.180 going to be willing to live in a society where nothing happens at night. So we have to automate
02:01:00.000 those processes so that we don't have to rely on individuals functioning at fractional reserve and
02:01:06.600 fractional capacity.
02:01:08.380 You got it. But we are so far away from that in our society now.
02:01:11.460 Yeah, it might not be in our lifetime. You and I probably won't appreciate the full extent of
02:01:17.080 pick your favorite sci-fi movie where the trucks are just autonomous vehicles that are transporting
02:01:22.100 things on freeways at night.
02:01:24.060 Right. I say that because you have to start at the beginning, which is the information, even the naivete and ignorance,
02:01:31.980 of society around the dangers, not just the performance. That awareness is still emerging.
02:01:37.860 I mean, it's been going on for a while. I feel like Bill Dement, who
wasn't just a professor of mine, but became a friend and colleague and teacher and mentor,
02:01:45.840 all those things. He was probably one of the leading voices who said, we got to pay attention
02:01:50.180 to this because the price is too high. And he always pointed out that if we don't start with
02:01:54.580 education, so that people appreciate not just what the cost is, but for me and Bill, there's a
02:02:00.080 lot of like, if you appreciate what the benefits are of getting the sleep you need and keeping the
02:02:05.240 schedule, et cetera, that should outweigh putting yourself through this other stuff. That's the
02:02:10.300 individual piece. But then it gets to what you just described is if we really accept that as a society,
02:02:15.320 then we have to move to what are the solutions going to be, right? And again, it's not always
02:02:20.140 intervening on the individual, give them drugs. It's like, no, it doesn't have to be that. It's
02:02:24.340 going to have to be these other things, like technology that helps us offset those choices
02:02:29.060 we make as a society that could still put us at risk. Yeah. Let's close our discussion, Mark,
02:02:34.760 with a couple of things that people can take away as far as actions that they can take to
02:02:40.920 avoid hurting others and being hurt. And let's start with that of being a pedestrian.
02:02:46.700 So what can an individual do as a pedestrian on foot to minimize their risk? Let me start
02:02:54.820 with a basic question. How many times is a pedestrian killed when they have the right of way in a
02:03:01.420 situation? So they're not jaywalking across a red that they shouldn't be. They're doing the right
02:03:06.560 thing. They're either up on the sidewalk and a car comes up on them or they're crossing when they
02:03:11.500 should be crossing and someone runs in there. Do you have a sense of that? I don't. It's a great
02:03:16.080 question. And I'm not sure we have that data just because of the categories that these would be
02:03:21.160 placed in. Because again, if they weren't doing something illegal, I'm not sure that that kind
02:03:25.440 of information would necessarily be collected. So in other words, we could probably back into it
02:03:30.060 if we, I assume the data are tabulated for every time the pedestrian is quote unquote at fault. And I
02:03:35.960 know we're trying to get out of the at fault mentality here and just solve the problem. But you could
02:03:41.040 say, look, of the 42,000 people who died, 6,000 were pedestrians. And we have a really clear example
02:03:48.500 of a thousand of those pedestrians were the ones at fault. And the driver tried to avoid them at the
02:03:54.200 last second, but couldn't. And then we could maybe say, well, look, in 5,000 of those cases,
02:03:57.920 the pedestrian was not. But do you have a sense of what a pedestrian can do specifically around
02:04:03.320 their situational awareness? What do they need to be on the lookout for to reduce those odds?
02:04:08.580 You bet. And I think it is worth noting, again, in the last decade, that number has gone up 50%.
02:04:13.920 This last year, the estimates are 7,500 people have died as pedestrians of those numbers. So
02:04:20.960 that's significant. So to the specific actions, the choices that you have,
02:04:25.840 one is if there's a sidewalk, be on it. And again, that's just separation right from the driving
02:04:31.620 that's going on. If there is no sidewalk, you actually want to walk against the traffic.
02:04:36.620 So that's for visibility stuff of what's going on. You already mentioned it. It ends up,
02:04:42.320 I mean, we do know 64 to 67% of those fatalities occur outside of intersections. So they're happening
02:04:50.800 elsewhere. But it does mean that if you're crossing, you want to find an intersection that's got a
02:04:55.200 crosswalk if you can, and has some system of stop signs or a countdown for when you should go.
02:05:00.720 And you want to follow those rules as best you can. Okay. The other one, I think, is visibility.
02:05:06.740 I can't tell you how, especially at night, which is when most of those occur, over 50% are simple,
02:05:13.020 straightforward visual things, making sure people can see you. And the other one, I think,
02:05:17.320 nowadays in particular, we mentioned it for driving, but situational awareness is critical.
02:05:21.620 And so it's amazing how many people could be either drinking or on their phone with head down.
02:05:26.660 So distraction, the kind of things you think about as a driver, you would apply to pedestrians and
02:05:31.640 cyclists as well. Are cyclists included in that 7,500 deaths per year?
02:05:36.780 No, I think those are just pedestrians. And I don't remember the cyclist number, but it's there.
02:05:41.740 Yeah. As someone who rides his bike still outside, though, not nearly as much as I used to. And
02:05:47.000 frankly, a lot of that is due to just my lack of faith in drivers. I certainly adhere to the
02:05:52.860 principle of always assuming the car doesn't see me and always assuming that the driver is a moron.
02:06:00.080 And those assumptions are surprisingly accurate more often than I would like to acknowledge. I mean,
02:06:07.440 the amount of people that will pass you, so they've presumably seen you, only to then want to turn
02:06:13.520 right directly in front of you is remarkable. But yeah, I think all of those make sense. The walking on
02:06:20.540 the other side of the road, the being completely aware. And I just think as a general rule, as a
02:06:25.420 pedestrian, also assuming drivers don't see you, they're calibrated to look for bigger, faster
02:06:31.580 things than you. So you just have to kind of assume they're not seeing you.
02:06:37.140 You know, in California, pedestrians have the right of way wherever they are. But I always point out,
02:06:42.180 having the law on your side.
02:06:43.160 Having the law on your side means nothing if you're hit and dead or injured. I always find that to be
02:06:47.700 the dumbest argument ever, which is assume you never have the right of way, even when you do have
02:06:54.000 the right of way, when you have that big green light and the big walking sign, still assume that a
02:07:00.300 driver is going to make a mistake because the consequences are much higher for you than them,
02:07:05.120 even though legally the law's on your side. Anything else, Mark? We've sort of throughout this
02:07:10.520 discussion infused as much insight we have as far as behaviors that you can take into your own hands.
02:07:17.640 And aside from the obvious, I think the obvious ones are so clear. But I do, again, want to
02:07:22.140 reiterate this thing of looking both ways through the intersection. I always tell people left, then
02:07:28.020 right, then left, because it's that left one that's going to kill you when you're driving. So left,
02:07:33.480 right, left scan before the intersection when you have right of way. The other thing for me,
02:07:39.240 Mark, is when I'm on a four-lane road, two in each direction, I'm never in the left lane unless passing,
02:07:46.960 just because, especially with these non-median roads out here, it's that much harder for somebody
02:07:52.620 to cross and weave into my lane. So it's staying in the right-hand lane. It's really funny. We've got
02:07:58.640 these 75-mile-an-hour roads out here that have no median, and there are typically two or three lanes
02:08:05.220 only in total, two in one direction, one in the other. And sometimes they're one in one.
02:08:10.240 And those are nerve-wracking. You know, one lane each way at 75 with no median. Those are roads
02:08:17.060 where I will not talk on the phone. I will not listen to a podcast. All I'm doing is imagining
02:08:23.140 how quickly could I veer off to the right if a truck got into my lane.
02:08:27.960 If I can just a slight nuance to what you said is part of the defensive piece of that is don't
02:08:34.560 expect or assume that people are going to actually follow the rules. And so you may be, you know,
02:08:40.740 at that intersection and looking, but it's like, don't think because someone's got a stop sign
02:08:45.340 there, don't assume they're going to follow the rules, whatever they are. And that includes yellow
02:08:50.080 lights, red lights. No, if there's someone there and they're a potential risk, your choice is whether you
02:08:55.820 decide to proceed through, slow down, pause. Those are the things that are under your control.
02:09:01.420 Yeah. It's really funny. Two days ago, I was driving somewhere and I was in the right-hand lane
02:09:06.200 and I was going to make a right-hand turn into something. And I didn't actually realize that
02:09:12.500 there was a driveway before where I make my right. I put my right blinker on, and this is a pretty
02:09:18.120 quick road. It's about a 50 mile an hour road. And I'm coming up to, I'm now probably about,
02:09:24.800 I don't know, 300 feet before where I'm about to make my right. And a woman came to the edge of the
02:09:32.220 road where, again, I didn't even realize there was a road. It's easy to miss road. And she looked,
02:09:37.680 saw me, but saw that I had my right blinker on and just pulled out, assuming I was going to turn
02:09:43.680 right there. Now you could argue, well, technically, I guess she's right. I had my blinker on,
02:09:48.020 but I had that blinker on long before she showed up thinking I'm turning, you know, whatever,
02:09:52.140 three, 400 feet up the road. And again, I had to really slam on the brakes not to hit her.
02:09:59.060 And she would have taken the brunt of that, not me. Regardless, again, it's not about fault. It's
02:10:03.580 what is the consequence of this awful collision? And I was amazed that she pulled out. She pulled
02:10:10.300 out when my rate of speed hadn't even begun to slow. And I wondered to myself, how does she not see
02:10:16.560 how fast I'm going? Like, what did she think I was going to do? Somehow, like I'm on a rail that's
02:10:22.380 going to allow me to turn that quickly. But see, that's a perfect, perfect example of what we just
02:10:27.340 talked about. Don't assume people are going to follow the rules or that what you're seeing is
02:10:31.200 what's going to happen. And to what you just said is she wasn't thinking, you know, she just assumed
02:10:37.380 that that right signal meant you were turning right into where she is.
02:10:40.680 And my point is, even though maybe I was at fault, when it's all said and done, maybe it's my fault
02:10:45.400 that I had the blinker on when I wasn't planning to turn right there. But I think taking fault out
02:10:49.520 of this is the right way to go. And instead thinking root cause effect, root cause effect,
02:10:56.040 because the only accident I've ever been in my life, Mark, was a very bad accident where the
02:11:01.840 other person was 100% at fault. They ran a red light while I was going correct speed limit,
02:11:08.940 which was 50 miles an hour through an intersection. And luckily for this woman,
02:11:14.500 I hit the passenger side, not the driver's side. She was the only person in the car.
02:11:20.640 I'm in a pickup truck. She's in a sedan. I hit her so hard that both vehicles were a write-off.
02:11:29.100 It's the only vehicle I've been in where the airbags deployed. And what struck me is
02:11:35.160 how amazed I was at how quickly it happened, because I was driving this way and she was
02:11:42.660 stationary to make a left. So it wasn't like I was looking at her. I saw her the whole time,
02:11:49.520 but never thought she would just jet out in front because it wasn't in my usual going through the
02:11:55.740 intersection, somebody over there, somebody over there. It was the woman right in front
02:11:59.320 who erroneously thought she had a green light to go, even though she had a red arrow.
02:12:05.300 And she just pulled out slowly. And I did probably get the brakes on before I hit her.
02:12:10.880 But I think about that. And I think had she had a passenger in that car, could have been a very
02:12:16.920 different situation. And it happens so quickly that you don't have to make the mistake to suffer
02:12:22.820 the consequence. I really appreciate you telling these stories. Some people are good at it. Some
02:12:28.120 people don't like to talk about them, but this touches everybody. I mean, you've lost someone
02:12:33.560 in a roadway and it's like the emotions you were experiencing are so justified and understood and
02:12:39.400 your approach to how you think about it now that are based in all of that. Same thing here. Those two
02:12:44.860 drivers you just talked about with different things. It's like neither one of them thought,
02:12:48.920 again, I'll go out today and see if I can get killed. It's not just the other way, right? It's that
02:12:53.000 one too. It's like, let me see if I can run into somebody to do it. It's like, they're not doing
02:12:56.960 that. But it's a complex dynamic environment with humans that are imperfect and make bad choices and
02:13:03.620 also errors in what they do. And how do we try and do that more safely? And I think, again, those
02:13:08.680 encapsulate everything we've been talking about. Big societal changes. How do we make this better
02:13:12.980 for everyone? But a lot of it comes down to your behavior, what you can control, the stuff that's
02:13:18.700 out of control, you can still do something about in some situations. I think it's also, we didn't talk
02:13:24.560 about this much, and you made just one quick comment in there. There's a whole culture around
02:13:29.160 cars that's part of what's going on here too. I mean, it's our independence, it's economics,
02:13:35.900 it's family, it's just sort of the American way, right? Don't tell me how. So there's a lot going
02:13:41.760 on here that also are at play, which I think when we talk about, are we ever going to have fully
02:13:46.640 autonomous so we can save lives? I'm not sure we know that yet. Maybe we work that way when the data
02:13:51.320 can justify it. But for the moment, there's a huge car culture that's in operation here as well
02:13:55.840 that clearly is affecting our choices and willingness to give certain things up for these
02:14:00.920 other societal benefits that we know we could attain. Yeah. Are there any resources, Mark,
02:14:06.440 that you would point people to who are kind of interested in the, what can I do? There are,
02:14:11.780 again, lots of things that hopefully will happen technologically and we can debate the policies all
02:14:16.840 day long. But at the end of the day, everybody listening to this is going to get in their car
02:14:20.340 today. And I hope we've given people a lot of strategies. Is there anything you would point
02:14:25.820 people to, especially parents maybe who have teenage kids who are starting to drive where we can,
02:14:33.740 as you said, I think get them started in the best way possible because you start to create
02:14:38.820 patterns of behavior early. So let's hold the kids for a moment. And I think the place to start a
02:14:45.560 few websites, NHTSA.gov, NHTSA.gov is a great website that you can go on. It has a VIN, vehicle
02:14:54.820 identification number lookup. So if you're wondering, do you have a defect and recall that you should
02:14:58.640 take care of, you can do that. Car seats, you can literally go on there and put your kid's birth
02:15:03.160 date, height, weight, and it'll actually give you recommendations of whether it's backwards,
02:15:07.600 forwards, booster seat, et cetera. Really good with that. And all the data we're talking about that I
02:15:13.540 can't keep in my head is on that website. And so it's a great source for things. We haven't talked
02:15:18.780 about it, but NHTSA runs a new car assessment program, NCAP. It's the five stars, stars for cars.
02:15:26.300 It's what's on the Monroney label on your new car. That's the stars evaluating its crash worthiness and
02:15:31.940 some other factors. What's it called? It's NCAP new car assessment program. Okay. So again,
02:15:38.160 when you're thinking about where should I start? Well, to start with the ratings. So NHTSA has got
02:15:42.580 all this different stuff. That's a great resource there. Next would be the Insurance Institute for
02:15:47.820 Highway Safety, IIHS. They do a lot of the crash testing, but they also have a lot of research that
02:15:53.420 they do. So they're the ones who do the top safety pick and other kinds of things. So that AEB challenge
02:16:00.000 I talked about, that was actually a collaboration between NHTSA and IIHS doing that challenge for the
02:16:04.980 industry to democratize AEB. And besides the crash information, they also have other studies and
02:16:10.280 things they do. So really good there. National Safety Council, NSC. They have great data because
02:16:17.660 they not only have car stuff. They have pedestrian stuff. They have cyclist information. They're really good
02:16:23.280 for general resources for safety, including roadway pieces. Sorry, that one again was?
02:16:28.940 That's the National Safety Council, NSC. NSC. Okay.
02:16:31.900 And the fourth one I would mention is Safe Kids Worldwide. And they actually certify car seat
02:16:39.780 technicians because it ends up, and I'm going to get this number wrong, but 60, 70% of car seats are
02:16:46.740 installed incorrectly. I know it's over 50%. So it's great to do all that work to get the right one,
02:16:51.940 but then it doesn't help if it's not installed correctly. So Safe Kids Worldwide actually does that too. I think those four
02:16:57.120 are great resources for different things. When you get to the teenagers, a couple things. AAA
02:17:03.580 actually has different kinds of driver contracts that you can use. The insurance company
02:17:09.580 also has a foundation. So they have a lot of good auto safety information there as well. But they
02:17:14.640 actually have contracts that you can do with your kids. And I would just say you should also look up,
02:17:19.380 because depending on the state, they may have graduated licensing. And that licensing for new
02:17:24.420 drivers actually includes, can you have other kids in the car? No. Can you drive after dark? No.
02:17:31.440 This for six months, this for a year, et cetera. And so I was just talking to a friend. It's like,
02:17:36.600 oh, we did the AAA. I'm like, you should look at the graduated licensing along with the AAA and just
02:17:42.020 see what the elements are and then come up with your own. And I hate to say this because as a scientist,
02:17:46.980 you'll appreciate this too, but feel free to experiment with her. Put in that training program you were
02:17:53.400 talking about. There's some simulator work. There's also some actual driving work. There's
02:17:58.040 the intellectual part that would supplement any formal stuff she gets from another source.
02:18:02.760 But do that. I mean, that's the intervention. I would do your own assessment then to sort of,
02:18:07.060 could you measure some of that in the simulator, for example, to see if now you have an end of one,
02:18:11.160 totally get it, but it's her life you're trying to save. So whatever you could do there to try and
02:18:16.680 bolster her with more education, experience, et cetera, that would make her again,
02:18:21.940 more situationally aware and defensive when she's driving along with the skillset, I think would
02:18:25.980 be great benefit. The other thing I'll just tell you that's interesting is we've seen in the last
02:18:30.440 five to 10 years, young drivers are less and less interested in getting their driver's licenses right
02:18:35.880 away. So we're seeing kids in college that don't know how to drive, you know, or they're waiting till
02:18:41.160 after college when they have a job or they're picking their location by public transit.
02:18:45.560 And this might be a great trend given what you said about the maturation of kids anyway. So yeah,
02:18:52.020 you know, one thing I'll just share with you with my daughter, she's not psyched about this, but
02:18:55.600 she's going to be driving a manual transmission truck. That's actually going to be her car. And
02:19:00.780 every time she fusses about it and says, Oh, you know, I just say, I mean, you could take the bus,
02:19:06.480 like that's cool too. But if you want to drive, you're going to be driving stick. And there's a big part
02:19:11.440 of that is the connection to the drive. And it reduces at least one of those variables in my
02:19:16.540 mind, which is the distractibility. And she's nervous about rolling backwards on hills. And I'm
02:19:20.640 like, good, concentrate harder. Yep. And learn the skill. Yeah. So that becomes a non-issue for you.
02:19:26.520 Yeah. Well, Mark, this has been really illuminating. It's quite disturbing as we've talked about. And I
02:19:32.640 almost wish there was a way that we could convey every one of the 42,929 stories of the people who
02:19:41.860 died two years ago, the most recent data we have here. I think it, on some level, it sometimes takes
02:19:47.340 those things. Unlike when you think of all the chronic diseases that I spend most of my time
02:19:51.120 thinking about and talking about, they disproportionately affect older people. Death from
02:19:57.600 cardiovascular disease, stroke, cancer, and Alzheimer's disease is disproportionately a disease of the
02:20:04.760 elderly. Whereas when I last looked at this, which was 2020, and I can't imagine the data have changed
02:20:11.040 that much. The thing that stuck out to me the most was that automotive deaths were the most uniform cause
02:20:19.720 of death by decade of anything in the top 10 causes of death. There was nothing more uniform
02:20:27.580 than dying in a car. Equal opportunity killer. Which is why let me, I know as we start wrapping
02:20:35.240 just two things that I think would be good, especially for you to be thinking about. One is
02:20:40.620 that, and these are where the parallels are there. One of the differences, however, is between health
02:20:46.120 and say the safety element, et cetera, is, and one of the reasons I've really looked at safety and
02:20:51.660 transportation from my sleep interests is death can be instantaneous. I mean, it's like in milliseconds and
02:20:59.080 you don't go out that day thinking it's going to happen. And as you were saying. Yeah, you don't get
02:21:02.160 to say goodbye to your grandmother who's got cancer for two years. Yeah. Someone comes to your door.
02:21:08.360 They're not coming home. It's like, just like that, right? And so I think that's one thing that
02:21:13.880 separates it in some ways from those chronic illnesses and many other kinds of things. But I would also say
02:21:20.180 there's some real significant parallels when I think about this, because one of the big pushes
02:21:25.020 that I've tried to make happen in the automotive realm is trying to move from a more reactive
02:21:30.700 safety culture to a more proactive one. Which again, I think the parallel there of what you do for
02:21:36.880 prevention and thinking about how do we eliminate or mitigate the things we know that can cause bad
02:21:42.540 stuff, promote the ones we know that will make good stuff happen. How do we do that? And I think
02:21:48.700 that's been a challenge having worked at NASA as a NASA scientist on the aviation side. They're very
02:21:53.180 proactive. I mean, in aviation in the US, no person died in a commercial aircraft
02:22:00.640 crash for 12 years. Okay. And then it was one. And I always tell people, it's like, it's so hard to
02:22:06.560 get there. It's even harder to stay there, saving all those lives. But that takes a proactive culture
02:22:13.120 that says we're going to do what we can to eliminate or mitigate, diminish what those risks are and
02:22:18.840 promote the good stuff we know that's going to make a difference. You do that all the time. This
02:22:22.840 is why I said thank you at the beginning, because bringing this into that realm, and not just public
02:22:28.240 health, this is like societal safety and what we get here. There's a real parallel there that there's
02:22:33.080 a chance to be more proactive. Now you think about it, investigation is reactive. Bad thing happens,
02:22:38.620 we investigate, but it's to make it proactive. That what we learn from there has to be translated
02:22:43.880 into some action that's going to prevent it from happening again. And that's where, frankly, and
02:22:49.200 always, again, I've already said thank you, I'll do it again. Just having people discuss this and think
02:22:53.460 about it, tell their story to someone else, those can save lives. I don't know your daughter at all.
02:22:59.560 I've known you for this period of time, but it's like, you're way better off, both the fact you've
02:23:03.960 talked about it, you've thought about it, and will basically provide that context and skill set to
02:23:09.420 your daughter. You may not even be around when it saves her life, or her kid's life, or some other
02:23:15.800 person in her sphere, because that's the societal change in a proactive way that's going to make the
02:23:21.560 big difference. And yeah, I think we need to save them one at a time. Your friend, my father,
02:23:26.140 the consequences are so significant, but we can't bring those back. How do we use those
02:23:30.880 as opportunities, though, to make the future for you, your daughter, her family, my kids, etc.?
02:23:37.100 What do we do to make that in a proactive way safer for the future? You're doing that straight
02:23:41.600 on with the healthcare. The parallel is there in this realm as well, because the costs are so high.
02:23:47.600 If there's one thing I could do based on this discussion, Mark, because I now realize there's so
02:23:51.680 many pieces that are changing with the technology that are going to move us in the right direction.
02:23:55.500 But if I could make a change today, based on what we discussed, it would be that I wish that for
02:24:01.520 every significant accident or fatality that occurred in a given city, the story was told
02:24:08.000 on two fronts. The story was told in the human sense of the story, so that we understood the life
02:24:15.620 or lives that were lost, and the consequences, and how that's going to ripple through forever.
02:24:19.880 How Nick's wife lost her husband and Nick's kids lost their dad, but also in a very clinical
02:24:26.340 autopsy-like manner of the accident. I really think that every time a horrible accident or fatality
02:24:33.760 occurred, if each of us could see a 60-second video that would say, this is what happened on
02:24:40.600 this date and this time, and these were the contributing factors, that's it. And everybody
02:24:45.500 has to invest. It's not a big price to pay for those of us who are alive, that for every time
02:24:50.300 somebody dies, you got to invest a minute in hearing how it happened. No other technology at
02:24:55.680 this point in time, just the explanation of what happened, is going to make us better drivers.
02:25:02.260 No question. This is why I was not kidding about write those down, the four-minute video or like what
02:25:07.220 you're talking about now. And I would just slightly extend to say, not just identifying the causal
02:25:13.120 contributing factors, but if there's any action that people could actually take to do differently,
02:25:18.000 and think about this with that local intersection you keep talking about, without what you just
02:25:21.940 described, it's going to keep happening. People aren't going to know what happened there, or if
02:25:26.000 there's anything different they could do to spare their lives, it's not going to change without what
02:25:31.200 you were just describing. Yeah. Yeah. Figure out how to do that. Mark, thank you very much. Really
02:25:37.320 appreciate your time and your insight today. Thank you. Enjoy the conversation.
02:25:41.020 Thank you for listening to this week's episode of The Drive. It's extremely important to me to
02:25:46.480 provide all of this content without relying on paid ads. To do this, our work is made entirely
02:25:51.460 possible by our members. And in return, we offer exclusive member-only content and benefits above
02:25:57.880 and beyond what is available for free. So if you want to take your knowledge of this space to the next
02:26:02.420 level, it's our goal to ensure members get back much more than the price of the subscription.
02:26:06.920 Premium membership includes several benefits. First, comprehensive podcast show notes that detail
02:26:13.700 every topic, paper, person, and thing that we discuss in each episode. And the word on the street
02:26:19.320 is nobody's show notes rival ours. Second, monthly ask me anything or AMA episodes. These episodes are
02:26:27.420 comprised of detailed responses to subscriber questions, typically focused on a single topic
02:26:32.280 and are designed to offer a great deal of clarity and detail on topics of special interest to our
02:26:37.640 members. You'll also get access to the show notes for these episodes, of course. Third, delivery of
02:26:43.380 our premium newsletter, which is put together by our dedicated team of research analysts. This newsletter
02:26:48.920 covers a wide range of topics related to longevity and provides much more detail than our free weekly
02:26:55.160 newsletter. Fourth, access to our private podcast feed that provides you with access to every episode,
02:27:02.040 including AMAs, sans the spiel you're listening to now, in your regular podcast feed. Fifth,
02:27:08.960 the qualies, an additional member only podcast we put together that serves as a highlight reel featuring
02:27:15.160 the best excerpts from previous episodes of the drive. This is a great way to catch up on previous
02:27:20.340 episodes without having to go back and listen to each one of them. And finally, other benefits that
02:27:25.340 are added along the way. If you want to learn more and access these member only benefits, you can head
02:27:30.740 over to peteratiamd.com forward slash subscribe. You can also find me on YouTube, Instagram, and Twitter,
02:27:38.200 all with the handle peteratiamd. You can also leave us a review on Apple Podcasts or whatever podcast
02:27:44.780 player you use. This podcast is for general informational purposes only and does not
02:27:49.980 constitute the practice of medicine, nursing, or other professional healthcare services, including
02:27:54.360 the giving of medical advice. No doctor-patient relationship is formed. The use of this information
02:28:00.460 and the materials linked to this podcast is at the user's own risk. The content on this podcast is not
02:28:06.860 intended to be a substitute for professional medical advice, diagnosis, or treatment. Users should not
02:28:12.420 disregard or delay in obtaining medical advice for any medical condition they have, and they should
02:28:17.160 seek the assistance of their healthcare professionals for any such conditions. Finally, I take all conflicts
02:28:23.500 of interest very seriously. For all of my disclosures and the companies I invest in or advise, please visit
02:28:29.840 peteratiamd.com forward slash about where I keep an up-to-date and active list of all disclosures.
02:28:42.420 Thank you.