The Peter Attia Drive - February 03, 2020


#91 – Eric Topol, M.D.: Can AI empower physicians and revolutionize patient care?


Episode Stats

Length

2 hours and 1 minute

Words per Minute

185.9

Word Count

22,499

Sentence Count

1,439

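The words-per-minute figure above is simply the word count divided by the episode length in minutes. A minimal sketch of that arithmetic (the helper name and the assumption that the listed length rounds to roughly 7,262 seconds are mine):

```python
def words_per_minute(word_count: int, duration_seconds: float) -> float:
    """Average speaking rate over the whole episode."""
    return word_count / (duration_seconds / 60)

# 2 hours 1 minute is about 7,262 seconds (the listed length is rounded to the minute)
rate = words_per_minute(22_499, 7_262)
print(round(rate, 1))  # ≈ 185.9, matching the stat above
```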


Summary

Dr. Eric Topol is a cardiologist, geneticist, and digital medicine pioneer. He is the founder and director of the Scripps Research Translational Institute, and prior to that he served as the chairman of cardiovascular medicine at the Cleveland Clinic for 15 years. He's also the editor-in-chief of Medscape, and in 2012 he published a book called The Creative Destruction of Medicine. His latest book, Deep Medicine, focuses on the application of artificial intelligence, deep learning, and machine learning in medicine, a field that, as you probably realize, is at times slow to adopt technical change.


Transcript

00:00:00.000 Hey, everyone. Welcome to The Drive podcast. I'm your host, Peter Attia. This podcast,
00:00:15.480 my website, and my weekly newsletter all focus on the goal of translating the science of longevity
00:00:19.800 into something accessible for everyone. Our goal is to provide the best content in health and
00:00:24.760 wellness, full stop. And we've assembled a great team of analysts to make this happen.
00:00:28.880 If you enjoy this podcast, we've created a membership program that brings you far more
00:00:33.280 in-depth content. If you want to take your knowledge of the space to the next level at
00:00:37.320 the end of this episode, I'll explain what those benefits are. Or if you want to learn more now,
00:00:41.720 head over to peterattiamd.com forward slash subscribe. Now, without further delay,
00:00:47.740 here's today's episode. My guest this week is Dr. Eric Topol. Eric is a very famous cardiologist,
00:00:55.080 geneticist, and digital medicine researcher slash pioneer. He's the founder and director of the
00:01:01.120 Scripps Research Translational Institute, TSRI. And prior to coming to Scripps, he served as the
00:01:06.640 chairman of cardiovascular medicine at the Cleveland Clinic, a post he held for about 15 years.
00:01:12.120 We actually start the interview by talking about the story that led to him leaving the Cleveland
00:01:17.880 Clinic. And actually, we spent quite a bit of time on this story, which is something that I certainly
00:01:23.020 remember following during its unfolding in the early part of the 2000s. He's also the editor-in-chief of
00:01:29.940 Medscape. And in 2012, he published a book called The Creative Destruction of Medicine. What we talk
00:01:35.400 about today, though, is his, I believe his third book, but it could be his fourth, called Deep Medicine.
00:01:40.900 And this is something I've been wanting to talk with Eric about for some time, because it really goes
00:01:44.840 into, in a non-sci-fi way, the application of artificial intelligence, deep learning, machine learning,
00:01:52.780 in medicine, which is, as you probably realize, a field that is, at times, upsettingly slow to adopt
00:01:59.440 to technical change. We talk about a lot of things in this episode. And in a surprisingly brief period
00:02:04.680 of time for one of my podcasts, actually, we talk a lot about the gut biome. And I actually have a
00:02:09.160 great and spirited discussion about it, because as some of you may know, I'm kind of a skeptic of this
00:02:13.780 whole gut biome is going to be the answer to all of our woes. But I think in the end, Eric and I
00:02:18.560 really kind of end up being much more closely aligned in our views of the utility of this tool
00:02:23.920 as a way to provide predictive insights. In fact, there are a number of things that I came out of
00:02:27.880 this episode with some follow-up notes for myself, as far as people I want to connect with, researchers
00:02:33.580 that I want to connect with, to better understand how I can utilize that information for some of my
00:02:38.940 own clinical interests. I think what comes across in the end of this discussion is that doctors aren't
00:02:43.760 going anywhere. And in fact, Eric has a slightly contrarian view of what the impact of AI in
00:02:50.300 medicine will be. He argues that it's not at all the doctors are going to go away. It's just the
00:02:54.560 doctors are going to change their focus and frankly focus on the one thing that doctors and humans in
00:02:59.280 general can do far better than machines. So with that said, I hope you enjoy my conversation with
00:03:04.860 Dr. Eric Topol. Well, Eric, thanks so much for coming over on a Friday afternoon.
00:03:14.540 Great to be with you, Peter.
00:03:15.640 This is kind of funny because we've both lived in San Diego for over 10 years and your name comes
00:03:21.160 up all the time. Everybody says to me, you must know Eric Topol. And I say, well, of course I know
00:03:25.200 of him, but no, I don't know him. And where it really comes up is every time I'm at Dexcom.
00:04:29.740 And obviously you're on the board there and I know Kevin Sayer very well. And I met several folks on
00:03:35.060 the team. So it's hard to believe this is the first time we're meeting.
00:03:38.740 It is. I've heard a lot about you over the years, Peter. Now you mentioned Dexcom. I've been on their
00:03:44.100 board for nine years and I've watched this early medical wireless world of sensors be transformed.
00:03:52.380 So that's been really a privilege. Whereas most people think about steps as a wearable sensor and not
00:03:58.700 glucose. So, you know, it's been a great company that started nine years ago, at least when I
00:04:04.280 started with them, they were really having a rough time to get people with type one diabetes to use
00:04:10.500 continuous glucose. And that's really changed a lot.
00:04:13.440 Yeah. Kevin and I met about four years ago on an airplane. I still refer to it as certainly one of
00:04:18.940 the top two luckiest seating assignments I ever had to be sitting next to Kevin and immediately clicked
00:04:24.240 and I've never taken the sensor off since. You can see I'm wearing my G6 right now.
00:04:28.700 Wow. And I agree. I mean, it's sort of comical when people think about the number of steps one's
00:04:34.340 taking as a quote unquote wearable or a valuable insight when you think about what could be measured
00:04:39.760 in the interstitial fluid. And glucose, of course, is just the thin end of the wedge on that.
00:05:44.160 Exactly. I think a lot of people haven't realized where this is headed. The bigger concern,
00:05:49.600 of course, is using it in the right people. That's like, for example, the Apple watch for heart rhythm,
00:04:55.160 where so many people are using it and it's a recipe for false positives. But if you use these
00:05:01.620 sort of things, these more advanced sensors properly, they can really make a big difference.
00:05:06.040 You're one of the earliest adopters of mobile telemetry and mobile devices or outside of
00:05:12.500 hospital devices, ambulatory devices to be able to measure heart rhythm. I wasn't even planning to ask
00:05:17.560 you about that, but I can't resist at this moment. Give people a little bit of background about you.
00:05:21.320 Obviously, you're a cardiologist. We're going to talk about what you've done at the Cleveland
00:05:24.680 Clinic and what you've done here at Scripps. But what interested you in cardiology in the first
00:05:28.560 place? Well, it wasn't really cardiology that got me into medicine. It was the interest in
00:05:33.380 endocrinology. My father had been a type one diabetic and had every complication you can imagine.
00:05:40.640 Was blind by age 49. So I decided that since that seemed to be such a primitive area as far as
00:05:48.500 no prevention and lack of treatments outside of insulin, that maybe that would be the way to go.
00:05:54.000 And so when I went to UC San Francisco for my residency, it was really to get geared up to be
00:06:00.120 a diabetologist. And what happened there was I was completely transfixed by what was going on
00:06:06.560 in cardiology, particularly Kanu Chatterjee, who's been a medical hero of mine, had a big influence
00:06:12.920 in really changing the path. It was also a very remarkable time in that it was the first balloon
00:06:19.480 angioplasty to the coronary arteries, the first clot dissolving therapy for heart attack, and so many
00:06:24.940 other things that it was captivating. So I've never regretted that change, but it wasn't what I had
00:06:30.380 initially in mind.
00:06:31.360 The field of cardiology today is so specialized. I mean, to say that one would do a residency in
00:06:38.360 medicine followed by a fellowship in cardiology would be as broad as doing a residency in general
00:06:43.620 surgery today. Maybe stated another way, an interventional cardiologist
00:06:50.880 versus a lipidologist would have virtually nothing in common outside of their foundational
00:06:56.480 training in cardiology. I mean, I don't know that many people actually appreciate that outside of
00:07:00.740 medicine. No, you're really right, Peter. There's these subspecialties. It could be a heart failure
00:07:07.640 or prevention or the plumbers, interventional or the electricians, electrophysiology, and on and on.
00:07:13.280 So it's a very broad discipline. The field has matured so much in the last decade or two.
00:07:19.520 There are obvious benefits to that. What do you think are the limitations of that stratification?
00:07:24.960 Well, the major limitation is that you get this ice pick view of the patient. You know,
00:07:29.980 when you see a patient as an interventional cardiologist, you're thinking about what arteries
00:07:33.960 can I fix? And the electricians are thinking about the person as having an arrhythmia. So the general
00:07:41.200 cardiologist, which is the group of people that would be the advocates to prevent unnecessary
00:07:47.000 procedures, they don't get enough respect. It's just like, you know, primary care internal medicine
00:07:52.040 doctors. And so we really want to boost them up because they are the ones that are really caring for
00:07:58.520 patients and looking out for their overall cardiovascular health.
00:08:03.280 How long ago did you sort of get the sense that we didn't have to be in the hospital with a 12 lead
00:08:10.940 EKG on a patient to appreciate what was happening in the conduction system of their heart? And in fact,
00:08:16.440 I feel like I even remember seeing you on TV 10 years ago on CNN or Good Morning America for all I
00:08:23.200 know, I honestly can't even remember, but you were in a clinic and you were sort of saying, look,
00:08:28.020 there's going to be a day when a patient at home is going to wear a device and it's going to send me
00:08:34.320 a warning sign that something is going on. Right. Well, it's interesting you bring that up because
00:08:39.360 it was kind of a serendipitous connection to San Diego. So it was, I think 1999, I had gotten to know
00:08:47.460 the folks at Kleiner Perkins pretty well and this guy, Brooke Byers contacted me and he said, you
00:08:53.760 know, we're looking at this company called CardioNet. We don't really understand this thing about being
00:08:59.360 able to do electrocardiogram monitoring over the internet. Could you look at this thing? I'm going
00:09:05.980 to send you the slide deck. So it was, you know, 1999, 20 years ago, I looked at this thing. I said,
00:09:12.920 whoa, this is an eye-opener. And it was a San Diego-based company. And you were still in Cleveland
00:09:17.600 at the time. Yes. Yes. And the whole idea was things were starting to really converge of the
00:09:23.180 idea, at least, of medicine and the internet. But the sense that you could monitor people remotely,
00:09:30.600 continuously for multiple leads of their cardiogram was really exciting because up until that time,
00:09:36.740 the only way we could do that was to put on one of these bulky Holter monitors. That's Norman
00:09:41.920 Holter, from the 1950s. We still talk about his name. They're still using it. And, you know,
00:09:48.480 that is, you can't exercise really. You can't take a shower. You know, it's, you can't wear this too
00:09:54.240 long and it isn't real time. You then send it in and, you know, you get, or you go back to the clinic
00:10:00.700 and take it off. And so it's so antiquated when you think about it. So the idea that we could
00:10:06.400 transcend that era with this mobile continuous monitoring. And by the way, there are people
00:10:11.520 who have been on a Holter who died suddenly. And the Holter, of course, serves no purpose other
00:10:16.200 than to tell you what arrhythmia killed them. Exactly. So now it's a whole different world.
00:10:20.860 And then you start to say, wait a minute, why don't we just do everything? Not just a cardiogram.
00:10:26.340 We could do all the vital signs. We, we get rid of hospitals or at least those hospital patients
00:10:32.400 who are not in an intensive care unit. And I think that's where we're headed. That is 20 years ago
00:10:39.020 was kind of the entry point with this first dedicated wireless company, CardioNet. And it's
00:10:45.400 just where we're going to build on to ultimately eradicate the need for most hospital rooms, which
00:10:51.600 is a pretty big deal. And it's probably the most transformative aspect of where we're headed
00:10:56.920 because that's the number one item for healthcare costs. It's not just the facilities, but the
00:11:03.340 personnel. And so they account for a third of our $3.6 trillion annual healthcare budget.
00:11:10.780 Medicine seems to always take longer to adopt things. I mean, I think in general,
00:11:14.780 as people, we tend to have optimism that exceeds the pace at which technology moves forward. I mean,
00:11:20.920 in some ways, engineering examples get overused. We talk about the Manhattan Project,
00:11:27.720 which is kind of remarkable when you really stop to think about it, that in such a short period of
00:11:31.480 time, they could go from a proof of concept in the early forties to a finished product, you know,
00:11:37.480 in 44, even the space race is kind of remarkable in terms of the accuracy with which they were able
00:11:43.420 to sort of project and map out the steps. It doesn't seem that healthcare follows that curve.
00:11:48.820 It doesn't just follow a straight Moore's law.
00:11:51.660 Well, it actually defies Moore's law. If you plot that out, you look at it and think, wow,
00:11:58.100 the cost of chips has gotten so incredibly low over the course of now 50 plus years and healthcare
00:12:05.600 costs are going the opposite direction. So its lack of embrace of the digital era, and the
00:12:13.480 lack of any impact on lowering costs, is notable. It's palpable. It's a general resistance.
00:12:21.420 You know, I liken it to a sclerotic or ossified nature of the medical community, very resistant
00:12:26.960 to change. The only time you see a lack of resistance is when it's tied to markedly improved reimbursement,
00:12:34.560 for example, you know, the adoption of robots like Da Vinci or, you know, something like that.
00:12:39.720 Otherwise there's just no real incentive to change. And of course we want to be careful
00:12:45.720 because we don't want to adopt a significant change when it isn't validated or proven. But
00:12:50.300 when we, when we see things that are unquestionably advances and still unwillingness to move in that
00:12:57.120 direction, that's disconcerting. Yeah, it is. And there are a few problems I've contemplated
00:13:02.000 where no matter how much time and energy I put into it, I really can't even see the direction of the
00:13:08.060 solution. Take the political system, for example: when you think about how
00:13:11.480 broken our political system is, you don't have to be a student of political
00:13:15.200 science to appreciate that. But I think most people who spend a lot of time thinking about it,
00:13:19.540 if given a magic wand would know how to move in the right direction, right? If you stopped
00:13:24.240 gerrymandering, if you maybe discarded the electoral college, like there were like five things that you
00:13:29.460 could do structurally that would bring politics back into a sort of more civilized era.
00:13:34.820 But if you said to me, wave a magic wand, what, how do you fix medicine? The only idea I've ever
00:13:42.180 had is one that involves changing behaviors directly, which of course becomes a bit of a
00:13:47.560 tautology. But it seems to be that the disconnect between the driver of demand and the one who pays
00:13:53.080 the bill is the biggest problem. Does that kind of resonate with you? In other words, in this system,
00:13:57.740 which is not a budget driven healthcare system, like it is in the UK, it's a demand driven system.
00:14:02.620 So the system will rise to the cost of the demand. The demand is mostly driven by the patient and
00:14:08.920 the physician, the patient requiring the care, the physician ordering the treatment, but they bear
00:14:14.400 maybe, I think the last figure I saw said, about 11% of the total cost
00:14:19.240 is borne by those driving demand, which is sort of like you walking into a car dealership and knowing
00:14:25.480 you only have to pay 11% of the car price. It's going to completely uncouple any reality.
00:14:31.880 To me, that seems like the elephant in the room. And that's why I think the problem is,
00:14:38.080 as you stated, which is why are hospitals so expensive? Well, if people actually saw the bill
00:14:44.960 of, you know, a hospital stay and realized what the, you know, the gauze and the pillowcase were being
00:14:50.200 charged for, I mean, they'd, they'd scream, but of course they don't have to pay it directly.
00:14:54.760 They're only paying it indirectly.
00:14:56.960 Right. It's a really messed up system in so many respects. You touched on one big one, but
00:15:04.040 the basis is these absurd healthcare charges; it's just unfathomable. And all the things that have
00:15:13.800 been done to date like Obamacare and the debate now about Medicare for all or whatever doesn't
00:15:20.640 get to the root of the problem, which is the cost. That's right. It's politically in vogue to deal
00:15:25.320 with access, which is an important problem. Absolutely. But if you expand access without
00:15:31.500 reducing costs, you trade one bad problem for another bad problem. That was so educational for
00:15:36.160 me because over the past year and a half, I worked with the NHS to review their health system in
00:15:44.780 particular, the impact of technology, AI, digital medicine, genomics. And as you already mentioned,
00:15:52.400 Peter, they have a system which can be changed like a light switch. They do have a single payer
00:15:58.580 and they have far better outcomes than we do at about a third of the cost per person.
00:16:07.240 And what's interesting is they have the will to make these changes. They're adopting things
00:16:12.340 at a rate that is no comparison to the US. Like for example, they already have places in the UK where
00:16:18.800 they've gotten rid of keyboards instead of doctors typing and being data clerks. But they're also
00:16:24.980 interested in making things more efficient, and they're already more efficient than we are.
00:16:30.560 But the difference is the incentives, just as you outlined. This is not employer-based healthcare.
00:16:36.280 This is not co-pay related. This is healthcare for everyone. And we're going to make it as good as we
00:16:42.800 can and as least expensive as we can. So that country is in many respects, a different model.
00:16:50.040 Canada is like it. And many countries in Europe are similar. And we're so remotely disparate. It's
00:16:56.820 just unfortunate. You know, I grew up in Canada. So I have sort of mixed feelings about the discussion
00:17:02.400 of a single payer system because I've seen the advantages of it and you've outlined them. I don't
00:17:07.540 know as much about the NHS, which may be better than in Canada. It's not national. It's done by each
00:17:13.900 province. So, but it's still universal coverage within each province. But that said, my whole family's
00:17:18.680 still in Canada. And I will say the following, when they get care, it seems to be pretty good.
00:17:26.620 But boy, is it hard for them to get care sometimes. It depends what you need, right?
00:17:30.560 Right.
00:17:30.760 If you need a coronary artery bypass and you have critical stenosis, you'll get excellent care in
00:17:36.900 Toronto. You know, it'll be no worse than it would be in the best hospital in New York or San
00:17:41.640 Francisco. But if you have a torn ACL, it might take you six months to get that MRI. Now we could
00:17:50.100 debate whether or not in the long run that matters as much, but you know, I've always found it
00:17:56.180 interesting that at least a country like Canada, and again, I don't, you can speak to the UK for me.
00:18:01.420 There's great resistance to have a second layer of private insurance on top of the public
00:18:05.820 that would allow that, which to me seems like the best hybrid solution, which is you have to have
00:18:12.760 a safety net that provides universal coverage for everyone. But if an individual decides, look,
00:18:18.560 I'm willing to pay an extra $10,000 a year, which by the way, is still a fraction of what I pay to
00:18:22.840 insure my family. You now have a separate queue that you can go into, you know, so it's, I mean,
00:18:28.140 it's like Disneyland, right? It's like in Disneyland, you can do the special pass where you don't have
00:18:31.740 to wait in line, you pay an extra, whatever it is. Why do you think that places like Canada for
00:18:36.820 sure, but maybe even the UK have resistance to secondary insurance? Well, they have secondary
00:18:43.080 insurance. People have that. In many respects, it works like you just described. I see. So the NHS
00:18:49.480 has a second tier that one can buy privately. Yes. The main difference is there's a philosophy that
00:18:55.460 if you're a citizen, as you said, healthcare is a right. It's a right, not a privilege. Yeah.
00:19:00.620 Yeah. And then there are people that have this added insurance. It does separate a small fraction
00:19:07.660 of people into this other class of getting access to more rapidly, to perhaps a different queue as
00:19:15.720 you outlined it. But for the most part, it's a small proportion of the people in the UK. And I think
00:19:21.740 it's somewhat similar in Canada. But I think the difference really is that there's two big
00:19:28.440 poles of problems. If you look at it in the US, there's the indigent who either don't have access
00:19:34.920 or if they do, they're just not getting the kind of care that you would like to see. And then there's
00:19:40.300 the affluent who get too much. They get overcooked. They get executive physicals and they get all this
00:19:46.840 stuff that shouldn't be done. And the outgrowth of that is bad outcomes. They get incidental illness.
00:19:52.440 You don't see that in the UK and in a lot of other countries. So we have this problem at both ends. And
00:19:59.240 most of the recognition has been on the end of the underrepresented and indigent, not on the people who
00:20:06.420 are getting overcooked. And that's a problem. I interviewed Marty Makary a little while ago, and he's
00:20:12.940 written very eloquently about this problem, specifically with respect to pharmacotherapy.
00:20:17.780 You know, you mentioned executive physical. Well, your alma mater, of course, is one of the places
00:20:22.340 that, you know, certainly has to be regarded as one of the finest hospital centers in the United
00:20:27.140 States. And then by extension, the world. And I feel like I've had half a dozen patients come back
00:20:32.500 from their executive physicals there, or more so ask me if they should go and get it.
00:20:37.260 So when did you, when did you get to Cleveland? Early nineties?
00:20:40.180 Yeah, I got there in 91.
00:20:41.520 And what was the Cleveland Clinic in 91?
00:20:43.760 It wasn't as well known as it is now. It was particularly well known for bypass surgery.
00:20:52.140 It was the place where René Favaloro essentially invented bypass surgery. And also Floyd Loop had
00:20:59.820 really brought in the internal mammary artery, which was a big advance. It also had some other
00:21:05.060 traditions. I mean, Mason Sones had been there, discovered coronary angiography.
00:21:09.640 Was it considered at that time ahead of Minnesota, which was also arguably one of the earliest
00:21:15.300 pioneers? You know, Stanford and Minnesota were really these huge pioneers in earlier cardiovascular
00:21:19.600 medicine.
00:21:20.540 Well, I think in cardiovascular, that was its signature contribution. I mean, obviously there
00:21:25.080 were others, but it was...
00:21:26.800 Lillehei, of course, was, you know, Shumway and Lillehei were kind of these gods.
00:21:30.800 Exactly. I think the main, you know, for coronary disease, because there had been so many,
00:21:35.980 a cluster of remarkable innovations. That's where it had its biggest footprint. And when I went there,
00:21:43.400 there was by far more bypass surgery done at Cleveland Clinic than anywhere in the world.
00:21:48.620 And you had gone, after UCSF training, you went to University of Michigan?
00:21:53.100 Well, there was one stop in between. That was at Johns Hopkins where I did cardiology.
00:21:57.300 Oh, I don't think I knew that.
00:21:58.340 Yeah.
00:21:58.620 So we have this one overlap in our backgrounds.
00:22:00.940 Yeah. And then I went to, I was seven years at University of Michigan. My first job, my second
00:22:05.840 job at Cleveland Clinic, it was almost 14 years. But when I went there in 91, it wasn't a really
00:22:12.500 strong academic center. It was, in fact, I'll never forget my chief of medicine at Michigan. When I
00:22:20.220 told him I'm going to Cleveland Clinic, he said, well, that's the end of your academic career.
00:22:24.320 It was viewed as almost going into private practice?
00:22:26.560 It was viewed as, well, high volume, factory medicine, you know, high quality, but, you know.
00:22:32.920 Did you have a strong residency fellowship system underneath you? Were you going to be
00:22:36.180 heavily involved in training?
00:22:37.580 There was a medicine fellowship system, but I don't know that I would qualify it as strong
00:22:42.400 academically. You know, large, lots of them, but they weren't doing cutting edge research and
00:22:48.400 it wasn't that kind of scholarly environment. So my mission when I went there, to refute
00:22:55.520 Yamada's view that it was the end of my career, was actually to do just the opposite and enliven
00:23:03.080 it and wake up the curiosity and the innovations. So that's what we did. And, you know, it was
00:23:08.680 a big transformation because it involved a whole new team, you know, bringing, it was like an
00:23:13.760 exchange transfusion because they hadn't written a paper in the cardiology division in a couple
00:23:18.960 of years. Really?
00:23:20.640 Yeah. It was, it was pretty, it was very much dominated by cardiac surgery and it was just
00:23:25.120 limited productivity on the academic side.
00:23:29.240 So it really comes down to incentives again. I mean, today we see the opposite problem,
00:23:32.760 of course, where we have the proliferation of total nonsense journals and absolute horrible
00:23:38.000 things that don't pass for science being written constantly because of course the pendulum on the
00:23:43.720 incentive is you have to publish. Right. And so presumably at Cleveland, that was simply not
00:23:48.540 that the, the pendulum was the exact opposite where you're, you were probably compensated based
00:23:52.300 on clinical productivity and nothing more. Yeah. I mean, I think the cardiologist, when I talked
00:23:57.320 to them, when I was interviewing and then when I got there, they said, you know,
00:24:02.060 we're the handmaidens of the surgeons. And they were so busy taking care of the patients because
00:24:06.580 the surgeons didn't really see the patients outside of the operating room. And they needed
00:24:10.540 obviously at this high volume of patients need a lot of care. And there weren't that many
00:24:14.800 cardiologists. So the cardiologist would run the critical care and the step-down units and
00:24:18.500 everything. Everything. Wow. So they really were giving great care, but they were consumed by that.
00:24:24.520 So they didn't really have the time and nor did a lot of them since it was highly inbred then
00:24:28.840 really have the knack of asking questions and chasing them down and whatnot. So we brought in a whole
00:24:35.020 group. I mean, I started, there were 30 cardiologists. When I left, there were over 90.
00:24:39.560 Were you brought in as the chairman of cardiology?
00:24:42.320 Right. Right. No, I was age 35. Actually, it was really funny, Peter, when you think about it,
00:24:47.820 Bill Belichick and I started the same day. Bill Belichick was the youngest head coach in
00:24:53.440 NFL history. And I was the youngest chairman in the history of Cleveland Clinic. So
00:24:58.020 we got to know each other a little bit. It was a very different era for Bill Belichick.
00:25:02.400 Who's my favorite coach, by the way.
00:25:03.800 Is that right?
00:25:04.220 Oh, I mean, I'm obsessed with Bill Belichick. I met Tony Gonzalez recently and he told the
00:25:12.780 absolute funniest story about his experience with Bill Belichick at the Pro Bowl one year,
00:25:18.260 which I won't restate now, but in the show notes, we'll link to a video of Tony telling
00:25:22.920 that story along with an article that was written up about it at some point. But I'm fundamentally
00:25:29.260 just obsessed with Belichick. Well, he's a really interesting guy.
00:25:32.560 It's a bucket list for me to meet him at some point.
00:25:35.260 Wow. No, he's a kind of fascinating figure for many reasons. Actually, one of the most memorable
00:25:40.660 things that happened regarding Bill Belichick was he benched Bernie Kosar. That didn't go over well.
00:25:48.040 And Art Modell, who was the chairman of the board of Cleveland Clinic, we were good friends and he was
00:25:52.920 over for dinner at our house. This had all happened and it was an uproar, you know, with the dog pound
00:25:58.040 and everything. And so Art and Pat were over and they said, you know, well, we had to put a sign
00:26:02.380 in front of our house. Bill Belichick doesn't live here. But, you know, there was as much fear of what
00:26:10.140 was happening then as when, you know, Art Modell moved the Browns to Baltimore. Yeah. Yeah. So
00:26:15.060 during those almost 14 years, it was great to see, you know, this kind of renaissance of-
00:26:21.020 And you were supported.
00:26:22.060 Yeah.
00:26:22.380 Because, I mean, presumably you would not have left Michigan without an explicit understanding
00:26:27.640 that you were not coming to implement the status quo. You were coming to rattle it.
00:26:32.180 Exactly. And, you know, it was really because of Floyd Loop, or Fred as we knew him. He was a very
00:26:38.060 progressive CEO of Cleveland Clinic. He, even though he'd been a cardiac surgeon throughout his career,
00:26:43.580 and in fact, in the early years was still operating, he wanted to see cardiology thrive.
00:26:48.540 He's not alive anymore, is he?
00:26:49.900 No, no. Unfortunately, he died of a rare cancer a few years back, but at a young age,
00:26:55.560 and it was a surprise, because he had such longevity in his family. He said, Eric, you know, I want you
00:27:01.680 to come in and, you know, just completely get this place, supercharged, make cardiology the greatest
00:27:07.460 anywhere, and I'll back you 100%. And not only that, but in the year 2000, when I was thinking
00:27:14.300 about leaving, actually to go to Stanford, he said, why do you want to go to Stanford in medical school?
00:27:20.140 Let's just start one here. And so that gave me the green light to work with Case Western to get
00:27:24.840 a new medical school, and there hadn't been one in Cleveland or in the country for 26 years.
00:27:30.440 So that showed Loop to be a great leader. I mean, he wasn't threatened by cardiology. He wasn't
00:27:37.440 threatened by making it a far more academic environment. He actually saw those as pluses.
00:27:42.540 He was, you know, just an extraordinary leader.
00:27:46.200 We would have had that second overlap if you'd come to Stanford, because I graduated from Stanford
00:27:49.880 Med School in 01.
00:27:50.960 Oh, wow. When I was looking there to be the dean, it was just after the divorce. It was like a low
00:27:56.600 time; morale was broken.
00:27:58.280 It was. So what was his name? I'm blanking. There was a dermatologist who was the dean.
00:28:03.600 Eugene Bauer.
00:28:04.560 Exactly.
00:28:04.880 Is that his name?
00:28:05.380 He was the one.
00:28:06.120 He was the one. And I think, basically, in the failure of that merger, it made sense that there
00:28:10.760 was going to be a regime change. I don't know if Phyllis Gardner was ever in the running for it,
00:28:15.280 but I always liked her. She was top drawer. I was super impressed by her, but a pediatrician.
00:28:21.140 Yes. From Boston. And he was there for a number of years. And I know who you're talking about.
00:28:26.580 Yeah. Yeah. Last name begins with a P. I don't recall.
00:28:28.540 Yeah. Pizzo.
00:28:29.480 Pizzo. Yeah.
00:28:29.980 Yeah. Yeah. He had been at Boston, infectious disease, pediatrics, and he was kind of opposite
00:28:36.180 of the Stanford way. He was anti-entrepreneurial, anti, in many respects, innovation. So it was
00:28:43.000 interesting to see how that worked out. But before I had decided I didn't want to go there, mainly-
00:28:49.160 So the job you were potentially going to take at Stanford was to be the dean, not to be the
00:28:52.340 division chief of medicine or cardiology. Got it.
00:28:54.320 Yeah. No. And I was, at the time, I thought it was a dream job. I thought Stanford, even
00:28:59.640 though it was coming at a tough time in the wake of this UCSF-Stanford breakup, I thought,
00:29:05.860 you know, hey, it's unilateral. It can only get better.
00:29:08.480 Absolutely. But of course, knowing what I know, the little bit that I know about what it means
00:29:12.260 to be the dean of a hospital, it seems like that would have not allowed you to thrive in
00:29:16.340 the way that you ended up ultimately finding a second home.
00:29:19.920 That's an astute point. You know, you have to know what you're good at, and that might not
00:29:24.040 have been a good fit in retrospect. But I was restless. I was looking for a change. And
00:29:28.920 in fact, working on getting the new medical school at Cleveland, which we basically got
00:29:34.180 in 2002, and the first class came in 2004, that kept me busy. I always need a kind of
00:29:40.160 big project, you know, something that's a reach, to keep me going. And so that was important
00:29:46.260 that it actually was a four-year run on top of the other things I was doing, is to get
00:29:51.060 that new med school off the ground.
00:29:53.100 And there's something else that happened in the twilight of your career at Cleveland
00:29:56.700 Clinic that I want to talk about, because it's, on a personal level, it was very near
00:30:00.380 and dear to me, which was your involvement in the uncovering or elucidation of the challenges
00:30:06.100 with a medication called Vioxx. This is such an interesting example of, it's a case study
00:30:13.560 in so many things, right? Because I'll state for you at the outset my bias in this entire
00:30:18.720 story, and then I want to go into the story in detail. If I could go back in time and
00:30:23.660 be czar for a month or a day or a year, I would have put a black box warning on Vioxx.
00:30:29.480 I would have left it on the market for most people who could have tolerated it and made
00:30:34.040 sure that it was very transparent that this is going to increase the risk of a subset of
00:30:38.180 the population and everybody's happy. Unfortunately, that's not what happened.
00:30:42.580 Merck, I think in the hubris of wanting to deny that there was any potential patient subset
00:30:50.820 that could be harmed by this drug, ended up spending, by my calculation, at least four
00:30:56.280 years probably concealing data. You'll tell us the story and it may be longer. And in the
00:31:01.340 end, a lot of people lost what I still consider to be probably the best COX-2 inhibitor that
00:31:06.640 was ever out there. So that's my bias. I could be wrong.
00:31:09.280 Actually, I agree with everything you said. Totally. And we never discussed it. We never
00:31:13.700 met before.
00:31:14.440 No, no. So now let's talk about the story. So tell people what a COX-2 inhibitor was and
00:31:18.740 why was it such a big deal when these drugs came out in the late 90s?
00:31:22.200 Well, this was, at that time, viewed as the most important blockbuster in medicine, Vioxx
00:31:27.200 and Celebrex. They were competing with each other. They, I think, introduced right around
00:31:31.940 99. And it was a race because there's multi-billions of dollars for each drug. The promise
00:31:38.240 was instead of the Advil and Aleve and other non-steroidals that they would replace, they
00:31:44.200 would spare the stomach. They would be more potent to relieve pain and better anti-inflammatories.
00:31:50.340 That was how they were built.
00:31:52.100 And the reason, because they were selective. So these cyclooxygenase enzymes that the Aleves
00:31:56.800 of the world indiscriminately block them. And one of the problems is, yes, you get the anti-inflammation
00:32:02.380 that relieves your pain, but you also rip apart sort of the gastric lining and a whole bunch
00:32:08.200 of other things in the wake. And of course, as you said, Celebrex and Vioxx came along and
00:32:12.040 said, we're going to selectively target just cyclooxygenase 2, which almost seemed too good
00:32:17.800 to be true, by the way. In medicine, it doesn't often work that that happens, that you can selectively
00:32:23.560 hit one of these two enzymes. But nevertheless, that was-
00:32:26.280 Yeah. I mean, they did have some selectivity, but not as much as advertised. But nonetheless, I wasn't
00:32:32.200 really paying attention to this because-
00:32:34.180 Right. You're not a rheumatologist or an orthopedist. This was out of your wheelhouse.
00:32:37.400 No. And I'm not even into drug safety. That was not the kind of thing I was into. And in fact,
00:32:42.220 it was only because this remarkable fellow of ours, Deb Mukherjee, who now is the chief of
00:32:48.320 cardiology in Texas. But at that time, he came to me and said, Dr. Topol, I'm looking at this data
00:32:53.540 from the FDA. And what they're saying is that Vioxx is really not at all causing any heart problems.
00:33:02.440 It's actually that the comparator, the naproxen, was the one that is decreasing-
00:33:07.440 Providing a benefit.
00:33:08.460 Yeah.
00:33:08.620 That was the argument.
00:33:09.720 And I said, well, Deb, this FDA, they approved this drug. It was a year plus after they approved
00:33:15.420 it. And I said, how did you get this data? Because back then, to get into the bowels of
00:33:19.680 the FDA website wasn't so easy, but he did it on his own. So I give him credit. And I looked
00:33:25.500 at it. At first, I didn't believe it, but then we spent quite a bit of time-
00:33:28.500 Now, do you remember the numbers? Because I remember that it was naproxen, but do you
00:33:31.960 remember what the absolute risk difference was between naproxen and Vioxx in that first
00:33:37.000 cohort?
00:33:37.320 Yeah, I don't. There was this trial called Vigor. I don't remember the exact numbers, but
00:33:42.220 it was something like an excess of heart attacks in the Vioxx arm, rofecoxib, that was not trivial
00:33:51.580 if you look at it per hundred people.
00:33:54.040 My recollection, I could be wrong, and we will link to all of this in great detail. So it'll
00:33:57.880 be, for those of you listening, this will be completely accurate in the show notes. I
00:34:02.000 want to say it was like 15 deaths per 10,000, but I don't remember what the baseline, I don't
00:34:08.140 remember what the naproxen number was.
00:34:09.400 Yeah, I think it was 15 per 1,000.
00:34:10.940 15 per 1,000, okay.
00:34:12.260 Or up to 20 per 1,000, depending on how you interpret the data. But there was a definite
00:34:15.980 gap. And so first I was questioning Deb, and then we raked over the data.
00:34:21.420 And I said, you know what? You're onto something here.
00:34:23.400 But at that time, Eric, did you think that naproxen provided cardioprotection?
00:34:28.000 No, actually, I said, where did that come from?
00:34:30.380 Okay, so in other words, you sort of questioned the premise.
00:34:32.580 Very peculiar, because there was no data to really support that. And it seemed like a very
00:34:36.860 odd explanation for this excess of heart attacks. We went over the data, and I said, you know what?
00:34:43.680 We've got to publish this, because this is really important. And so we put together a paper.
00:34:48.940 It went to JAMA. It was published-
00:34:51.840 This was 01, or 02, right?
00:34:53.620 No, it was published in 01. In fact, it was, I think, August 30th, 01.
00:34:58.000 You know what? I remember it so well. I remember where I was reading it.
00:35:02.740 Yeah, it was the summer of 01.
00:35:04.500 It was on the front page of the Wall Street Journal. It was on a lot of other front pages. But there,
00:35:09.100 it quoted me as saying, we could be facing a public health disaster.
00:35:14.240 Now, did I ever know that that would be the case? Did I ever know that it would be three years later
00:35:20.600 to the date, September 1, that Merck had this abrupt withdrawal? And in the process, by the way,
00:35:27.860 Peter, before we published in JAMA, Merck came out to try to intimidate us to withdraw the paper.
00:35:34.580 Once they heard-
00:35:34.980 How did they know? Did the reviewers give it to them for comment?
00:35:37.620 The reviewers apparently communicated to them that there was this hatchet job on Vioxx coming,
00:35:44.280 you know? And so, they came to us and tried to intimidate us. And they also then,
00:35:48.060 what I learned from the editor, then the editor, they tried to intimidate her,
00:35:52.620 that they would sue JAMA. And it was unfounded. And as soon as we published the paper, and that wasn't-
00:35:59.060 What did they actually say to you guys?
00:36:01.100 Well, they said that, you know, we did data dredging. They had all the lines. They basically
00:36:06.020 said that we were hacks and didn't know how to hack.
00:36:08.200 P-hacking, data dredging, et cetera.
00:36:09.240 Yeah. All we did was basically review the data that was filed on the FDA. And by the way,
00:36:15.260 some of the things that didn't get out in the public, there were other small studies
00:36:19.360 that never really got in the spotlight that also showed the excess of heart attack.
00:36:24.340 So, the signal-
00:36:24.920 So, did your JAMA study include a meta-analysis of those smaller ones as well as the original FDA-
00:36:29.760 Yes.
00:36:29.960 As the Vigor trial?
00:36:30.900 Exactly. And we saw this consistent signal.
00:36:32.900 And you saw this pattern.
00:36:33.720 It wasn't a question. And we also saw a lesser signal for Celebrex. It was in the paper. But the
00:36:39.920 one that was, you know, just so consistent, and you couldn't deny it, was with Vioxx. And that was
00:36:46.640 not just compared to naproxen. It was compared to other things, including placebo.
00:36:51.440 And so, your feeling at that point was the naproxen comparison is a red herring. And whether you're
00:36:58.000 doing this against a placebo or Advil... And by the way, was there a belief at the time that just
00:37:03.040 general ibuprofen had slight prevention or was neutral?
00:37:07.020 Neutral.
00:37:07.580 Neutral.
00:37:07.880 Neutral at best. There wasn't any hint that naproxen afforded benefit or protection. So,
00:37:13.280 that whole premise was off base.
00:37:15.420 And so, we were talking about a difference of one in a hundred in absolute risk?
00:37:19.140 Two in a hundred.
00:37:19.780 Two in a hundred. So, one in 50 additional. And at that point in time, because I think later on,
00:37:25.400 we knew more. But in 01, did you have a sense of which patients were the ones that were at risk?
00:37:31.320 No. I think that we still don't know that, who was at risk. We do know that 80 million people
00:37:36.080 took Vioxx, which is a lot of people.
00:37:37.960 Yeah. But it wasn't necessarily those with hypertension or those with dyslipidemia. I mean,
00:37:43.000 were we able to sort of stratify it at all?
00:37:45.140 No. In fact, that's the hardest thing is that when there were all these lawsuits of people that had
00:37:49.260 heart attacks, you know, Merck defended it saying, well, it could have been their hyperlipidemia and
00:37:53.920 their high blood pressure. And it's very hard in an individual person to ascribe the hit to Vioxx.
00:38:00.120 That's difficult because most people that have severe osteoarthritis are also having comorbidities
00:38:06.840 that would put them at risk for heart attack. The signal kept showing up though. Like when Kaiser
00:38:12.600 looked at their patient database, they saw it everywhere. It was a heart attack
00:38:17.500 problem and stroke problem, by the way, but heart attacks especially.
00:38:21.120 And the strokes were hypercoagulable strokes?
00:38:22.940 As best we can tell. Yeah. In fact, that brings me to the 60 Minutes segment I did.
00:38:30.460 It was right after the Vioxx withdrawal. And I was upset because Merck was
00:38:35.140 claiming they did everything right. And I knew much better that it wasn't true. In fact, we had
00:38:40.660 called this three years before and they still never took it seriously. And as you said, they could have
00:38:45.340 just admitted there was a problem. It was in all their emails. It was clearly they knew about it.
00:38:50.020 Wait, there's evidence they knew before your paper?
00:38:52.800 Oh, absolutely.
00:38:53.620 Oh, I don't think I realized. I thought it was, your paper came out in 01. That was the shot across
00:38:59.840 the bow. Then they just completely denied it, concealed data until it became undeniable by 04. I didn't
00:39:07.980 realize prior to 01 internally, they had seen the same signal.
00:39:11.580 Absolutely. No, they had emails. They recovered from the, all the way back to, from 99 when the FDA
00:39:18.520 approved the drug, 2000, well before our paper.
00:39:21.680 Because they did make the argument in 99 that naproxen was risk lowering and that's why there
00:39:26.940 was no signal.
00:39:28.260 Yeah. In fact, the term signal was used when the head scientific officer and all the people involved in
00:39:34.960 the Vioxx development said, well, let's turn it on. Let's flip it.
00:39:38.320 The communications experts showed up and yeah.
00:39:41.140 Yeah. No, the whole thing was just so incredibly contrived and it was all clear that they were in
00:39:47.040 this race with Pfizer, with Celebrex. They didn't want to lose it. $5 billion was on the line and
00:39:52.220 whatnot. But when I went to 60 Minutes to discuss this right after the turbulence of the withdrawal,
00:39:59.460 the interviewer, he had just had a stroke on Vioxx. He never revealed it on the show. I said,
00:40:05.460 well, why don't you, you know, in the, just like we're talking before we actually went on the air,
00:40:09.660 I said, well, why didn't you tell people that? He says, well, I'm not part of the story. I said,
00:40:14.980 well, you had a stroke. I mean, that's kind of a big deal. Ed Bradley, you know, I think there was
00:40:20.460 a lot of hits out there. It's a shame because, you know, up until that time, Merck had been-
00:40:25.820 And what finally, what finally sunk the ship in 04?
00:40:30.180 Well, when they withdrew the drug, there was another new trial. And this one, again,
00:40:36.340 the same exact signal.
00:40:37.540 This was a phase four. This was a, yeah.
00:40:39.600 Actually, I think it was phase three for an expanded indication. Whereas the early one was in,
00:40:45.200 you know, one condition. This was in another, it was a large trial. The heart attack thing was
00:40:49.000 right there again. And they just couldn't deny it anymore, especially on top of everything they've
00:40:54.940 been trying to suppress for years. So they just pulled the plug on it. But-
00:40:58.700 Was there any ramification?
00:41:00.180 No. That's actually, when you mention it, you know why I never should have been involved with this.
00:41:05.820 I regret it because-
00:41:07.680 You do regret it?
00:41:08.460 Oh, absolutely. Because nothing ever happened. I mean, no one at Merck ever-
00:41:13.080 In other words, you believe that had you not written the paper in 01, they still would have
00:41:17.540 withdrawn it in 04?
00:41:19.000 They might have because after we wrote the paper and published it, others started to come alive,
00:41:24.360 like Kaiser and others about this signal. So it was getting more and more undeniable.
00:41:29.520 So I don't know that our paper, even though it was the first one and it was in a high profile journal,
00:41:34.100 I think they still would have had a hard time keeping that drug. Well, they might've done what
00:41:38.400 you suggested, Peter, which is put on a warning and keep marketing, which is what they should have
00:41:43.040 done. It was a good drug. But the problem was that the doses that they were recommending,
00:41:47.220 certain people were getting exposed. And you say, well, two out of a hundred is not a lot,
00:41:52.260 but when you have tens of millions of people, it's a lot.
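The point being made here is simple absolute-risk arithmetic: a small per-person excess risk, multiplied across a very large exposed population, becomes a large number of excess events. A minimal sketch of that arithmetic, using the round figures quoted in the conversation ("two out of a hundred", "tens of millions of people") purely for illustration, not as epidemiological estimates:

```python
# Illustrative arithmetic only: convert an absolute risk increase and an
# exposed population into an expected count of excess events. The inputs
# are the round figures quoted in the conversation, not trial estimates.

def excess_events(absolute_risk_increase: float, exposed: int) -> int:
    """Expected excess events = absolute risk increase x people exposed."""
    return round(absolute_risk_increase * exposed)

# "Two out of a hundred" across 10 million exposed people:
print(excess_events(2 / 100, 10_000_000))  # 200000
```

This is why a risk difference that looks small per patient ("two out of a hundred is not a lot") becomes a public health problem at mass-market scale.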
00:41:55.380 Yeah. One out of 50 absolute increase in risk for a hard outcome like mortality is a huge deal,
00:42:00.960 especially if you can't know who that patient is. So this is where, again, because I never,
00:42:05.500 you know, I was in the middle of my residency when this was going on and I was a surgeon. So it's not
00:42:09.020 like this was top of mind. I just had a personal interest because I remember using Vioxx and finding
00:42:14.180 it so efficacious and finding it to be personally much better than Celebrex and much better than,
00:42:22.040 you know, single-day dosing. I think I took 50 milligrams once a day. I mean, it was like,
00:42:26.500 you know, and I had just had a horrible back injury in 2001, which is actually another story where
00:42:32.160 they'd operated on the wrong side. And I had multiple trips to the OR. So I was really debilitated.
00:42:36.920 And in the midst of a surgical residency, Vioxx was the saving grace for me. But my recollection was,
00:42:44.620 oh, but there's a subset of patients in whom you could sort of carve out to not take it. And that
00:42:50.160 would have been the interesting question. That would have been the clinical question, which is like,
00:42:53.320 for example, like if you, if you look at drugs that cause birth defects, something like Avodart or
00:42:58.440 Dutasteride or something like that, like don't take it. If there's a pregnant woman nearby kind
00:43:03.020 of thing becomes a very clear and obvious way. Not that that causes birth defects, but that
00:43:06.960 interferes with androgens. I don't know. So it's interesting to hear you say that, that basically,
00:43:10.940 I don't want to put words in your mouth, but it almost sounds like you said, if you go back in
00:43:13.860 time, you wouldn't have done it. No, because it wound up being a horrible phase in my career,
00:43:19.240 the true nadir, not only during that time after the withdrawal, whether threats from whether it was
00:43:27.220 Merck or friends of Merck, you know, calling up saying, if you don't stop talking about this,
00:43:32.460 you know, bad things are going to happen to you. I remember being out of town one night
00:43:36.440 and my wife got a call like that. You better stop. You better tell your husband to stop saying
00:43:41.820 things about Merck or you're going to regret it. It's hard for people to believe what you're
00:43:45.280 saying, right? It sounds like the sort of thing you'd see in a mob movie. Yeah. No, it was the worst
00:43:50.340 experience. And then I even had my own institution. Unbeknownst to me, the chairman of the board,
00:43:57.220 of Cleveland Clinic, a fellow named Malachi, who was the CEO of Invacare, but he and Ray Gilmartin,
00:44:03.720 the CEO of Merck, were best friends from Harvard Business School. And so he and the CEO of Cleveland
00:44:09.140 Clinic were basically ganging up to suppress me and gag me and also to turn on me. So I had my own
00:44:16.240 institution. I had Merck against me. It was a nightmare. I mean, a veritable nightmare.
00:44:21.360 But without redemption, when Merck finally pulled the drug, you would think that
00:44:26.440 one, it would sort of give people pause to realize that this was, as you said, probably inevitable.
00:44:34.120 And two, it was the right thing to do. Unfortunately, it was too big a hammer for,
00:44:38.920 you know, like I said, I was naive.
00:44:41.000 But they were backed into a corner.
00:44:42.460 No, they were in a corner, but, you know, to be able to-
00:44:45.220 But nobody came around.
00:44:47.140 You know, to me, it's kind of nowadays, everybody talks about truth and fake and whatnot. But to me,
00:44:54.480 then was the beginning of seeing that syndrome because here was truth and it was just being
00:45:00.340 basically turned into fake news by Merck. And they had gone years of marketing a drug,
00:45:07.800 mass marketing a drug. You couldn't turn on a TV set without seeing ads for Vioxx. And they never
00:45:14.700 fessed up. And they just, every single patient case that went to court, they basically prevailed
00:45:20.640 eventually, whether it was the original case or the appeals, by this whole inability to prove
00:45:27.620 it in an individual patient. So they didn't pay anything.
00:45:31.720 There was no restitution whatsoever?
00:45:32.940 Nothing that I know that's significant. And most importantly, the executives who oversaw this,
00:45:38.220 who knew exactly what they were doing, they didn't go to jail. They were never indicted. There
00:45:43.080 was never any charge.
00:45:44.820 So no civil suits at all?
00:45:46.340 Nothing, nothing.
00:45:47.260 Which is interesting. It tells you something about how difficult it is when the complication
00:45:52.760 is a ubiquitous disease. You see, it's different when you're dealing with, well, and of course,
00:45:58.800 we think of the examples that turned out to be wrong, right? Like the use of silicone breast
00:46:03.320 implants and lupus. Well, it turned out to be incorrect, but you at least had a signal to talk
00:46:07.380 about because lupus was so rare or what other connective tissue disorders they were talking
00:46:11.160 about. But as you said, like, how can you possibly look at any individual and make that case
00:46:16.780 probabilistically? You would need a very large trial to determine that.
00:46:20.940 Oh, yeah. No, you can't single it out. It's almost impossible. If you had assays to show
00:46:26.220 that the selectivity of the COX-2 inhibitor was prothrombotic, making a clot in a person,
00:46:32.320 and that person then had a heart attack or stroke, but who had that? I mean, these were sudden events,
00:46:37.820 and no one had a proof in that person that their clotting status changed from the drug.
00:46:42.440 It seems like that's got to be the most likely mechanism.
00:46:44.960 Oh, yeah. The mechanism of how these people went down is not elusive. But what's sad about this too,
00:46:52.440 Peter, is I had known Roy Vagelos to some extent. I had the highest regard for this company. We were
00:46:58.600 doing trials with this company when it happened. And so just to see a company that was viewed as
00:47:05.000 one of the most ethical-minded companies, not just in pharma, but across all companies,
00:47:11.860 to see it take these tactics of marketing, it's really sad. But of course, that's many years ago.
00:47:19.000 You know, that's 2004. We're 15 years later now. Yeah, but I can hear it in your voice, Eric. It's
00:47:24.140 still quite traumatic. Yeah, it was almost the end of my career. You know, that's what precipitated
00:47:28.640 leaving Cleveland and fortunately, you know, coming to San Diego, which was the greatest thing ever.
00:47:34.600 But who would have known for a year or two, it was a question of whether there would be a new
00:47:39.400 position and whether it would be suited to things that I would want to do. So how did San Diego,
00:47:45.500 I mean, were you a pariah at the time? Yeah. You know, one of the people who I regarded
00:47:50.460 as extraordinary, the Pope of cardiology, Gene Braunwald, he was trying to help me. He says,
00:47:56.460 you know what, Eric, you're radioactive right now. And I was. And I even had people at Cleveland Clinic,
00:48:02.220 by then there was a new CEO and others who were charged to try to nuke me. That is, any place I
00:48:09.400 interviewed for a position, they were calling them and actively trying to take me down. So
00:48:15.740 ultimately, over that course of a year, when I was looking to move, I started realizing I have to do
00:48:22.020 this in stealth mode because, you know, I've got people who are trying to get me. And fortunately,
00:48:26.780 a very close friend of mine here in San Diego, who I'd known for decades, Paul Teirstein,
00:48:32.900 who is at Scripps. And I had collaborated some with the people at Scripps Research. And so
00:48:37.120 we started talking, they were excited about what I had a vision for. And then I was ultimately
00:48:43.320 recruited in fall of 2006. And what was the role they brought you into at that time? Because at the
00:48:51.520 time wasn't Scripps, I mean, it wasn't really a clinical powerhouse. It was a research powerhouse.
00:48:56.780 Well, kind of both in some respects. The research-
00:49:00.020 Was the affiliation with UCSD on the clinical?
00:49:02.420 No, no. Purely Scripps Health.
00:49:04.680 Purely Scripps Health, yeah.
00:49:05.640 So Scripps Health was on the move. Chris Van Gorder as the CEO had basically put together,
00:49:13.040 stitched together many different Scripps entities into one called Scripps Health. They were and are
00:49:19.440 completely different entity than Scripps Research.
00:49:22.060 So TSRI, The Scripps Research Institute, that was totally separate.
00:49:25.840 So totally separate. Although prior to the year 2000, they were one entity, but it was Scripps
00:49:31.680 Clinic then, not this big health system with multiple hospitals and 30 clinics and whatnot.
00:49:37.140 So what I did was to come in to be a cardiologist at Scripps Clinic, but also to develop a new institute
00:49:45.020 that was dedicated to translational research, particularly genomics. That's why I came here.
00:49:50.440 And it was only, you know, within weeks I realized, wait a minute, what about wireless? What about digital?
00:49:55.840 Because you don't want to just rely on a genome, even though back in 06, there was tremendous-
00:50:01.400 Now what was Craig Venter doing at that time?
00:50:04.220 So Craig, he had the Venter Institute in Maryland. He was also working in synthetic biology,
00:50:10.780 had a synthetic biology company here. And I think he was aiming to develop another Venter Institute
00:50:17.480 in San Diego. But he, as a pioneer of pushing the whole sequencing project, of course, in year 2000,
00:50:25.760 announcing it with Francis Collins and Bill Clinton at the White House. But he had moved
00:50:30.640 not just from sequencing, but also to writing the genome with synthetic biology. That was his
00:50:36.000 interest at the time.
00:50:36.940 I see. Got it.
00:50:37.680 So I came here to try to make human genomics and genetics center stage for the two Scripps
00:50:43.760 institutions.
00:50:45.320 Right. And you wanted to translate this as quickly as possible to basically patient care.
00:50:49.760 Yeah. To change practice, which is, we're still working on that. But that was the goal. And we
00:50:55.140 basically very quickly, fortunately, were able to get a big grant called the CTSA grant, one of the
00:51:01.680 now 57 hubs in the country. And we're the only one that's not at a university or medical
00:51:07.800 school. But basically, Scripps Research is a storied institution with some of the best life
00:51:13.540 science in the world, ranked number one in Nature for innovation and influence, above some of the very
00:51:20.240 top known centers. So it has had a phenomenal track record. And to work with them, this great brain
00:51:26.800 trust of scientists, and to try to bridge that with this big clinical entity, Scripps Health, which is
00:51:32.500 a dominant player in the San Diego region, a big region. For me, it was perfect. And basically,
00:51:39.220 the big grant we were able to get led to an innovation space, you know, just to do whatever
00:51:44.380 you think would be appropriate to make medicine better. And there's no shortage of ways we could do
00:51:50.680 that.
00:51:51.800 And you're also the editor-in-chief of Medscape. Is that right?
00:51:54.080 Right, right.
00:51:54.760 How did you get involved? And I think anybody listening to this who's ever gone onto Google
00:51:59.660 and searched for something will notice Medscape is usually coming up with information. So what is
00:52:05.140 Medscape?
00:52:05.760 Yeah, well, Medscape is the professional side for healthcare professionals of WebMD. The way I got
00:52:12.220 into it was in the mid-90s, when the internet was kind of warming up, I started with a couple of
00:52:19.640 friends, theheart.org, which was the first cardiovascular website for cardiologists and
00:52:26.520 anyone working in this space. So we started that and, you know, it's all about getting great content,
00:52:32.820 getting journalists. And it was, for many years, a big magnet for not just the information, but also
00:52:41.200 a forum for education and for debates and whatnot. So ultimately Medscape started to cover every
00:52:49.660 specialty and they acquired theheart.org. And in that acquisition, being the editor-in-chief of that,
00:52:56.360 they ultimately asked me, would I be the editor-in-chief of Medscape? And so I've done that now for several
00:53:01.720 years. It's been great.
00:53:02.700 How much time does that take? I mean, that's, you must have an editorial staff under you because
00:53:07.360 it's such a voluminous, I mean, it's like an encyclopedia.
00:53:11.280 Yeah, no, they have an amazing crew of medical journalists and they cover everything that moves
00:53:16.220 in medicine. I don't do so much day to day. I set general direction. We have a monthly call to go over
00:53:23.460 features that I usually try to introduce ideas for that. I do a lot of interviews. I try to find,
00:53:29.220 like, you know, this week was the big Wall Street Journal issue with a Penn Medicine former dean
00:53:37.140 taking on medical education today, saying that it was completely off base to nurture students
00:53:43.440 on climate change or gun control or any social injustice. And of course there was a revolt that,
00:53:51.380 you know, and we're going to have a lot of that in Medscape. So I tried to bring up my, you know,
00:53:56.500 when I first got involved, the website was much more pharma oriented. And what I've tried to do is
00:54:02.880 round that out with not just devices and medical education, but also, you know, the whole genomics
00:54:09.580 and digital medicine and AI and all those sorts of topics. How big is the staff?
00:54:14.700 Oh gosh, there's probably over 30, 35 journalists. Ivan Oransky recently joined as the VP for editorial.
00:54:22.340 He runs Retraction Watch, which is really formidable, but no, it's, it's a big staff.
00:54:28.120 It's a for-profit or not for-profit?
00:54:29.800 Well, it's part of WebMD.
00:54:31.020 It's part of WebMD.
00:54:32.500 WebMD used to be a publicly traded company, but they were acquired about a year ago by a
00:54:37.620 company called Internet Brands. So they're now a private, but for-profit company.
00:54:41.880 Got it. I didn't realize that I should have known that I suppose, but I didn't realize
00:54:45.120 Medscape was under that umbrella.
00:54:46.600 Yeah. And I've always tried to weave in the WebMD side because WebMD has a big reach to
00:54:54.000 consumers, as you pointed out, to kind of go to search for lots of common things in medicine.
00:55:00.200 And we don't do that enough. I'm hoping that over time we'll, we'll see better crosstalk
00:55:05.980 because we may have some really interesting things on the Medscape side or the opposite
00:55:10.720 on the WebMD. We don't get enough trying to get that mixed audience.
00:55:14.160 Yeah. It's funny that it's taken us this long to get to your, your most recent book,
00:55:18.520 but, um, I think it was a worthwhile route to get here because I think that the story
00:55:23.840 of Vioxx alone, I think is, um, well, I learned a lot because I, again, I think I knew parts
00:55:29.160 of it, but I don't think I appreciated the severity with which you paid a price.
00:55:34.160 Fortunately, that's past tense, and you didn't hold out.
00:55:36.520 Yeah. And it sounds like it's worked out for the best, but that's phenomenal.
00:55:40.520 That seems to be one of those experiences that falls in the category of you're probably better
00:55:44.900 for it, but would be, you'd never want to redo it.
00:55:47.920 Exactly. You know, you get much stronger, you learn that, learn who your friends are and aren't. And
00:55:52.600 basically I, when I got here, it was like being in the witness protection program
00:55:56.520 and, uh, you're starting all over. I remember I had this big lab where we're going to do all the
00:56:01.580 sequencing and, you know, and I, I'm sitting in this big lab and I'm the only one in the lab.
00:56:07.140 And I got a lot of recruitment to do. Now we have just in our group, you know, well over a hundred
00:56:11.500 people. We have one of the largest NIH grants in history to do All of Us, the big million-participant,
00:56:18.640 diverse group that we're doing so much with over the years ahead. So, you know,
00:56:24.400 things are really humming. So it's been great.
00:56:26.900 So your book is called Deep Medicine and the picture on the front really points to AI,
00:56:33.460 but the book is about more than that. But I want to start with that. Now let's assume for a moment
00:56:38.120 that someone listening to this has heard the term AI, but, and sort of knows from science fiction
00:56:43.740 movies, what it kind of means, but that's the limit of the knowledge, right? So they don't maybe
00:56:49.360 necessarily know the difference between machine learning and artificial intelligence or those
00:56:53.000 terms synonymous, let alone, how would that even factor into medicine? And how do you separate out
00:56:58.560 the sci-fi from what's already happening, right? And, you know, what you think a path looks like.
00:57:02.680 So take that in any order you like.
00:57:04.940 Well, I mean, I think the problem with AI is that the concept's been around since the
00:57:09.740 1950s, and it's diffuse. Yes, there's lots of sci-fi and movies and misunderstandings, but
00:57:16.200 what we're talking about now is a specific subtype of AI, which got its birth just over 10 years ago
00:57:23.880 called deep learning, neural networks that allow for inputs. And they could be millions,
00:57:30.580 billions of data points, could be images, could be speech, could be text. And then it goes through
00:57:36.080 these layers of artificial neurons, which are not very much like neurons, but nonetheless,
00:57:41.160 they can distinguish features progressively as they go through this, this network, and then you get
00:57:47.640 outputs. And what's remarkable about this era and why it recently won the Turing Award for
00:57:53.460 Geoffrey Hinton and his colleagues from the University of Toronto. But the thing that's so-
00:57:59.500 The Turing Award being basically the Nobel Prize for computer science.
00:58:02.540 Yes, I should have mentioned that. Exactly. The reason why this is such a big advance in medicine,
00:58:07.960 the biggest advance I've ever seen as a student of medicine for many decades now, but it's so big
00:58:13.820 because you can take particularly now images and you can get accurate definition of the image
00:58:22.200 better than experts, doctors. So whether it's radiologists or dermatologists, pathologists,
00:58:29.300 cardiologists, I mean, you go down the list, ophthalmologists, and you will see studies now that show
00:58:35.380 superior accuracy, or at least as good, through a machine.
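The "layers of artificial neurons" Topol describes, inputs flowing through successive weighted layers that progressively extract features before emitting an output, can be sketched in a few lines of NumPy. This is a toy illustration with random weights, not any medical model from the episode:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Simple nonlinearity applied between layers
    return np.maximum(0.0, x)

def forward(x, weights):
    """Run an input vector through successive layers of 'artificial neurons'."""
    a = x
    for W in weights[:-1]:
        a = relu(W @ a)          # hidden layers extract features progressively
    return weights[-1] @ a       # final layer produces the output score

# e.g. 8 input features -> 16 -> 16 -> 1 output score (sizes are arbitrary)
weights = [rng.normal(size=(16, 8)),
           rng.normal(size=(16, 16)),
           rng.normal(size=(1, 16))]
score = forward(rng.normal(size=8), weights)
print(score.shape)  # (1,)
```

A trained network would learn these weights from labeled examples (say, chest X-rays with and without nodules); the structure, stacked matrix multiplies with nonlinearities, is the same idea at vastly larger scale.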
00:58:40.340 Now, to be clear, this is in initial recognition, not comparison. I mean, I think this is an area I
00:58:45.720 don't know very much about, Eric, but the last time I thought about this and did some reading about
00:58:50.140 this, I came away with the impression that if you took an MRI of a person and you showed, so this first
00:58:57.240 time this person's getting an MRI, you get the best radiologist to look at it. You get the best
00:59:01.520 computer to look at it. The computer still struggled for macro context. It still didn't even realize that
00:59:07.780 was the liver per se, but it could certainly with greater fidelity and resolution once told that
00:59:14.180 was the liver identify and maybe be more clear about, well, what's a cyst versus what's a hemangioma
00:59:21.800 versus what's a hepatoma. So it had superiority there. It also had superiority when it came to serial
00:59:27.620 studies. So, you know, Mrs. Smith had a chest X-ray a year ago. She has a cough now. She has another
00:59:33.220 chest X-ray. Is there a difference? But am I right in my recollection? No, no. Actually, I'm really glad
00:59:38.700 you put some anchoring on that because what we have, deep learning is in many respects extraordinary,
00:59:45.060 but it's very narrow. So if I say, find me pulmonary nodules in a chest X-ray, that's where I say it can be
00:59:53.260 superior. And clearly the best is the combination, the synergy, the symbiosis of what the machine can,
01:00:00.140 quote, see versus what the doctor could see. So yes, it's a very narrow thing. But what we're
01:00:07.340 talking about here is there's so many mistakes in medicine because things are missed or are
01:00:14.300 inaccurate. And, you know, this extends through pathology and every different specialty.
01:00:18.620 Yeah. Your thesis is not, I mean, many people have said to me when they talk about this sort of loosely
01:00:24.420 that the radiologist is the first doctor on the chopping block. That's not really your thesis.
01:00:29.500 No, I actually think that's completely wrong. Geoffrey Hinton said that once, and I think
01:00:34.540 he will ultimately regret it. The point being is that it basically tees it up. That is, you get a
01:00:41.520 different complementary read of something, and that helps for speed and accuracy, and it could
01:00:49.280 ultimately lower cost, and it could ultimately improve medicine. The thesis of deep medicine is
01:00:53.640 if we lean on machines more. In many respects, we can get into that. But if we do that more,
01:00:59.200 we can free up to have time with patients, and we could get the doctor-patient relationship back to
01:01:05.320 where it ought to be, where it was, you know, some 40 years ago. That's the main premise that is unique
01:01:11.760 about the book in which, you know, I really build up to deep empathy with the last chapter. But the real
01:01:18.220 thing that's different now is that we have lots of promise, lots of potential for AI. We haven't
01:01:26.340 actualized that. We haven't proven it for the most part. You know, one of the only randomized trials
01:01:32.040 to date is in colonoscopy, because a lot of polyps, particularly if they're flat or sessile or small,
01:01:38.000 are missed. And it's very much operator dependent, how much time they take to do a thorough colonoscopy.
01:01:44.660 And so now there's a Chinese randomized trial that shows, hey, you know, if you use deep learning
01:01:49.880 machine vision, you can pick up polyps that are routinely missed. And so then people say, okay,
01:01:56.300 so what? Maybe the ones that are missed are not important. Well, you know, that's where we are
01:02:01.080 today. That's the study, is you look at the denominator of the missed versus the not missed,
01:02:05.040 the machine caught versus not, and what's the prevalence? Because if the prevalence of pathology
01:02:09.580 in them is at least the same, you could argue, they shouldn't be missed. And it might be higher
01:02:14.820 if it's sessile. Yeah. So, you know, 20% of polyps were picked up by machine vision. And then
01:02:21.680 we still don't know how much of that were true disease likely. I've always felt the field, if
01:02:28.600 radiology is the first pit stop on this journey, I've always felt like the ICU needed to be a very
01:02:34.440 close second. How much is really being done there? Because A, it's the, obviously the most data rich
01:02:41.300 environment after radiology, radiology also, of course, informing the ICU, but in terms of just
01:02:47.380 raw numbers coming out about a patient, you know, if you think about a patient on a ventilator with,
01:02:52.520 you know, CVVHD and like you pick, pick every device strapped up to a patient. It's not the same
01:02:58.960 as a formula one car, but you're, you're in the ballpark of that much data. Yeah. And you're
01:03:04.020 touching on one of the big deficiencies of AI and deep learning today, which is multimodal data.
01:03:11.000 So when you have all these inputs of varied types, you know, not just their vital signs,
01:03:17.800 but, you know, could be machine vision of their facial recognition. It could be, you know,
01:03:24.020 so many different parts about that person, no less their prior electronic record. And we don't do well
01:03:31.300 with that because deep learning today is, as I say, narrow task. It's like, you know, what's in this
01:03:36.940 eye ground for ophthalmologists? Is this a diabetic retinopathy? Is this something else? So the ability
01:03:44.260 to take many layers of data, which would be the ICU story is in the early stages, even more so than
01:03:50.300 the image recognition. Yeah. What realistically, where do you think that is in terms of, again,
01:03:56.600 caveating it with the, it's always going to take longer than we think it is. Is this something
01:04:02.140 where, I don't know, in 10 years or in 20 years going to an ICU will afford a patient, the luxury of
01:04:11.900 a true supercomputer. That's basically assimilating the CVVHD data with the ventilator data,
01:04:21.060 with the Swan-Ganz data, like stuff that, as you point out, like it's, it's even the most
01:04:28.700 analytical physician can't really recognize the patterns that are deep within all of those data.
01:04:36.120 Well, you just touched on with that statement, the essence of why we need AI support, not just in an
01:04:43.240 ICU patient, but in every patient, there's more data than we can handle. Especially when you say people
01:04:49.580 are wearing sensors, they're going to be wearing more, people are going to have their genome sequence
01:04:53.340 or they already have a genome chip or array, a microbiome of the gut. I mean, no less all their
01:04:59.880 records, not just the one healthcare place that they happen to be visiting that moment.
01:05:04.400 So this flood of data per person, no less the intensive data collection in an ICU setting,
01:05:11.880 this is overridden human capability. We need machines. We have to fess up that we can't do this,
01:05:17.780 but we also have to acknowledge that we're not there yet now. So when are we going to get there?
01:05:22.040 Well, you know, Fei-Fei Li and the group at Stanford has done ICU work, machine vision to see
01:05:28.380 whether- Is it single machine or is it integrated?
01:05:30.760 Their studies have been mainly single whereby, for example, they're looking to see risk of
01:05:36.620 extubation so that you don't have to have a nurse in the room all the time that what's going on with
01:05:41.900 that patient that they're going to self-extubate or others have looked at likelihood of sepsis or
01:05:47.980 different pieces of the story, but no one has integrated it all yet today. And I think that's
01:05:54.180 where it's headed. We're seeing these hybrid models of bringing the data together, but a lot of the
01:06:00.100 problem with this field has been way-out-of-bounds hype about where it can go. And when I did the research
01:06:07.400 in the book, which involved a few years of work cumulatively, I spoke to a lot of the real gurus
01:06:12.400 in the field and they made it clear that, you know, we are going to get there eventually, but we're not
01:06:18.140 there yet. That is the challenge because when you think about other big breakthroughs that we look
01:06:23.080 back on, we don't realize that they were more discoveries than creations sometimes. So for example,
01:06:31.440 look at germ theory, right? This is, again, it's something we take for granted today. And it's hard to believe
01:06:38.020 there was a day when you wouldn't wash your hands before operating on a patient or you wouldn't wear
01:06:42.420 sterile gloves. So we acknowledge how that transformed medicine in a step function manner, but two things
01:06:49.340 are a little bit missed when we contrast it. One is we didn't have to build it. We accepted it and
01:06:58.040 discovered it. And two, it didn't happen overnight. Like there's still a generation that it takes to
01:07:04.140 implement these things. And so that's the best case scenario, right? It doesn't get any better than
01:07:10.300 that. This is something that has to be built. This is almost, this is, I can't think of an example, maybe
01:07:15.960 I'm wrong. And if anyone's going to think of it, it's you, but is there an example before when we had the
01:07:20.960 idea and the promise, and then we set out to engineer the solution building into it? So for example, I'll give
01:07:27.480 you, I'll give you a failed example, which is the EMR, right? Like literally the worst thing on the
01:07:33.960 face of the earth. I heard you once talk about this and I think, I think it was you actually,
01:07:38.000 but maybe it wasn't, but I'm going to give you credit for it. But I think you best summarize it
01:07:42.900 by saying, look, the EMR was created as a billing solution, not a clinical solution. And I couldn't
01:07:50.420 agree with you more, but there's an example of, okay, we have a problem. Medical records are so
01:07:55.500 cumbersome, so voluminous, although really they're just a two-dimensional problem. It's really not,
01:07:59.480 it's three dimensions if you include time, I would say. And we now have computers, quote unquote. So
01:08:04.920 computers will solve the problem. Let's build it. Well, one, it took a lot longer than people expected.
01:08:10.400 It took much longer to implement it and it sucks much more than people could have ever imagined it
01:08:16.020 could suck. When I think of those examples, I keep saying, is there a positive story? Is there a great
01:08:22.500 case study in medicine where the engineering solution lived up to its expectations?
01:08:28.040 I think you nailed it. I don't believe there is one. This is a unique story that's being written
01:08:34.920 as we speak. It's so different than we have a technology and we just want to implement it. This
01:08:42.240 is one that there's a lot of construction that's still required. We know what the house is likely going
01:08:48.800 to look like when it's built, but we're, you know, still on the foundation stage.
01:08:53.960 Do you think that these are problems that are going to be solved by the giants? Is IBM, is Google,
01:08:59.500 I mean, are these the entities that figure this out or is this going to be solved in more of a pharma
01:09:07.040 model where the early discovery and the early stage, you know, even the stage one, the safety trials are
01:09:14.820 done by small companies that ultimately get acquired by rolled up into larger companies.
01:09:19.440 You know what I mean? Today, like the, the Merck's and the Pfizer's of the world aren't really doing
01:09:22.960 drug discovery anymore. They've decided we're going to outsource that to more nimble companies.
01:09:27.220 And basically the private markets now subsidize that while the public markets subsidize late stage
01:09:32.160 drug development. Do you think that's the way this is going to be? Or do you think this is going to
01:09:35.800 have to start and finish within the behemoth companies that have their enormously deep pockets?
01:09:40.080 I think this is a story of innovation from the outside. I think it's very different than what
01:09:45.180 you're seeing now with, you know, the consolidation and pharma and outsourcing here. You see the big
01:09:50.620 titans like Google and Microsoft, Amazon, and the rest of them, they all recognize this is the
01:09:56.400 greatest opportunity for growth and also some, a noble mission of improving health. So you have
01:10:03.740 that group, you have startups that are, there's no shortage of those. And you also have some
01:10:10.400 governments like in China, in the UK and other places that are nurturing this, that are investing big
01:10:16.080 in this area. So, you know, I think this combined force of multiple entities is where we're going to
01:10:23.160 see this, you know, really take off. It's starting to happen much more in China out of need. That is, the
01:10:29.960 implementation is way ahead of what's going on in the U.S. because they have so few.
01:10:34.340 What are some examples?
01:10:35.460 Well, the radiologists, whereas here, we're just starting to get a bunch of FDA approved algorithms
01:10:40.360 for reading various types of scans. They already have that widespread throughout China. They already
01:10:46.840 are doing, you know, many things on the, well, you know, here the only FDA consumer-approved or
01:10:54.000 cleared one is the Apple Watch for heart arrhythmia, a deep learning algorithm for atrial fibrillation.
01:11:00.420 Whereas.
01:11:00.800 So explain how that works. Let's use that example because that's near and dear to everybody's wrist.
01:11:04.840 And I see you're wearing your Apple watch there as well. So let's just say you went into the
01:11:09.400 Apple store today for the very first time and you bought an Apple watch.
01:11:13.500 Okay. So first of all, it's on the back surface of the wrist, the volar surface of the wrist.
01:11:18.240 And what is it shining through? And I assume it's shining it onto the veins in the back of your arm.
01:11:23.300 Yeah. No, it's, it's picking up optically each heart rate and you can see the light that it's
01:11:28.620 used. And for the deep learning algorithm, which actually was first cleared by a startup,
01:11:35.180 a live core. And then a year later, Apple, which they didn't even acknowledge that they had been
01:11:39.360 a year after the first, but nonetheless on their watch, they get heart rate. So at rest,
01:11:45.660 and then when you are active and then they basically for you, it has your data whereby when
01:11:53.240 you have heart rate at rest, that's off track for you, it says, Hmm, get a cardiogram and you get a
01:12:01.140 one-lead cardiogram when you press the crown for the ECG, and you get a good quality cardiogram.
01:12:06.080 And then if it has atrial fibrillation, which lead does it most closely approximate on the 12 lead?
01:12:11.400 It's lead one. You get a cardiogram read for atrial fibrillation, which is one thing it's pretty
01:12:19.400 good for. I was about to say, not to minimize that, but AFib seems like about the easiest thing
01:12:24.160 to pick up because of the irregularity of it, right? Yeah. Although there are, there is some false
01:12:28.980 positives and negatives because sometimes the P waves, which you're looking to be
01:12:34.940 absent, you know, sometimes you can get faked out. And so it's reasonably good. And, you know,
01:12:39.720 it's in the 90-plus percent accuracy level, but it's all about Bayes' theorem for people.
01:12:46.220 Well, for people who are not at risk, a lot of people have an Apple Watch who are,
01:12:51.040 you know, young and have zero risk of atrial fibrillation, and they get a cardiogram and get
01:12:55.100 anxious and they may even get workups by a cardiologist. So this is a problem where we have
01:13:01.000 marketing of an algorithm, the first deep learning algorithm.
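The base-rate problem Topol raises here, that even a quite accurate detector produces mostly false alarms in a low-prevalence population, follows directly from Bayes' theorem. A minimal sketch; the sensitivity, specificity, and prevalence numbers are illustrative assumptions, not figures from the episode:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Bayes' theorem: probability a positive alert is a true positive."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Older cohort with, say, 5% AFib prevalence:
print(round(positive_predictive_value(0.95, 0.95, 0.05), 3))   # 0.5
# Young, healthy watch wearers with, say, 0.1% prevalence:
print(round(positive_predictive_value(0.95, 0.95, 0.001), 3))  # 0.019
```

Same algorithm, same accuracy, but in the low-risk group roughly 98 out of 100 alerts would be false positives, which is exactly the anxiety-and-workup problem being described.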
01:13:05.440 How long does it take by the way, um, to learn a person well enough that it would be willing to
01:13:10.880 make a recommendation like that? Oh, just a matter of hours. Wow. Or certainly by a couple of days,
01:13:16.380 it's got it down. Okay. But yeah, I mean, you know, your heart resting heart rate by the
01:13:21.340 accelerometers, it knows that you're not moving and why did your resting heart rate used to be 60?
01:13:26.420 Why is it a hundred something? And then it'll tell you to get a cardiogram, but...
01:13:30.400 And it can't make any other diagnosis. It can't diagnose any ventricular rhythm.
01:13:34.140 Not now.
01:13:34.680 Or atrial tachycardia or anything else.
01:13:36.880 Ultimately it should be able to, but those algorithms haven't really been validated yet.
01:13:41.940 But ultimately we know now I use a six lead cardiogram. It isn't on the watch,
01:13:47.140 but you can just do that with sensors and put it on the leg.
01:13:50.300 Wait, wait, how do you do that? That's interesting.
01:13:51.820 Yeah. You know, it's basically half the size of a credit card.
01:13:55.360 Where do you get this? This is, it's an aftermarket product or...
01:13:58.880 No, it's actually marketed now by AliveCor. The one that came with this ECG on the watch first,
01:14:04.740 they actually put it on the Apple watch, but it was their algorithm. They came up with a six lead,
01:14:09.680 which you then put on your leg, your left leg, and then you get all six limb leads.
01:14:14.880 And you do this with your patients as well?
01:14:16.500 Yeah. Every patient, when I see them, instead of just taking their pulse,
01:14:20.260 I also do a six lead cardiogram. It's been remarkably insightful because it's free.
01:14:25.300 It takes a second. And then I can really be much more certain about if they have an arrhythmia,
01:14:30.720 but also diagnose conduction system abnormalities.
01:14:33.880 So it's accurate enough that you can measure your intervals perfectly.
01:14:37.600 Oh my gosh. It's, it's the quality is amazing. Yeah. I mean, the six lead...
01:14:42.420 Now, can you send them home with the same kit and then can they get a six lead on themselves at home
01:14:47.500 and let you see the data?
01:14:49.240 They could. I haven't done that yet, but that's probably where this is headed.
01:14:53.240 The reason why this is actually funny, you mentioned it, Peter, you can even do your own
01:14:58.160 stress test with this.
01:14:59.700 Yeah, of course. In fact, you could do a real stress test, which is in the actual environment
01:15:03.420 under which you need to be stressed.
01:15:05.040 Yeah. I did that the other day. I did a rest ECG and then I got on a bicycle, stationary bicycle,
01:15:10.980 and went really hard. And then I, just after I got off and did a six lead again, I, so I said,
01:15:15.520 wow, you can do a stress electrocardiogram, high quality, six lead, and never go near a medical
01:15:21.460 facility.
01:15:21.640 Where's the output? Where are you seeing the output?
01:15:23.080 Oh, on your phone.
01:15:24.260 Okay. And you can make it a PDF and send it off to your doctor.
01:15:27.860 It makes it automatically. Yes.
01:15:29.740 Huh.
01:15:30.460 Yeah. It's pretty cool.
01:15:33.560 So, I mean, that strikes me as proof of concept now.
01:15:38.600 Yeah. Well, and that's where we're going to get, like you alluded to, when are we going to get all
01:15:42.960 the heart rhythm abnormalities diagnosed and the heart conduction, you know, which is a precursor
01:15:48.100 to arrhythmia. So that's where we're headed because one lead, it's hard to do that. But when you have
01:15:52.660 all the limits, in fact, with AI, you could impute all 12 leads. You don't even need to get the other
01:15:57.880 six leads. So pretty soon we're going to see that six lead become a really valuable entry point for what's
01:16:05.180 going on in a person's heart. In fact, Mayo Clinic just published a series of papers on 12 lead
01:16:10.220 cardiograms that you could get heart function. You could predict from a cardiogram whether they're
01:16:15.020 going to have atrial fibrillation. You can get the potassium level of the blood through that.
01:16:19.680 Wow.
01:16:19.860 I mean, the amount of data that's sitting in this pattern, which we can't see, is amazing.
01:16:26.780 Well, think about the number of times I see this once a month, and my practice is really small. So if
01:16:32.280 I'm seeing this once a month, let's extrapolate to how many times this happens in the United States.
01:16:36.960 The blood hemolyzes slightly on a blood draw and the potassium comes back at 5.5.
01:16:41.820 Oh yeah. Or higher.
01:16:42.820 Yeah. And you don't know what to do.
01:16:44.460 Right.
01:16:45.040 Well, imagine you had that EKG, you wouldn't have to panic because every time that happens,
01:16:49.920 you have to call that patient, send them into the ER, get a blood draw, confirm what you know is
01:16:55.640 likely true, which is the stupid sample hemolyzed. Their potassium is really 4.7.
01:17:00.120 And, but imagine you didn't have to do that. You could just say, push this button on your watch.
01:17:06.600 That exemplifies-
01:17:07.960 That's a great example.
01:17:08.620 It exemplifies what we can't see, but having a machine trained by a million cardiogram,
01:17:15.980 what it can see. And in the book, in deep medicine, I have a chapter and it starts out
01:17:21.320 with that story. How did they discover the potassium story? Something we can't, we can tell
01:17:26.820 the potassium is really high with the tall T waves, you know, but we can't get to the decimal point.
01:17:32.500 Right. We can't distinguish between 4.9, which is do nothing and 5.6, which is you better be
01:17:38.040 careful.
01:17:38.320 Right. And that's what machines are good for. And we're going to be seeing a lot more of that
01:17:42.900 kind of stuff. That is the eyeopening thing to me is to learn about all the things that we humans
01:17:48.700 can't do, that machines can be trained to do. And they're just going to get better over time.
01:17:54.440 So if you, do you wear a Dexcom sensor?
01:17:57.220 I have. Not regularly. I'm not a diabetic, but I have, and I've learned a lot from it. I've,
01:18:02.000 you know, I've tried Dexcom and the Libra. I mean, I've, I've really found this glucose thing
01:18:07.640 because of how it interacts with what you eat, with your sleep, with your physical activity. It's
01:18:13.780 amazing. Yeah. It really is. You know, people ask me why I still wear it because I'm not diabetic
01:18:19.600 and even my patients, so about a third of my patients to a half my patients wear this,
01:18:25.780 none of whom have diabetes.
01:18:27.320 Wow.
01:18:27.840 And I always ask them for 90 days. If every one of my patients would wear it for 90 days,
01:18:31.740 at least I'd be happy. And then we could decide, but what's in, what invariably happens
01:18:35.780 is people realize the, they go through the following cycle, which is Peter, you've been wearing this
01:18:41.340 thing for four or five years. Haven't you already figured out what to eat and what not to eat?
01:18:45.540 And I say, well, yes and no, but it's more complicated because like, for example, let me
01:18:50.700 show you this. I have not eaten anything since 4 PM yesterday.
01:18:56.440 Wow.
01:18:57.100 It's 1130. So I'm coming up on 20 hours of no food.
01:19:00.260 Yeah.
01:19:00.540 Look at the variability in my glucose for the last 12 hours. It's been as high.
01:19:06.340 It peaked at 118, which was right after a workout
01:19:14.440 this morning. And by the way, it was just weight training. It wasn't like high intensity interval
01:19:19.160 training. If it's high intensity interval training, it's going to go much higher.
01:19:21.960 Now it's sitting at 94 and you'd think, well, if knowing that it's 94, like if I ate a bagel right
01:19:30.700 now, could I predict what it would go to just knowing it's 94? The answer is not a chance.
01:19:35.480 You see, just knowing it's 94 isn't enough to tell me my response to the bagel.
01:19:40.380 I have to know how much glycogen I have. I have to know how much cortisol I have. I have
01:19:45.040 to know how much insulin I have. Like there's so many variables and that's why four or five
01:19:50.800 years later, there's nothing about this that is boring to me because I'm constantly learning
01:19:55.980 a new physiologic experiment. I mean, if there's anything that's ripe for AI, it would
01:20:01.340 also be CGM coupled with other data. So in other words, I don't think the CGM data as
01:20:07.720 a, as the input feed would be sufficient. You would have to constantly be pairing it with
01:20:11.760 your activity and other sensors because if we had like, if you had the cortisol sensor
01:20:17.300 and the lactate sensor, I mean, that starts to become remarkable predictive power. And when
01:20:23.060 you could get to the point where, cause this is my dream, I want to know, can I eat that
01:20:27.260 right now or not for my parameters? So this is my pipe dream is I want to be able to say,
01:20:33.540 go into the algorithm and tell it your desired average glucose, your desired variability. So I
01:20:39.560 want to average glucose that's below a hundred milligrams per deciliter or below 110 milligrams
01:20:44.860 per deciliter. I want a standard deviation that doesn't exceed 15 milligrams per deciliter.
01:20:49.880 And now you tell me what I can eat. Spend the first month watching me eat, learning how my body
01:20:57.160 responds to every different food and go from there. I mean, directionally speaking, how long
01:21:03.220 would it take to get us there? We're getting there. I mean, we're chipping away at that. So
01:21:07.180 the gut microbiome is a big part of the story too. And I know you're such a proponent of this and I
01:21:13.160 am, I call myself a gut skeptic because, well, why would I say that? I certainly don't
01:21:21.400 disregard the importance of that. I think I'm waiting to see a great example of how I can use
01:21:28.760 it outside of like the really clear clinical ones. Like certainly knowing how to change the gut
01:21:35.140 microbiome in the context of C. diff colitis is profound. It seems very likely that something about
01:21:42.000 the gut changes in patients with diabetes who undergo gastric bypass, though the change
01:21:47.900 could be as high up as the duodenum. And the most compelling evidence I've seen is that
01:21:51.900 it's the duodenal bypass specifically that gives them this incredible
01:21:58.000 remission out of the gate, more so than anything in the lower GI tract. But I think most of my skepticism comes from
01:22:05.220 the fact that it's not clear to me what to do with all those data, which may be exactly your point
01:22:10.120 that when I see patients constantly show up to me with their gut sequence and they say, well,
01:22:16.380 look at this pathology state here. And I say, well, first of all, I don't know that that's a pathology
01:22:20.000 state. And if it is a pathology state, is taking a probiotic the answer? I don't have any evidence
01:22:26.060 that that's the case either. So is it more a readout state or is it a form, is it a malleable
01:22:32.980 state that we directly interfere with?
01:22:35.280 Right. So those are all important questions. I think the real insight here came when
01:22:41.880 Eran Segal and his group at the Weizmann Institute in Israel did what
01:22:47.440 now has to be seen as a classic study.
01:22:49.740 This was the cell metabolism paper from about a year ago?
01:22:52.680 Well, there was a paper in Cell 2015, which was really the seminal work. And now there's been
01:22:58.280 several more, and it's been replicated by many others. They took thousands of healthy people
01:23:04.320 like yourself, got their gut microbiomes, and gave everyone the exact same amount
01:23:10.840 of food at the exact same time. They also got all their labs and every piece of data they could get on these
01:23:17.280 people. And they found that you could predict, if they had a bagel, which ones were going to spike and
01:23:23.320 what level of glucose spikes they were going to see. And they found so many spikes, you know,
01:23:29.180 even very significant spikes, 160, 180, 200, in healthy people with no sign of diabetes.
01:23:35.680 And how, how durable do you think the knowledge is from the sequence? So for example, like if you
01:23:40.640 sequenced that patient Monday morning at 9am, how much do we know that Friday at 5pm, the data are
01:23:50.360 still relevant, assuming you could even, cause you can't get the data in real time.
01:23:53.860 Right. So they didn't do any DNA sequencing. And of course that wouldn't change. So we don't know
01:23:58.780 the genomic side of this, but we do know the microbiome, unless you do something significant
01:24:03.740 like that. Oh no, sorry. I didn't mean their DNA. I meant the DNA of the bacteria.
01:24:07.680 Okay, good. Yeah. Yeah. The DNA of the microbiome is pretty darn stable everywhere you look at it,
01:24:13.900 unless you make a radical change in your diet, like changing fiber content, or you take antibiotics,
01:24:19.300 but it's very stable from day to day. I see. So you would say that, look, Peter,
01:24:24.400 only if the patient does a course of antibiotics, do we need to recheck them or make a radical dietary
01:24:30.840 change. But if a person's in quasi steady state, you could sequence them every quarter and basically
01:24:37.980 update your pretest probabilities of what the distribution is. That's right. And another tier
01:24:43.420 of complexity, because it's good that you're a skeptic on this, but early on these various
01:24:49.460 companies that would do microbiome assessment, they just said how much you have of this bacteria or
01:24:54.760 that bacteria. It was like a density of bacteria. It turns out, and you touched on it, if you sequence
01:25:00.220 the bacteria, the changes in that bacterial species' sequence are just as important as the density of
01:25:07.080 the type of bacteria. So this is not easy. It's expensive to do it right.
01:25:11.160 And we've already seen a big fraud on this front quite recently, right?
01:25:14.520 Yeah. uBiome.
01:25:15.880 uBiome.
01:25:16.460 Is the Theranos of this space, it seems like.
01:25:18.700 Yeah. Well, they were billing people illegitimately, and they were only reporting on
01:25:22.740 density of bacteria. I don't know that they're reporting-
01:25:24.660 So it wasn't outright fraud. It was just bad practices?
01:25:27.420 Yeah. I don't think they were doing things wrong with respect to the microbiome density. It's very
01:25:33.240 rudimentary. They didn't do sequencing. They did basically a bacterial density of-
01:25:38.380 I mean, I found them to be the most useless company in the history of civilization because
01:25:42.100 back in 2012, 2011, I was, I mean, at least acting like I was on the forefront of this,
01:25:50.420 trying to understand it, and ordering these sequences on myself and all my patients. And
01:25:55.580 I don't understand how this company stayed in business. I mean, they didn't, but they couldn't
01:26:00.040 run a sequence to save their lives.
01:26:01.420 Well, yeah. I think the biggest thing is they were fraudulently billing people, double billing,
01:26:07.200 triple billing, you know, all that sort of thing. That's what got them into, you know,
01:26:10.800 basically collapse mode. I don't know enough about their sequencing.
01:26:14.380 I mean, I always found Larry Smarr's stuff to be the most interesting because Larry's doing it at
01:26:18.520 a level that you couldn't do commercially.
01:26:20.740 Yeah. So shotgun sequencing, where you do true metagenomics, you know, there are only
01:26:25.240 certain labs, like the one I mentioned at the Weizmann. Here in San Diego, the Knight Lab does
01:26:30.340 metagenomics. That is-
01:26:32.620 K-N?
01:26:33.060 Yeah. K-N-I-G-H-T, Robert Knight. So these are the centers that are doing it right,
01:26:38.160 that are sequencing each species of every organism that's found. And we now know that
01:26:44.400 that sequence is equally as important as the type of bacteria. So that's the sort of data. Now,
01:26:51.120 the other thing you're bringing up that's really important is we have no idea how to manipulate
01:26:56.000 the gut microbiome. The only thing we know is a fecal transplant in certain people with,
01:27:01.620 you know, pseudomembranous colitis, C. difficile. Outside of that, we don't have,
01:27:06.300 we have crapsules that are being made that are being tested.
01:27:10.660 So I think you and I definitely see probably more closely on this than I would have guessed
01:27:15.360 initially, because we agree that at this point, it's an output of data, not an input to manipulate
01:27:20.760 necessarily.
01:27:21.280 So I probably need to go back and look at the Weizmann paper again, because I don't think I've
01:27:25.620 looked at it in over a year. And my view was, which is probably incorrect by the way, that CGM
01:27:32.700 and dietary logging would have been sufficient. So what I really want to do is go back and look
01:27:39.160 at that paper and see what the gut biome added above those things, which I'm guessing there is
01:27:44.540 something there.
01:27:45.180 Yeah. No, there is. And we need more. I mean, basically right now is you could predict if you
01:27:50.620 had all the data and the right algorithms, you could predict which foods you'll spike from.
01:27:56.340 And then this was taken to another level by the group in London, King's College, led by Tim
01:28:01.040 Spector. He brought in all these twins from all over the country, identical twins. So they had their
01:28:06.800 gut microbiome, and they also put in a line to a vein to get blood samples for triglycerides.
01:28:12.720 Yep.
01:28:13.220 And they saw the same thing.
01:28:14.720 Which by the way, you could get out of a sensor.
01:28:16.860 You could, you could.
01:28:17.600 Ultimately.
01:28:18.200 Yeah.
01:28:18.420 We don't have one yet.
01:28:19.640 No. And you know why I mentioned lactate earlier? If you have real-time lactate, you
01:28:24.980 are estimating with really great precision, mitochondrial oxidation. Now you understand
01:28:30.460 fuel partitioning. You see, to me, if you asked me a year ago, how would you want to
01:28:35.880 best estimate fuel partitioning? I'd say, oh, it's tough because you got to have somebody
01:28:39.160 basically walk around with a respirator or something that can measure oxygen consumption
01:28:44.800 and CO2 production. But I think lactate's telling you that. I think if you really know how to
01:28:50.060 calibrate lactate, you can estimate fat oxidation versus glycolysis and, or glycolysis through
01:28:57.880 to lactate. And so all of a sudden you now get into this. So the reason that right now
01:29:02.020 my glucose is 94, but if I ate a bagel, it would go to like 104 is because I'm so glycogen
01:29:08.360 depleted because I haven't eaten in 20 hours and I've worked out very hard, or at least for a long
01:29:13.980 enough duration. Conversely, it's not uncommon after dinner. Let's say you have dinner, you have
01:29:19.380 a glucose spike up to 120, and then it comes back down to 90, 94. You eat that same bagel, you'll go
01:29:26.780 higher. Well, a very important input into predicting that is knowing glycogen stores and insulin
01:29:33.620 sensitivity of the muscle and all these other things. So what I need to better get is how many
01:29:39.760 different phenotypes, macro phenotypes of gut biome are there that really matter?
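A predictor of the kind Peter is describing would treat the CGM reading as just one feature among many. A toy sketch, with entirely hypothetical features and weights (not a validated model), only to illustrate why the same starting glucose can produce very different spikes:

```python
def predict_bagel_spike(current_glucose, glycogen_fullness,
                        insulin_sensitivity, cortisol_level):
    """Toy linear model of peak glucose (mg/dL) after a fixed meal.

    glycogen_fullness, insulin_sensitivity, and cortisol_level are
    0..1 scores. All weights are made up purely to illustrate that
    CGM alone is not enough to predict the response.
    """
    base_rise = 20.0                                      # hypothetical rise for a bagel
    rise = base_rise * (0.5 + 0.5 * glycogen_fullness)    # full glycogen stores -> bigger spike
    rise *= (1.5 - 0.5 * insulin_sensitivity)             # insulin-sensitive muscle blunts it
    rise += 20.0 * cortisol_level                         # stress raises it further
    return current_glucose + rise

# Same starting glucose (94 mg/dL), very different predictions:
depleted = predict_bagel_spike(94, glycogen_fullness=0.1,
                               insulin_sensitivity=0.9, cortisol_level=0.1)
fed = predict_bagel_spike(94, glycogen_fullness=0.9,
                          insulin_sensitivity=0.5, cortisol_level=0.6)
print(round(depleted), round(fed))
```

The glycogen-depleted, insulin-sensitive case lands near the modest 94-to-104 rise Peter describes, while the fed, stressed case spikes far higher from the identical starting reading.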
01:29:45.900 Right. The bigger picture though, I agree with all your points, and it'd be nice to see a lactate
01:29:50.740 sensor that's tested at scale and is accurate. It, you know, took a while to get that for glucose, and
01:29:56.040 we're at the earliest stages on lactate. But there's still a lot of naysayers here, and I understand their
01:30:01.760 perspective and that is, so what? So what if your glucose goes to 180 or your lactate goes to this
01:30:07.320 or your triglycerides go to that? The point is, do we know that changing that, keeping everything
01:30:13.700 on a nice, even keel, actually helps? You don't think the diabetes literature has made it clear enough that normalizing
01:30:19.680 glucose and insulin is beneficial? Not enough, no. The only way you can get at this, inferentially,
01:30:27.340 yes. But you know what we've... Oh, you're saying I can tell the story that a glucose of 120 is better
01:30:33.620 than 180 because I have clinical trial data to demonstrate that all day long. And I can even
01:30:38.800 tell you that how you achieve that matters. But you're saying I don't have the data to tell you
01:30:46.640 that 100 is better than 110. No, no. Another way to put it is I don't have the data to show
01:30:53.320 that if you wear a sensor for X number of time, 90 days in your case or forever or a week,
01:30:59.680 that you're learning about avoidance of glucose spikes changes your prognosis. We don't know that.
01:31:06.580 And the same thing for triglycerides, which by the way, they don't correlate. And we're learning that
01:31:11.200 each person's individual response throughout their day is so incredibly unique. And we're learning some
01:31:19.160 of the factors. I don't even know that we know all the factors that influence it. You've mentioned,
01:31:23.020 of course, glycogen stores and physical activity and the microbiome.
01:31:27.140 And cortisol, in my experience, plays a staggering role.
01:31:30.120 Stress. No question. Stress. I mean, you know, just if you get an intermittent,
01:31:34.060 intervening cold, no less stress, you know, in your family, your life, whatever experience.
01:31:40.280 So yeah, this is a really interesting area. We're learning about ourselves. It's like, you know,
01:31:44.680 know thyself sort of thing. We're in the early stages. And I see the skeptics. I understand their
01:31:50.220 perspective. I think that we have to prove it. I lean where you are, which is why not have this
01:31:56.560 information? I've learned a lot about myself, no less the feel from it. But I think we have to admit
01:32:03.020 that we got a ways to go. What do you think's a significant blind spot in medicine today at the
01:32:09.060 macro level or at any level? No, I think the biggest blind spot is how poor we are in diagnosis,
01:32:16.660 no less in treatment. I mean, I think that when you really look hard at the data,
01:32:20.960 what's amazing, Peter, is you see all these clinical trials that declare, you know, a triumph.
01:32:26.780 And they're helping three out of a hundred people. I mean, a great example of statins that,
01:32:31.920 you know, primary prevention of statins, if not the number one, close to number one class of drugs
01:32:37.820 that are used today. And you see that out of a thousand people for primary prevention,
01:32:43.920 988 derive no benefit for five years of taking a statin and 12 out of a thousand get benefit.
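The figures Topol quotes translate directly into a number needed to treat (NNT); a quick check of the arithmetic:

```python
def number_needed_to_treat(events_avoided, people_treated):
    """NNT = 1 / absolute risk reduction."""
    arr = events_avoided / people_treated
    return 1 / arr

# 12 out of 1,000 benefit over five years of primary-prevention statins,
# per the figures quoted above:
nnt = number_needed_to_treat(12, 1000)
print(round(nnt))  # 83: treat ~83 people for five years to help one
```

That is, an absolute risk reduction of 1.2% implies treating roughly 83 people for five years for each person who benefits, which is the "988 derive no benefit" framing restated.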
01:32:51.460 So whether you look at the diagnosis where, you know, if a doctor...
01:32:54.820 That's another topic I know that we may disagree on. My view on that has always been that
01:32:59.660 because the time course of atherosclerosis is so long, you know, it's a disease that begins in
01:33:05.040 infancy. We certainly know from, you know, the starry stuff of the, you know, the seventies that,
01:33:11.340 you know, basically by the time you're 18 years old, most people have a stage three lesion.
01:33:16.880 At that point, the challenge with studying primary prevention is you could never study it
01:33:22.060 long enough to really see where those curves start to diverge.
01:33:25.720 Well, no, they diverge. But the question is, are they going to keep diverging? And, you know,
01:33:29.820 most of the benefit starts to kick in right around 18 months. And yes, they're still slightly
01:33:34.940 diverging, you know, after five years, but we don't have any data beyond that. So I should restate
01:33:39.840 that: we don't have any proof, but you can suggest that instead of 18 out of
01:33:46.960 a thousand people benefiting, it goes to 36. Exactly. But what about the 970 that don't derive benefit?
01:33:53.260 No, no, no, no. That's fair. I think the point is, do you believe the Mendelian randomizations
01:33:58.940 or do you think that the Mendelian randomizations have artifacts in them, which any Mendelian
01:34:05.280 randomization will have an artifact if that which changes the variable of interest also changes
01:34:10.320 something else that you don't know? That's always the blind spot of the
01:34:13.400 Mendelian randomization. No, they're a neat way to get a readout, but they're not perfect.
01:34:17.120 You know, I think that whether you look at diagnosis where, you know, if you take people
01:34:22.260 who have died under a doctor's care and you say, why did that person die? What was the cause of
01:34:27.860 death? 40% of doctors who say, I absolutely know the answer, are wrong. 40%. That's how many
01:34:39.040 autopsies show a different cause of death from what was diagnosed pre-mortem. Wow. That's
01:34:44.980 a big gap. Yeah. If you ask doctors to make a diagnosis, when it comes to mind in
01:34:50.600 the first five minutes, they're 95% accurate. But if it doesn't come
01:34:56.000 to mind in the first five minutes, accuracy drops down to 24%. Basically what we have is type one
01:35:02.740 thinking, System 1 from Kahneman's work. And we just are
01:35:08.420 reflexive. We don't reflectively go over anything. We don't have time. We don't simulate all the data.
01:35:14.260 We can't, because there's so much. And so the big hole is that we have to acknowledge
01:35:21.600 that we can do far better, and it can't all be through human support. We need help.
01:35:28.340 Do you think there's any area where, I think you've made a compelling case that machine plus
01:35:33.380 human should be better than human. Eventually it will be. I mean, I think even in radiology today,
01:35:39.960 it should be. Oh yeah. Yeah. Hopefully critical care would be an amazing place where machine plus
01:35:45.140 human should be better than human. Do you think that there are extremes on either way? Do you think
01:35:49.620 there are places where humans will always be better than machines plus humans?
01:35:53.880 Yes. And that's in being human, which is the bond, you know, just like our conversation,
01:36:00.520 this deep conversation, it's really, uh, illustrates the human connection. We don't have
01:36:06.660 those kinds of conversations in seven minutes or 10 minutes where the patient can't, and, uh,
01:36:11.760 you can't get to know a person's life history, their story. That's never really going to get
01:36:16.600 digitized. We interrupt patients within 18 seconds. We don't listen. So the point being is that
01:36:23.900 machines don't have context. They're not going to be able
01:36:29.100 to truly understand all the nonverbal communication and the real issues of a person that are deep.
01:36:37.740 And that's where the humanity, we need to bring it back. I mean, the essence of medicine is that,
01:36:44.000 and it's been lost. It's become this big business and it's,
01:36:48.360 Do you worry about the, um, I don't know, well, it's not really a brain drain, but do you worry that
01:36:52.840 the war for talent has shifted? And back when you went into medicine, it's probably safe to assume
01:36:59.760 you were one of the best students in your high school, the best student in college. Like it
01:37:03.680 would have been skimming at the very, very top of the pyramid of students. Is that still the case
01:37:09.360 today? Or has, I mean, you've already alluded to a number of things that I've seen. Luckily I don't
01:37:15.080 experience them just on the nature of my practice because it's private. But I mean, these stories you
01:37:20.120 tell, I know so well, the doctor that gets seven minutes to see a patient. And in that seven
01:37:24.920 minutes, six minutes is typing into an EMR, the moral distress and the absolute erosion of a belief
01:37:34.780 in what you're doing is a huge cause of burnout amongst physicians. And I don't understand like
01:37:41.880 if you're the top student in college and you're interested in life sciences, why you'd pick medicine
01:37:46.240 today? Unless you had a profound confidence that you could carve a path distinct from what most are
01:37:52.220 probably going to do. I mean, you have to be an optimist, I think, to pursue medicine today.
01:37:57.120 Do you worry that it's going to be hard to recruit the most talented kids out of college into medical
01:38:03.820 school? Well, I'm hoping it won't be, but you know, I do hear constantly friends who are doctors who
01:38:10.580 tell their kids, don't go into medicine. Yes. I hear that. I mean, that's really bad because here
01:38:17.660 is the ultimate profession for sense of really helping people. And then you have the people who
01:38:24.120 are in it saying it's horrible. And as you know, Peter, the physician burnout, no less all clinician
01:38:30.300 burnout is at peak. And why is that? It's because they've become data clerks and they're squeezed for the
01:38:37.600 time. They can't care for people when you don't have time. So this is, of course, going back to
01:38:44.640 that main thesis of gift of time. We can get that. But I think we have to restore medicine
01:38:50.900 the way it was in order to attract the talent that you're referring to. And I think it's doable.
01:38:58.360 It's not going to be easy. It's going to require a lot of activism, which we haven't had that much of
01:39:03.040 in medicine. Yeah. This is something doctors are quite ill-equipped to do. It seems.
01:39:09.160 We're seeing the light on activism. The gun control, NRA really brought it out when they said,
01:39:15.460 stay in your lane. This was when the AMA said a physician should be asking a patient if they own
01:39:20.600 a gun. It was the American College of Physicians. They published it in the Annals of Internal Medicine last
01:39:26.140 fall. And then NRA said, you know, these doctors should stay in their lane. And then you had all
01:39:32.100 the doctors came out. One of them, Judy Melinek, saying, this is my fucking lane. And, you know,
01:39:37.220 it went everywhere. It went viral. The idea being, if doctors are going to be killed potentially by
01:39:41.920 patients, it's not an unreasonable question to ask. Well, that and caring for all the gunshot
01:39:46.420 victims. Yeah. You know, the emergency room. And the suicides. And the suicides. Which is probably the
01:39:50.940 greatest cause of gun death. Yeah. You've got both suicides and homicides and mass killings and,
01:39:56.420 you know, AR-15s and all this stuff. So, you know, the fact that this was taboo, that doctors
01:40:00.980 didn't, weren't allowed to talk to patients and they weren't allowed to do research. I mean,
01:40:05.500 there was no research in this area. So this was a void that now has brought out a lot of the activists
01:40:11.980 and social media. And it's this new era of young physicians, a lot of them women. And we're seeing
01:40:20.300 activism like never before. I wrote about it in the New Yorker recently about this and, you know,
01:40:24.920 should doctors organize. And this, what we're seeing, it is happening. And the hope is that
01:40:30.300 we're going to see an organization take hold where all doctors can join as well as ultimately
01:40:35.780 patient advocates and others. And that we will turn this around because this is the biggest concern I
01:40:43.300 have, Peter, is that we're going to see AI kick in more and more over the years. But what will it do
01:40:49.080 if the administrators who are the overlords, who are overrepresented as compared to the people
01:40:57.000 taking care of patients, if they keep the squeeze on, we're just going to see things get worse.
01:41:02.600 So we have to override that. And the only way we're going to do that and turn inward and get the
01:41:07.640 humanity back in medicine is to have doctors organize. And the gravitas of a million doctors
01:41:14.620 in America, all being part of one entity, it could be enormous.
01:41:19.140 You know, and this, this gets to another problem within medicine that I alluded to earlier, which
01:41:23.360 is the patients aren't footing the bill directly and therefore aren't driving demand in a
01:41:30.480 demand-based system. So if it's not a demand-based system, it doesn't matter. Like in other words,
01:41:34.340 in the NHS or the Canadian system, the patient is not footing the bill, but they're also not driving
01:41:39.760 the demand. The demand is budget set. But in the US where you have this paradox, it also means the
01:41:45.800 patient's voice doesn't matter. And that's the irony of it, right? So it's the opposite of the DC
01:41:52.120 license plate, you know, which is taxation without representation. It's like no taxation and no
01:41:58.320 representation. And that's why I don't, I worry that patients can't be the ones to drive this change,
01:42:05.240 which they should be able to, like the patients should demand this change, but because they're
01:42:11.440 not the ones writing the checks to feed that $3.8 trillion machine directly, they're only doing
01:42:17.260 it indirectly. In other words, they don't get to control it, right? They're paying their taxes
01:42:21.840 and their employers are withholding it, but they, it's not the same as saying, here's my dollar,
01:42:28.020 go and do this thing with it.
01:42:29.880 Yeah, no, that's why if you get the doctors to come together to start this, and the sole purpose
01:42:35.240 is the patient-doctor relationship. It's not about better reimbursement or all the other
01:42:41.760 trade guild activities, but rather it's about, we want to fix this relationship and bring the
01:42:46.980 humanity back in medicine. Then we start to see, you know, that the ability for that patient interest
01:42:53.080 to be recognized. Because now all you have is you got a lot of patient advocacy groups,
01:42:57.720 but they're, you know, they're just like the doctor organizations, they're all balkanized.
01:43:01.760 We need one entity to stand strong. And, and I, I hope we'll get there.
01:43:07.120 Well, that's a good point, right? It should be made up of physicians and patients, actually.
01:43:10.760 It really shouldn't be, they shouldn't be separate.
01:43:12.480 No, no, I think, you know, it started with the physicians because, you know, there's just a million
01:43:16.920 of them that we can identify and get them together as many as possible. Then you start adding on
01:43:22.620 Are there really a million physicians in the United States?
01:43:25.120 Yeah. You know, the fact checking of the New Yorker is amazing when they, I've never experienced
01:43:29.480 that before, but they tracked down everything and they got to the numbers that I never could get
01:43:34.620 to. So not all of them are practicing. There are almost 900,000 actually practicing
01:43:40.420 in some respect, but there are a million docs in this country.
01:43:43.520 Have you done the math about how many doctors we would need under this new regime of empathic,
01:43:52.420 intelligent, artificial, this symbiotic relationship? In other words, because, because
01:43:57.380 there's a bunch of moving pieces, right? It's radiologists still exist, but now they're there
01:44:02.740 to talk more with patients and interpret the diagnosis as opposed to make the diagnosis, right?
01:44:08.840 And you're still going to need the internist, but now they have more time with the patient and
01:44:14.140 they don't have to worry about the diagnosis as much as they have to worry about the treatment
01:44:18.780 directionally. Eric, do we need the same number of physicians? Are we going to need more physicians
01:44:24.800 or are we going to need less physicians in 30 years on a per patient basis?
01:44:29.400 You know, when I did the UK review, we got into that and we had economists and all sorts of brilliant
01:44:35.980 people modeling on that. And I think what we'll see is even though everything now would suggest we
01:44:42.440 need a lot more doctors because of the aging population and all the comorbidities and the
01:44:46.600 complexity. I mean, if you just look at like, for example, how do you care for a patient with cancer
01:44:51.100 today? It's gotten very complex. So you would project we're going to need, you know, a steep growth
01:44:57.420 curve. But what we're going to see, I think, is a big blunting of that because we are going to be,
01:45:02.780 the machine story is not just about doctors relying more on machines getting support. It's also about
01:45:09.240 consumers, patients. And so when you get the outsourcing and the offloading, you start to
01:45:14.960 see a pretty big, and then when you get rid of the hospital story and just have surveillance centers,
01:45:20.260 remote monitoring, you start to see much less need for expansion. So I don't see that we're going to,
01:45:26.440 you know, have a decline, but just a difference in the curves as they go forward over the next few
01:45:31.980 decades. And there will also be this, I don't know, for lack of a better word, kind of a growing pain
01:45:37.560 as people transition. I mean, most of the radiologists I know today, you know, for example,
01:45:43.480 have very technical backgrounds. I mean, any of the MRI folks I know, they usually have a great
01:45:48.860 background in physics and things like that. So all of a sudden there's going to be a different
01:45:52.660 selection criteria. For example, you may want to choose radiology independent of how technical
01:45:58.120 your background is in physics or mathematics. And so it's like, it takes a generation to make these
01:46:04.020 switches. Do you see any other types of changes in how people will select into different specialties?
01:46:10.960 Well, that's hard to know. I think we have theorized, Saurabh Jha and I from Penn Radiology,
01:46:18.080 that there might be a new specialty that would just be radiology and pathology combined,
01:46:23.060 at least the pathologists who work with slides, because it's very similar interaction with the
01:46:29.040 computer. And they're often very much, as you know, integrated. So that might be a whole new specialty
01:46:33.960 over time. But, you know, overall, one thing we don't want to forget here is that the empowerment
01:46:40.060 of patients to do doctorless diagnoses of most common conditions, whether that's an ear infection
01:46:46.660 of a child or a urinary tract infection, a skin rash or skin lesion, and on and on. The routine things
01:46:54.520 are not going to have a doctor in the loop, except perhaps if treatment is needed. And that's
01:46:59.660 only in the U.S., not in a lot of other countries. So that is going to change also specialties. Because
01:47:05.300 today, you look at pediatrics, you know, it's a wonderful specialty, but a lot of that could be
01:47:10.280 decompressed if you give parents more autonomy for their kids. So we're going to see lots of changes
01:47:17.000 based on the patient side or the parent side of things, which I think has not been adequately
01:47:23.240 appreciated. If you could conduct any experiment and there were no limits on the resources,
01:47:29.420 you had. So you could, and it could be a, an experiment within the real recesses of basic
01:47:36.700 science. It could be a real translational experiment that takes something to the, from the cutting edge
01:47:43.320 of the bench to the bedside, or it could be the largest clinical trial ever done to test a question
01:47:50.320 that vexes you without any new introduction of a new technology, but just simply asking a question
01:47:56.220 like the Vioxx one, for example, you know, take as much time as you want, but I'd love to know
01:48:01.220 what would be a dream experiment for you if this is the one shot on goal where you've got billions of
01:48:08.540 dollars and no holds barred. Yeah, no, it's easy. Oh, it is. Yeah. For me, I mean, I'm, I, it's a dream.
01:48:15.080 You know, I wrote about this with Kai-Fu Lee in Nature Biotech last month, and it was called
01:48:20.520 It Takes a Planet. And basically the experiment would be to develop a
01:48:25.300 planetary digital infrastructure with all the data of each individual, continually being assessed
01:48:33.380 and assimilated, processing inputs with federated AI. So the data never leaves the country, whether it's China
01:48:39.540 or the U.S. or wherever. So it's not a privacy or security issue. But the point being is when a new person
01:48:46.000 comes in and we want to prevent a condition or better treat it, and we have billions of other
01:48:51.260 people to draw from, and we have these digital twins, if you will, because today we learn from
01:48:58.400 clinical trials and it's really farcical in many respects because those clinical trials are contrived
01:49:03.620 and the benefit is three per hundred or something like that. What about the other people?
01:49:07.540 Well, not only that, it's often three per hundred because of the heterogeneity of the population.
01:49:12.100 Exactly.
01:49:12.520 It might be 30 per hundred if you knew who to apply it to.
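The dilution effect in this exchange can be made concrete with a toy calculation. Only the 3-per-100 and 30-per-100 figures come from the conversation; the 10% responder fraction below is a hypothetical number chosen so the arithmetic works out, not a figure from the episode:

```python
# Toy illustration of how population heterogeneity dilutes a trial's
# apparent treatment effect. All numbers are hypothetical.

def observed_benefit(responder_fraction, benefit_in_responders):
    """Absolute benefit per patient across the whole trial, when only a
    subgroup responds and non-responders get zero benefit."""
    return responder_fraction * benefit_in_responders

# Suppose the true responders see 30 per 100 benefit...
benefit_in_responders = 30 / 100

# ...but responders are only 10% of the enrolled population.
overall = observed_benefit(0.10, benefit_in_responders)
print(f"{overall * 100:.0f} per 100")  # prints "3 per 100"
```

The same arithmetic run in reverse is the argument being made: a trial reporting 3 per 100 is consistent with a 30-per-100 effect concentrated in a subgroup you could have identified in advance.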
01:49:15.540 Yeah. So my experiment would be just for that. You have pinpoint precision, because I know
01:49:21.060 Peter's twins, all of them around the world, and what treatment they got, and what outcomes they
01:49:27.580 got, and how I could prevent the issue they otherwise would have had. And so it would be to develop
01:49:33.080 the ultimate learning health system.
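The "federated AI" mechanism Topol invokes, where model updates travel but raw data never leaves each country, is typically implemented as federated averaging. A minimal sketch under invented assumptions (two sites, a two-parameter linear model, synthetic data), not a description of any real system:

```python
import numpy as np

# Minimal sketch of federated averaging: each "country" trains locally on
# data that never leaves its borders; only model parameters are shared.

def local_update(weights, X, y, lr=0.01, steps=100):
    """One site's gradient-descent update for a linear model y ~ X @ w."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_w, sites):
    """Average the locally updated weights, weighted by site size."""
    updates = [local_update(global_w, X, y) for X, y in sites]
    sizes = np.array([len(y) for _, y in sites], dtype=float)
    return np.average(updates, axis=0, weights=sizes)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
sites = []
for n in (200, 500):  # two sites with different sample sizes
    X = rng.normal(size=(n, 2))
    sites.append((X, X @ true_w + rng.normal(scale=0.1, size=n)))

w = np.zeros(2)
for _ in range(20):  # a few "communication rounds"
    w = federated_round(w, sites)
print(w)  # approaches [2, -1] without pooling any raw data
```

The design point is the one made in the conversation: the central coordinator only ever sees parameter vectors, so the privacy and data-sovereignty objection is addressed architecturally rather than by trust.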
01:49:34.800 And the twin you're defining is obviously not just genetic, but it's every layer, every layer.
01:49:40.820 So it's my gut twin, my epigenetic twin, or my approximate genetic twin, my phenotype twin,
01:49:47.820 my metabolic twin.
01:49:49.580 Right. And that's, I think where we can go in the future. And it involves many different types of AI.
01:49:56.780 And I think we'll get there someday. And it's an experiment.
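One computational reading of the multi-layer "twin" idea is nearest-neighbor search over concatenated, standardized data layers. The sketch below is a toy under invented assumptions (layer names, dimensions, and the synthetic cohort are all illustrative; real twin-finding would need far more sophisticated similarity measures):

```python
import numpy as np

# Toy sketch: find a patient's closest "digital twins" by z-scoring each
# data layer (genomic, metabolomic, ...) and searching for nearest
# neighbors in the combined feature space.

def find_twins(cohort_layers, patient_layers, k=3):
    """cohort_layers / patient_layers: dicts mapping layer name -> array.
    Cohort arrays are (n_people, n_features); patient arrays are (n_features,).
    Returns indices of the k nearest cohort members."""
    blocks, query = [], []
    for name, X in cohort_layers.items():
        mu, sd = X.mean(axis=0), X.std(axis=0) + 1e-9
        blocks.append((X - mu) / sd)                    # standardize so no
        query.append((patient_layers[name] - mu) / sd)  # layer dominates
    C = np.hstack(blocks)
    q = np.concatenate(query)
    dists = np.linalg.norm(C - q, axis=1)  # distance to every cohort member
    return np.argsort(dists)[:k]

rng = np.random.default_rng(1)
cohort = {
    "genomic":     rng.normal(size=(1000, 50)),
    "metabolomic": rng.normal(size=(1000, 20)),
}
# Make cohort member 0 a near-duplicate of the patient.
patient = {name: X[0] + rng.normal(scale=0.01, size=X.shape[1])
           for name, X in cohort.items()}
print(find_twins(cohort, patient))  # member 0 should rank first
```

In the learning-health-system framing, the treatments and outcomes of those nearest twins would then inform the new patient's care.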
01:49:59.280 Is that a 50-year thing? I mean, what, realistically, is...
01:50:02.720 I think it could be done in 20 if we were, if we really were going after it, because it's doable.
01:50:08.340 You know, the question is.
01:50:09.140 That strikes me as bigger than any one country though, right?
01:50:11.860 No, no, but if you get the U.S. and China to start it, because of the diverse populations
01:50:16.920 and the largeness, the U.S. being third in population in the world, and you get that going,
01:50:22.220 then the rest of them join on. You know, pretty quickly, you will have twins.
01:50:25.980 Do you have a sense of how much that could cost per person?
01:50:29.440 It just depends on the layers of data and the analytics. I mean, it's not going to be
01:50:33.940 trivial, but less than you would think. This is mostly, you know, in silico work.
01:50:40.180 You know, it's happening anyway. This data all sits in places, all fragmented.
01:50:46.120 What's the price of a full exome sequence today? Or a full genome sequence
01:50:51.840 today? So not a 23andMe, but where they're doing the full sequence.
01:50:54.440 Yeah. Well, an exome is about $400; a full genome, $800 to $900. So, you know, they're-
01:50:59.340 So they're sub a thousand.
01:51:00.420 They're going to keep coming down. And at scale, perhaps, you know, that's going to happen even
01:51:05.780 faster. But you start having all these things done at scale and, you know, right now we don't
01:51:12.560 have enough. The reason why a genome isn't valuable is because we don't have a billion people with a
01:51:18.460 sequence and a phenotype. Once we start to get into those big numbers, then we start to decode it.
01:51:24.020 So you don't think it's that the genome is not deterministic enough? You think it's
01:51:28.260 too small an N so far?
01:51:29.960 That's a bigger problem?
01:51:30.920 Big part of it. Yes. I mean, a genome will never be fully deterministic. I mean, that's
01:51:34.960 not possible. Probabilistic, yes. But the probabilistic side of it is hampered because
01:51:40.580 of inadequate numbers.
01:51:42.180 Yeah. Because I got to tell you right now, I find every time my patient sends me their genome,
01:51:47.360 I just roll my eyes and say like-
01:51:49.640 No, there's a limited amount of value.
01:51:51.380 Yeah. Well, you know what I usually say to them? I say, look, anything in here that matters,
01:51:55.820 we already know. You know, if you're a 40-year-old, unless you were adopted, you know,
01:51:59.840 sometimes you figure out, you know, this person has Lynch syndrome or something that you had to know.
01:52:04.660 So there are some numbers on that. If you look at the Danville, Pennsylvania,
01:52:09.540 Geisinger Health System, where they've done over 100,000 people, they have excellent data,
01:52:14.920 not full genomes, but the coding elements. And in 5% they find something quite
01:52:21.100 important, so-called pathogenic, like Lynch syndrome or BRCA or a sudden death arrhythmia.
01:52:27.820 Which again, my point is, if you're doing your job as a doctor, you should have figured that out
01:52:32.500 in the family history and gone looking for it, but not with, I mean, in other words, you should
01:52:36.680 have gone to the genome to be confirming what you suspected, hopefully.
01:52:39.880 Yeah, but you know, a lot of that's missed. Like BRCA is a perfect example. You know,
01:52:43.760 what about BRCA men carriers? You know, you just don't know.
01:52:47.780 You're taking a good family history, don't you? I mean, I guess it depends on how much the patients
01:52:51.880 know about their family too, but you're right. No, those are good examples. But when I look at the,
01:52:57.100 like how many times do I look at Promethease and it spits out, oh, you're at a higher risk for
01:53:01.060 atherosclerosis and a higher risk for diabetes. And I'm like, this is such nonsense, right? Like if you
01:53:06.140 actually understand how to evaluate lipids and you're wearing a CGM, you certainly don't need
01:53:10.240 this thing to tell somebody. No, Promethease, I mean, I think there are gross deficiencies in its outputs,
01:53:15.900 because, again, going back to it, we don't have one central repository of data. Oh gosh. How small is
01:53:24.640 it? In terms of how many millions of people have been sequenced, it's still small, you know, it's
01:53:30.720 less than 10 million for sure for whole genome sequencing.
01:53:37.360 And then of course, how many do we have accurate phenotyping on? Because if the phenotype is
01:53:41.960 not that accurate, then it dilutes the quality of what you're trying to do, right?
01:53:46.140 That's essential because what phenotype we have is fixed at the moment the genome was assessed.
01:53:51.920 Exactly. We don't know. The phenotypes change. So all the studies that we have-
01:53:55.520 So this has to be living, breathing, and longitudinal.
01:53:57.720 Exactly. And that's why I'm trying to see our way through. Like, you know, the All of Us study
01:54:03.520 of the million people that we're onto right now is the beginning of something like that,
01:54:07.280 where all the layers of data allow long-term follow-up. It's still tiny, a hundred thousand.
01:54:11.980 I mean, a million people is tiny, but it's a start. But if we could get the leading countries in the
01:54:17.520 world to get behind this, you know, this is something that should override concerns about
01:54:23.720 competition in countries, this is about, you know, for mankind, humankind, then we might be able to
01:54:30.160 really develop something that would promote health of all human beings. That would be far-reaching.
01:54:36.860 And you know what? I actually think this is going to happen someday. I know it sounds far-fetched.
01:54:40.740 I know you think, looking at me like I'm a little cuckoo.
01:54:43.120 No, I mean, look, a lot of things seem far-fetched in the moment. Truthfully,
01:54:47.640 I think it's more feasible technologically
01:54:53.700 than it is politically.
01:54:55.360 Okay, well, that's good to hear. I like that.
01:54:57.000 Because I don't, I think the biggest challenge, I'm doing the back of the envelope math,
01:55:02.080 just, I think it's a couple trillion dollars to do this in the United States alone.
01:55:06.940 I don't know. Depends on how long, I mean, this is over forever.
01:55:11.900 Exactly. Which means it's a moonshot. And I don't feel like our political environment
01:55:19.200 is capable of moonshots anymore.
01:55:21.300 We're just trying to live day to day here now.
01:55:23.020 Yeah. So long gone are those days when you could make a bold pledge: we're going to spend the
01:55:29.200 equivalent of a couple trillion dollars over 20 years, long after I'm gone, meaning me,
01:55:35.160 meaning the politician who's going to be the torchbearer of this, to make this a reality.
01:55:39.820 I don't know. Maybe I'm just a jaded, skeptical guy when it comes to our political system.
01:55:43.840 Well, it's actually healthy to be looking at it that way. And also, I want to also frame it,
01:55:49.640 Peter, as it is an experiment, because you still have to, once you develop it at some scale,
01:55:54.540 you still have to prove that it's helping people.
01:55:56.660 Right. So you'll need to use a biased and an unbiased subset of these.
01:56:00.220 Yeah. So, you know, it's theory, it's intriguing, it's doable, and it's going to get progressively
01:56:05.820 better, you know, what we can put in as inputs, but it's still a question mark. Will it improve?
01:56:11.940 I just think that our complete reliance on clinical trials is misguided.
01:56:17.560 I completely agree. I think the heterogeneity problem and the exposure, the time under the
01:56:22.140 curve, the exposure problem, make clinical trials very difficult to extrapolate from. I mean,
01:56:27.760 here's one of my favorite pet peeve examples is people are so quick to dismiss Zetia as a useful
01:56:34.120 drug. But in reality, it's never once, to my knowledge, been targeted towards patients that
01:56:39.340 are hyperabsorbers of sterols. Right.
01:56:41.580 And yet, you know, so Zetia gets sort of diluted in clinical trials because you're giving it to
01:56:46.260 patients that have normal and abnormal absorption of sterols. And so on balance, it doesn't look like
01:56:54.000 a very interesting drug. It seems to work okay with a statin, but I'm convinced there are patients
01:56:58.820 out there taking statins who should be taking Zetia because if you, you can phenotype this,
01:57:03.160 you can really see people who don't make that much cholesterol, but they absorb it like crazy.
01:57:08.580 It's amazing.
01:57:09.440 But who's going to do that clinical trial, right? No one's going to do that clinical trial.
01:57:12.640 The lack of interest from the people that manufacture the drug, because of dilution
01:57:17.720 of the market. I mean, it's really unfortunate, but you're absolutely right about that.
01:57:21.340 Well, Eric, this has been a really interesting discussion, and I'm glad.
01:57:25.260 It's crazy that it took a decade for us to meet. I mean, I'm amazed you knew my name. I'm flattered,
01:57:31.040 but I've certainly known about you from the day I got to San Diego, and I'm just glad that the
01:57:36.700 podcast and your book became a good excuse to sit down. So thank you. Thank you for your work,
01:57:41.060 most importantly, but also thank you for making the time today. I know you've talked about this a lot
01:57:44.920 and I'm sure you didn't necessarily feel like talking about the book
01:57:48.020 and some of these stories over and over again, but I know that people listening to this are
01:57:51.240 going to appreciate it. Well, thanks, Peter. We talked about things I actually haven't really
01:57:56.000 gotten into in the past, but I also, you know, really enjoyed the great intellectual exchange with
01:58:03.140 you. It's fun. And I hope we'll have a chance to get together a lot more in the years ahead.
01:58:07.100 Oh, we certainly will. We're, we're almost neighbors. So it has to happen. Thanks, Eric.
01:58:10.940 Thank you.
01:58:12.680 Thank you for listening to this week's episode of The Drive. If you're interested in diving deeper
01:58:16.820 into any topics we discuss, we've created a membership program that allows us to bring you
01:58:21.220 more in-depth, exclusive content without relying on paid ads. It's our goal to ensure members get
01:58:26.720 back much more than the price of the subscription. Now to that end, membership benefits include a
01:58:32.060 bunch of things. One, totally kick-ass comprehensive podcast show notes that detail every topic, paper,
01:58:38.060 person, thing we discuss on each episode. The word on the street is nobody's show notes rival these.
01:58:43.360 Monthly AMA episodes, or Ask Me Anything episodes. Access to our
01:58:50.000 private podcast feed that allows you to hear everything without having to listen to spiels
01:58:54.660 like this. The Qualys, which are a super short podcast, typically less than five minutes, that we
01:59:00.260 release every Tuesday through Friday, highlighting the best questions, topics, and tactics discussed
01:59:04.860 on previous episodes of The Drive. This is a great way to catch up on previous episodes without
01:59:10.180 having to go back and necessarily listen to every one. Steep discounts on products that I
01:59:15.500 believe in, but for which I'm not getting paid to endorse, and a whole bunch of other benefits that
01:59:20.080 we continue to trickle in as time goes on. If you want to learn more and access these member-only
01:59:24.680 benefits, you can head over to peterattiamd.com forward slash subscribe. You can find me on Twitter,
01:59:31.260 Instagram, and Facebook, all with the ID peterattiamd. You can also leave us a review on
01:59:37.200 Apple Podcasts or whatever podcast player you listen on. This podcast is for general informational
01:59:43.140 purposes only and does not constitute the practice of medicine, nursing, or other professional
01:59:47.580 healthcare services, including the giving of medical advice. No doctor-patient relationship is
01:59:53.720 formed. The use of this information and the materials linked to this podcast is at the user's own risk.
01:59:59.640 The content on this podcast is not intended to be a substitute for professional medical advice,
02:00:05.120 diagnosis, or treatment. Users should not disregard or delay in obtaining medical advice
02:00:11.460 for any medical condition they have, and they should seek the assistance of their healthcare
02:00:16.060 professionals for any such conditions. Finally, I take conflicts of interest very seriously. For all of my
02:00:22.940 disclosures and the companies I invest in or advise, please visit peterattiamd.com forward
02:00:29.620 slash about where I keep an up-to-date and active list of such companies.