Making Sense - Sam Harris - July 03, 2019


#162 — Medical Intelligence


Episode Stats

Length

36 minutes

Words per Minute

143.8

Word Count

5,181

Sentence Count

259

Hate Speech Sentences

4


Summary

In the wake of the assault on journalist Andy Ngo by Antifa in Portland, Oregon, Sam Harris opens with a housekeeping segment about what has been happening in the streets of that city and why we should be worried about it. Ngo is a journalist and editor at Quillette, an online magazine that is often unfairly described as "conservative"; it is really a centrist magazine that has spent a lot of time criticizing the insanity on the left, and it has been branded by the far left as enabling fascism, racism, xenophobia, and Islamophobia. Harris argues that the attack, and the apparent impunity with which these protests have been allowed to proceed, is a symptom of a breakdown in social order and civil society, one that appears to be largely a confection of social media, and that defending or excusing the violence is both ethically wrong and politically self-defeating for the left. He then turns to the main conversation: a discussion with cardiologist Eric Topol about his book Deep Medicine, covering soaring medical costs and declining health outcomes in the U.S., the problems of both too little and too much medicine, the failure of electronic health records, and how artificial intelligence could transform the field, mostly for the better but also in some ways for the worse.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.440 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.140 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.520 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.880 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.780 Welcome to the Making Sense Podcast.
00:00:48.980 This is Sam Harris.
00:00:51.320 Okay, I'm in lovely London, getting ready to record some podcasts.
00:00:55.860 It really is lovely.
00:00:57.060 The weather is perfect.
00:01:00.040 That makes London especially nice.
00:01:03.360 So I'm going to record a housekeeping here, and then get out of my hotel room.
00:01:09.760 A few things to say that have no relationship to today's podcast.
00:01:14.920 I am recording this right after the Andy Ngo assault in Portland, a few days after.
00:01:23.600 This strikes me as entirely the product of Twitter, or of social media in general.
00:01:32.220 This is like a physical manifestation of all that is crazy online.
00:01:38.720 I think these protests probably wouldn't occur.
00:01:43.740 Andy Ngo, the journalist who was attacked, probably wouldn't have been there.
00:01:49.840 All of the acrimony and insanity that one witnesses in the aftermath would have no forum.
00:01:56.480 It's a very strange phenomenon.
00:01:57.920 I'll catch you up for those of you who don't know what I'm talking about.
00:02:00.300 Andy Ngo is a journalist and editor at Quillette, which is an online magazine that's often unfairly described as being conservative.
00:02:09.940 It's conservative in the way that the IDW, the Intellectual Dark Web, is conservative.
00:02:16.520 It's really just a centrist magazine that has spent a lot of time criticizing the insanity on the left.
00:02:27.140 So it is branded by the left, certainly the far left, as conservative, if not enabling of fascism and racism and xenophobia and Islamophobia.
00:02:39.960 All of those things have been alleged.
00:02:43.380 Now, I don't know Andy.
00:02:44.520 I think I met him once very briefly.
00:02:47.640 He covered the release of the documentary Islam and the Future of Tolerance, which depicted my collaboration with Maajid Nawaz.
00:02:55.060 I don't know his personal politics, and his politics are absolutely irrelevant to what happened in Portland.
00:03:03.640 I didn't contribute much to the resulting cacophony on Twitter.
00:03:08.600 I posted one thing, but I'll just say a few things here.
00:03:12.580 So what has been happening in Portland, apparently, is that Antifa, the so-called anti-fascist cult,
00:03:21.880 has been demonstrating periodically and allowed to do so with real impunity by the mayor, Ted Wheeler.
00:03:33.500 And my one tweet on this topic tagged him.
00:03:38.660 It seems to me he's been totally irresponsible in the scope he has given to these protests.
00:03:45.080 I mean, I've seen video with Antifa stopping traffic and pulling people out of cars.
00:03:52.660 It's madness.
00:03:53.760 It's a complete breakdown of social order.
00:03:56.580 And in the video where you see Andy Ngo attacked, that's what you witness.
00:04:02.520 A complete breakdown in social order.
00:04:05.820 And apparently the police in Portland have been told not to intervene by the mayor.
00:04:10.900 Anyway, this is the kind of story that will be picked up by the right wing.
00:04:18.320 You know, Andy Ngo will be on Fox News talking about his attack.
00:04:22.220 One can only hope that mainstream sources like the Washington Post and the New York Times
00:04:28.900 will talk about Antifa honestly here.
00:04:33.040 I mean, Antifa is often described as a group of people who are protesting the extreme right.
00:04:40.060 Well, they may be doing that, but they're also attacking innocent bystanders and journalists.
00:04:49.520 So what we have here is a group that imagines it opposes fascism, but they behave just like fascists.
00:04:57.880 And perhaps this is no surprise.
00:05:00.620 If you travel far enough to the right or to the left on the political spectrum, you find yourself surrounded by sociopaths.
00:05:09.560 And Antifa, while there may be some blameless members of this movement, seems to be chock full of sociopaths.
00:05:18.120 At least judging from their handiwork that you can see attested to in these videos.
00:05:23.280 But anyway, the response to this phenomenon, which again is a total breakdown of civil society, right?
00:05:31.200 You've got people who are attacking nonviolent bystanders in a context which, again, appears to be a pure confection of social media.
00:05:43.840 Because most of the people in these protests, most of the members of Antifa, you see, are also filming.
00:05:51.880 I mean, everyone has their phones out or their cameras out, filming themselves to broadcast this online.
00:05:59.300 It is a bizarre moment.
00:06:01.220 Anyway, the video that shows Andy getting attacked starts after the attack has occurred.
00:06:07.860 I mean, there's a few other videos, so you can sort of triangulate on this.
00:06:11.620 But the video that's widely being shown is one which starts after he's already been hit at least once.
00:06:19.580 And then you see someone run up and hit him twice in the face as hard as he can.
00:06:25.680 And then I think the same attacker then returns a moment later to kick him in the groin twice as hard as he can.
00:06:35.000 There's a few things to point out about this.
00:06:36.380 When you punch someone in the face as hard as you can, especially when they're not prepared for it,
00:06:45.420 I mean, you just blindside them, there is absolutely no guarantee that you're not going to kill them, right?
00:06:53.680 I mean, people get hit in the face, knocked out, they fall down, they hit their head on the pavement, and they die, right?
00:07:03.280 This happens.
00:07:04.200 It's not a high probability way to murder somebody, but it's not an especially low probability way of doing it either, right?
00:07:12.920 Especially if you know how to throw a punch.
00:07:15.300 I mean, if you knock someone out cold and there's only concrete to catch their fall,
00:07:21.960 you can certainly kill someone this way.
00:07:24.840 So you should be morally prepared to deal with that aftermath, right?
00:07:30.300 To know that that's what you're doing, and to know that you may very well spend a long time in prison as a result of what you've done.
00:07:40.920 And I might add, in prison, you might meet some real neo-Nazis and aspiring fascists to keep you company.
00:07:49.240 And that's actually what one hopes for these people in the video.
00:07:54.000 If you think this is effective political work so as to get people to worry more about authoritarianism
00:08:04.120 and about the heavy-handedness of the state and about the rise of the far right,
00:08:10.060 it has absolutely the opposite effect.
00:08:13.800 You know, you see a few videos of Antifa.
00:08:16.980 You want the far right to show up,
00:08:19.700 and you certainly want the state to clamp down on this kind of behavior.
00:08:25.000 This has absolutely the opposite political effect.
00:08:28.980 It will guarantee four more years of Trump, at a minimum, were this kind of thing to become more commonplace.
00:08:37.540 And what's especially damaging is for the left to get this so wrong ethically online.
00:08:45.440 Here you have leftist journalists from Slate and Vice and other organizations supporting this attack on Andy.
00:08:57.920 At the very least, blaming him for having brought it on himself, right?
00:09:01.800 For being there.
00:09:02.980 Why were you there in the first place?
00:09:05.200 You knew that all your prior coverage of Antifa caused them to hate you, right?
00:09:10.400 This is just so wrong-headed.
00:09:12.940 If the left can't get this right, if liberals can't get this right,
00:09:17.320 we have some very dark days ahead.
00:09:20.100 Anyway, back to the attack.
00:09:21.800 So, he gets punched in the face twice.
00:09:25.020 He gets kicked twice.
00:09:26.980 Then he gets milkshakes and eggs thrown at him and dumped over him.
00:09:33.900 These are not people who have hit him in the face themselves.
00:09:37.720 These are people who, upon witnessing a totally non-violent person,
00:09:42.460 get punched in the face hard twice and kicked in the groin.
00:09:48.400 Their contribution to this moment is to then hurl a milkshake or an egg at him
00:09:54.260 or some other object.
00:09:56.040 He gets hit with other things as well.
00:09:58.400 It's not clear from the video.
00:10:00.560 I'll also point out that the person who punched him in the face was wearing black gloves.
00:10:04.120 A lot of these guys wear these tactical gloves that have reinforced knuckles.
00:10:08.820 You know, some people ride motorcycles with these gloves,
00:10:13.340 but these are also gloves that members of the military wear.
00:10:17.100 It's not like getting punched in the face with a naked fist.
00:10:21.880 Imagine kind of hard plastic knuckles being built into vinyl gloves.
00:10:28.400 So, that only makes things worse.
00:10:32.080 So, watch the video and rewind it and just follow each beat in it.
00:10:38.880 You'll see a few people trying to protect Andy.
00:10:42.560 But, this whole thing is so ugly and it could get so much worse so quickly.
00:10:49.420 There's been some discussion about whether or not the milkshakes that were being thrown at Andy
00:10:55.680 actually had quick-drying cement in them.
00:10:59.000 Cement, apparently, is quite caustic and therefore can burn you.
00:11:03.740 This stuff is being thrown in his eyes.
00:11:06.320 So, I don't know if that was the case,
00:11:08.360 but the whole thing was ghastly and made especially so
00:11:14.300 because in the aftermath, you saw people who have reputations they should worry about
00:11:22.500 defending this violence and ridiculing anyone who complained about it.
00:11:30.480 Or they'll immediately pivot to,
00:11:32.720 well, what about where were you during Charlottesville, right?
00:11:36.080 Or putting kids in cages at the border is worse, right?
00:11:40.700 That what-aboutery completely misses the point.
00:11:45.400 Yes, there are many things to complain about and worry about.
00:11:49.840 And I spent a fair amount of time talking about what's wrong with Trump
00:11:54.580 and what could become far worse with him, given another four years.
00:12:00.360 And I'm also concerned about the far right.
00:12:03.040 But I'm concerned about the complete breakdown of moral intelligence
00:12:08.020 in the mainstream left at moments like this.
00:12:13.940 This is a crystal clear and very dangerous violation
00:12:18.300 of the most basic norms of civil society.
00:12:23.580 Attacking a journalist, beating him and publicly humiliating him
00:12:28.960 for merely covering a public protest.
00:12:32.660 It should be impossible for liberal people
00:12:36.540 to get their analysis of this wrong.
00:12:40.060 And yet, they reliably do.
00:12:43.560 Anyway, that was the big thing that happened in the last few days.
00:12:47.580 It bears absolutely no relationship to the topic of today's podcast.
00:12:51.960 And now I will move on.
00:12:54.440 Today I'm speaking with Eric Topol.
00:12:57.600 Eric is a world-renowned cardiologist
00:13:00.080 and the executive vice president of the Scripps Research Institute.
00:13:05.880 He's actually one of the top ten most cited medical researchers
00:13:09.920 and the author of several books,
00:13:13.980 The Patient Will See You Now,
00:13:16.020 The Creative Destruction of Medicine,
00:13:18.500 and the book under discussion,
00:13:20.280 Deep Medicine,
00:13:21.380 How Artificial Intelligence Can Make Healthcare Human Again.
00:13:24.360 And we do a deep dive into the current state of medicine.
00:13:28.580 We talk about why we have soaring medical costs
00:13:31.780 and declining health outcomes in the U.S.
00:13:34.440 We talk about the problems of both too little and too much medicine.
00:13:40.300 Talk about how slowly the field has adopted useful technology.
00:13:46.220 And then we get into the current status of AI in medicine
00:13:50.540 and how it could completely transform the field
00:13:54.460 for the better, mostly, but also in ways for the worse.
00:13:58.880 Anyway, I found it a fascinating conversation.
00:14:02.240 I felt it brought me up to speed with these rapid changes.
00:14:06.620 And now, without further delay, I bring you Eric Topol.
00:14:09.660 I am here with Eric Topol.
00:14:17.680 Eric, thanks for coming on the podcast.
00:14:19.860 Oh, great to be with you, Sam.
00:14:21.600 So, if I recall correctly,
00:14:24.120 we met at a whole genome sequencing conference.
00:14:29.900 And I was impressed both with the promise
00:14:32.580 of sequencing the genome at that point
00:14:34.880 and also impressed in the aftermath
00:14:36.980 that there seemed to be almost nothing to do with the information.
00:14:41.120 It felt like it was a few years too early.
00:14:43.420 I mean, are we at a point now where
00:14:44.880 if we had met at that conference,
00:14:46.780 there'd be more that would be actionable?
00:14:49.140 Are we still in kind of a place
00:14:51.280 where there's not a lot to do
00:14:53.160 with one's whole genome being sequenced?
00:14:56.020 Well, it's definitely improving.
00:14:58.080 So, whereas when we first met,
00:15:00.220 it might have been less than 1% chance
00:15:02.440 it would be actionable,
00:15:03.900 now it's getting up to 5%.
00:15:04.980 So, it's definitely getting better,
00:15:07.900 but we still have a ways to go.
00:15:09.320 And it'll take having like a billion people
00:15:11.700 with whole genome sequencing
00:15:12.900 and all their data
00:15:13.740 to finally make it very informative.
00:15:16.860 Well, it is cool,
00:15:17.980 but we're sort of,
00:15:18.560 I mean, we're going to talk about this
00:15:19.760 in some depth
00:15:20.940 in response to your new book,
00:15:23.880 Deep Medicine,
00:15:24.700 where you're talking about
00:15:25.860 how we can use AI,
00:15:28.400 not just with respect to genetics,
00:15:31.640 but really all of medicine.
00:15:32.940 But before we dive in,
00:15:34.580 what's your background as a physician?
00:15:38.520 I'm a cardiologist.
00:15:39.960 I started in practice in cardiology in 1985.
00:15:44.860 So, I've been kind of an old dog
00:15:47.100 30-some years now.
00:15:50.040 Yeah, and then you started
00:15:51.600 the Scripps Institute
00:15:53.360 for Translational Medicine?
00:15:55.140 Yes.
00:15:55.820 That was back in the beginning of 07.
00:15:57.980 It was basically a new,
00:16:01.760 broadened mission of Scripps Research,
00:16:04.300 which had been since 1923
00:16:06.940 a basic science institute.
00:16:08.800 And this is really the applied limb,
00:16:10.740 which is giving it
00:16:11.880 a lot of translational
00:16:13.240 medical research capabilities.
00:16:16.560 Right.
00:16:16.780 So, I can start with a big picture
00:16:18.800 before we get into
00:16:20.200 the high-tech discussion here.
00:16:22.880 It does seem that medicine
00:16:25.280 is broken in many ways,
00:16:27.980 and our discussion
00:16:28.740 will mostly be focused on the U.S.
00:16:31.420 In the U.S., we spend,
00:16:33.760 you know, I have this from your book,
00:16:35.520 $11,000 per person per year
00:16:38.360 on medicine.
00:16:40.140 And, you know, that's still climbing.
00:16:42.420 In 1975, I think it was
00:16:44.000 something like $550.
00:16:46.180 And yet our outcomes
00:16:47.900 don't compare very well
00:16:49.840 with the rest of the developed world.
00:16:51.800 How do you account for that?
00:16:53.580 And how do you view
00:16:55.440 the rising expenditure
00:16:57.880 and seeming plateauing
00:17:00.320 or, in some cases,
00:17:01.160 declining outcome measures?
00:17:03.680 Well, you're absolutely right
00:17:04.720 about the numbers, Sam.
00:17:05.840 And I think the basis of this,
00:17:09.040 which is outcomes
00:17:10.200 of not just lowered life expectancy
00:17:12.820 now in the U.S.
00:17:14.120 three years in a row,
00:17:15.180 which is unprecedented,
00:17:16.280 but also extends
00:17:17.480 to all the important metrics
00:17:19.500 like infant mortality,
00:17:21.460 childhood mortality,
00:17:22.740 maternal mortality,
00:17:23.980 and on and on.
00:17:25.060 So when you look at
00:17:26.200 why has the model
00:17:28.320 in the U.S.
00:17:29.720 gone south,
00:17:31.000 you start to see,
00:17:32.160 well, there's two
00:17:32.820 likely explanations.
00:17:35.460 A big one
00:17:36.320 is that
00:17:37.380 we have
00:17:38.320 major inequities
00:17:40.180 in our care.
00:17:41.120 We don't provide care
00:17:42.280 for all citizens,
00:17:43.240 unlike all the other countries
00:17:44.880 that are being compared with.
00:17:47.740 The other extreme
00:17:48.920 is that we overcook,
00:17:50.880 that we do too much,
00:17:52.300 so the people
00:17:53.060 who have coverage,
00:17:54.780 they get over-tested,
00:17:56.800 over-treated,
00:17:57.780 and that leads
00:17:59.000 to all sorts of problems,
00:18:00.280 including bad outcomes.
00:18:01.840 So we've got
00:18:02.920 lots of serious problems.
00:18:04.960 Yeah.
00:18:05.380 Well, I must say,
00:18:06.700 I feel like
00:18:08.500 I have a fair amount
00:18:09.820 of experience
00:18:11.280 with the latter problem
00:18:13.540 of too much medicine,
00:18:15.340 or at least
00:18:15.620 too much medicine
00:18:16.380 being offered,
00:18:18.320 and it's often said
00:18:19.800 that, you know,
00:18:20.520 we have the best medicine
00:18:21.560 in the world
00:18:22.160 if you're
00:18:23.480 well-off
00:18:24.980 or well-connected,
00:18:26.680 and yet
00:18:27.760 I always find it
00:18:29.280 incredibly humbling
00:18:30.840 and fairly depressing
00:18:32.600 how hit-or-miss
00:18:34.300 my encounters
00:18:35.040 with medicine are.
00:18:36.560 I'm not a doctor,
00:18:37.740 but my background
00:18:39.140 in neuroscience
00:18:39.620 gives me, you know,
00:18:40.840 a better-than-average
00:18:42.040 position
00:18:43.000 as a consumer
00:18:43.720 of medicine,
00:18:44.400 but I also find
00:18:46.160 whenever I get put
00:18:47.500 into the machinery
00:18:48.480 of the medical system,
00:18:50.500 whether it's because
00:18:51.120 I'm sick
00:18:51.680 or because someone
00:18:52.260 close to me is sick,
00:18:53.620 one of my kids is sick,
00:18:56.000 rather often
00:18:57.060 I experience
00:18:58.680 a fairly
00:18:59.620 tortuous adventure
00:19:01.360 where,
00:19:02.760 as you said,
00:19:03.540 too much medicine
00:19:04.540 is offered
00:19:05.040 or it could be drugs
00:19:06.700 with serious side effects
00:19:07.860 that are kind of
00:19:08.620 dispensed
00:19:09.220 with a totally
00:19:10.320 cavalier attitude.
00:19:12.540 Risky procedures
00:19:13.380 are recommended
00:19:14.420 almost reflexively,
00:19:16.740 and, you know,
00:19:17.740 there's a whole process
00:19:19.300 of declining
00:19:20.200 to go down this path
00:19:22.060 rather often,
00:19:23.440 and then, as you know,
00:19:24.500 most conditions
00:19:25.020 are self-limiting,
00:19:25.880 and then you feel
00:19:26.620 totally justified
00:19:27.400 for having declined,
00:19:28.760 and then, you know,
00:19:29.600 there's experiences
00:19:30.860 where, you know,
00:19:31.520 scary diagnoses
00:19:32.700 are given
00:19:33.380 only to be overturned
00:19:34.760 by a second opinion,
00:19:36.360 and diagnostic tests
00:19:38.440 are ordered
00:19:39.040 where it's revealed
00:19:40.980 that there really
00:19:42.720 is no thought
00:19:43.900 as to basically
00:19:45.120 the doctor
00:19:45.580 was going to recommend
00:19:46.500 the same treatment
00:19:47.860 or the same lifestyle change
00:19:49.260 regardless of what
00:19:50.240 showed up
00:19:50.760 on that particular test.
00:19:51.720 I mean, it's just,
00:19:52.380 I find my encounters
00:19:53.320 with medicine
00:19:54.020 weird
00:19:55.160 almost, you know,
00:19:56.660 more often than not,
00:19:58.120 and this is,
00:19:59.320 and I consider myself
00:20:00.180 to be probably
00:20:01.020 in the most fortunate
00:20:03.760 possible position
00:20:05.040 with respect to
00:20:06.060 being a consumer
00:20:06.740 of medicine,
00:20:07.360 and yet,
00:20:08.020 with a possible exception
00:20:09.120 to your own,
00:20:10.340 where you're a celebrated
00:20:11.960 physician, right?
00:20:13.380 You're a physician with,
00:20:14.920 you know,
00:20:15.140 you're not just
00:20:16.020 an average physician,
00:20:16.980 you're a very connected one,
00:20:19.740 and, you know,
00:20:21.000 you've made significant
00:20:22.280 contributions to your field,
00:20:23.820 and yet you open your book
00:20:26.220 with a totally harrowing
00:20:28.500 encounter
00:20:29.000 with your own,
00:20:31.000 you know,
00:20:31.600 medical history.
00:20:33.800 I'm sure you've talked
00:20:34.800 about this a lot
00:20:35.400 because you open your book
00:20:36.440 with it,
00:20:36.780 and it's fairly arresting,
00:20:37.880 but perhaps just give us
00:20:40.200 your experience with,
00:20:41.980 you know,
00:20:42.520 something like medical
00:20:43.760 malpractice,
00:20:44.780 which you as a physician
00:20:46.360 still, it seems,
00:20:49.300 couldn't protect yourself from.
00:20:51.200 Right.
00:20:51.700 Well, Sam,
00:20:52.160 it was harrowing.
00:20:53.320 That was a good word
00:20:54.880 to assign to it.
00:20:56.860 I was having a knee
00:20:57.780 replacement.
00:20:59.280 It was almost
00:20:59.860 three years ago now,
00:21:01.160 and I had thought
00:21:02.720 it would be
00:21:03.440 pretty straightforward
00:21:04.560 because I was
00:21:06.060 pretty physically fit
00:21:07.320 and, you know,
00:21:08.640 thin and relatively young
00:21:10.640 compared to a lot of people
00:21:11.740 who have knee replacements,
00:21:12.980 and I had referred
00:21:13.780 many patients
00:21:15.060 to the same orthopedist,
00:21:17.080 so I had some confidence,
00:21:19.320 but what happened was
00:21:20.800 I had a disastrous
00:21:22.420 post-operative complication,
00:21:24.740 which I didn't even,
00:21:25.380 I'd never heard of the word
00:21:26.620 arthrofibrosis,
00:21:28.400 and part of that really was
00:21:30.280 I had a high risk
00:21:31.320 that I didn't know about
00:21:32.300 because I had a congenital condition
00:21:34.260 called osteochondritis dissecans,
00:21:36.660 which set me up for that,
00:21:38.040 so this really was horrendous.
00:21:43.500 You know,
00:21:43.760 I couldn't sleep.
00:21:45.360 I was in pain.
00:21:46.560 I was taking opiates,
00:21:48.440 and I went,
00:21:49.840 showed up,
00:21:50.400 you know,
00:21:51.460 with all this,
00:21:52.920 you know,
00:21:53.180 really bad state
00:21:54.680 with my wife
00:21:55.580 to the orthopedist
00:21:56.820 about a month
00:21:57.540 after the surgery,
00:21:59.340 and he said to me,
00:22:00.940 I need to get
00:22:01.560 some anti-depression medications.
00:22:04.180 Right.
00:22:04.780 And I said,
00:22:05.700 what?
00:22:06.420 You know,
00:22:06.780 so this is
00:22:08.020 like the shallow medicine,
00:22:09.860 you know,
00:22:10.340 robotic.
00:22:11.140 I mean,
00:22:11.340 here's a human expert
00:22:12.960 who did the surgery.
00:22:14.740 That wasn't the issue.
00:22:15.820 It was the post-operative care,
00:22:17.380 and I think
00:22:19.400 that's telling.
00:22:20.420 I think that
00:22:21.000 almost everyone now
00:22:22.900 who I talk to
00:22:23.700 has had either
00:22:24.600 on their own
00:22:25.600 or their family members,
00:22:26.960 loved ones,
00:22:27.680 have had a roughed-up experience,
00:22:29.860 and that's what it was for me.
00:22:32.260 Yeah,
00:22:32.520 so
00:22:32.820 maybe this doesn't account
00:22:34.780 for
00:22:35.440 your experience.
00:22:37.740 I mean,
00:22:37.800 on some level,
00:22:39.120 there's a fair amount
00:22:40.400 of bad luck there,
00:22:42.080 and also just,
00:22:43.140 I mean,
00:22:43.380 obviously the diagnosis
00:22:44.520 was missed,
00:22:45.080 or your risk potential
00:22:46.820 for that complication
00:22:47.980 was missed,
00:22:49.140 and we could talk about
00:22:49.960 the way in which AI
00:22:50.960 might make that
00:22:52.120 less likely to happen,
00:22:54.040 but I don't know.
00:22:54.720 It feels like
00:22:55.160 there's just a problem
00:22:56.300 in the culture
00:22:58.460 of medicine.
00:23:00.060 I mean,
00:23:00.200 medicine is kind of
00:23:01.760 a priesthood.
00:23:02.840 I mean,
00:23:03.040 it's like the way
00:23:03.920 people relate
00:23:05.280 to doctors
00:23:05.960 is a far less
00:23:07.960 straightforward
00:23:08.600 transaction
00:23:10.460 with respect to
00:23:11.760 the use of
00:23:12.840 another person's
00:23:13.500 expertise,
00:23:14.040 disease,
00:23:14.480 and
00:23:15.420 it's difficult
00:23:17.220 to navigate
00:23:17.840 for almost anyone
00:23:19.840 because,
00:23:20.440 in part,
00:23:20.740 it's the subject matter.
00:23:21.720 I mean,
00:23:21.820 you're dealing
00:23:22.200 in some,
00:23:23.160 in many cases,
00:23:24.340 either with life and death
00:23:25.440 questions
00:23:26.320 or
00:23:26.900 a concern,
00:23:28.500 a legitimate concern
00:23:29.220 about,
00:23:29.680 you know,
00:23:30.440 significant disability
00:23:32.000 or suffering
00:23:32.660 or risk,
00:23:34.300 and
00:23:35.000 I don't know.
00:23:36.200 We know so much
00:23:36.940 about how
00:23:37.860 impossible it is
00:23:40.040 for people
00:23:41.160 to navigate
00:23:43.300 their own
00:23:43.840 cognitive biases.
00:23:45.040 I mean,
00:23:45.160 we know that
00:23:45.740 physicians are
00:23:46.880 making diagnoses
00:23:48.900 based on
00:23:49.860 their clinical
00:23:50.760 experience
00:23:51.440 in ways
00:23:51.900 that really
00:23:52.620 distort the,
00:23:53.980 you know,
00:23:54.420 their sense
00:23:54.620 of probability
00:23:55.300 and the accuracy
00:23:56.100 of diagnosis
00:23:56.760 is way off.
00:23:58.460 I mean,
00:23:58.560 this is something
00:23:58.860 you touch
00:23:59.180 in your book
00:23:59.660 by reference
00:24:00.360 to Danny Kahneman
00:24:01.380 and Amos Tversky's
00:24:03.120 work.
00:24:03.860 There's something
00:24:04.340 about the culture
00:24:06.280 that,
00:24:07.280 again,
00:24:07.620 we haven't yet
00:24:08.220 introduced robots
00:24:09.380 into the equation
00:24:10.500 here,
00:24:10.820 but I mean,
00:24:11.640 can you say
00:24:11.920 anything about that?
00:24:12.680 I mean,
00:24:12.860 my impression
00:24:14.620 here is fairly
00:24:15.460 inchoate,
00:24:16.160 but I just realized
00:24:16.880 that there's,
00:24:17.820 I mean,
00:24:18.000 just the process
00:24:19.120 of,
00:24:19.540 you know,
00:24:19.820 getting second
00:24:20.420 opinions
00:24:20.940 is often weird
00:24:22.360 and what you do
00:24:23.580 with opinions
00:24:24.500 that can't be
00:24:25.200 reconciled
00:24:26.040 and how do you
00:24:27.960 see the effect
00:24:29.100 of putting on
00:24:29.720 a white lab coat
00:24:30.840 on,
00:24:31.380 you know,
00:24:32.700 the conversation
00:24:33.400 and the
00:24:35.060 relevant cognition?
00:24:36.860 Right.
00:24:37.140 Well,
00:24:37.760 you're touching on
00:24:39.380 this medical
00:24:40.020 paternalism,
00:24:41.540 which is
00:24:42.160 the sense
00:24:43.720 that,
00:24:44.160 you know,
00:24:44.400 doctor is
00:24:45.100 a know-all
00:24:46.100 entity
00:24:46.620 and that
00:24:48.260 wasn't as big
00:24:49.940 a problem
00:24:50.540 decades ago
00:24:51.700 when there was
00:24:52.940 a lot of trust,
00:24:54.300 there was presence,
00:24:55.160 there was a deep
00:24:55.720 relationship
00:24:56.540 and really
00:24:57.700 an intimacy,
00:24:59.140 an inner human bond.
00:25:00.840 But what's happened
00:25:01.560 over time
00:25:02.280 is that paternalism
00:25:03.820 has sustained
00:25:06.120 and at the same
00:25:07.860 time,
00:25:08.240 there's very
00:25:08.580 little time
00:25:09.180 with patients.
00:25:10.720 It's very
00:25:11.300 much a lack
00:25:12.860 of presence
00:25:13.380 because,
00:25:13.900 you know,
00:25:14.280 doctors are
00:25:14.980 looking at
00:25:16.240 keyboards
00:25:16.760 and they
00:25:18.200 really don't
00:25:18.720 have the time
00:25:19.560 to cultivate
00:25:21.240 a relationship.
00:25:22.400 So,
00:25:22.900 it's gotten
00:25:23.380 much worse.
00:25:24.860 It's the same
00:25:25.480 problem,
00:25:26.180 the basic
00:25:26.560 problem of
00:25:27.400 the kind of
00:25:28.380 authority,
00:25:29.180 control,
00:25:30.040 don't question
00:25:30.880 my opinion.
00:25:31.840 What do you mean
00:25:32.520 that you need
00:25:33.120 a second
00:25:33.520 opinion
00:25:33.980 when everyone
00:25:35.020 should be
00:25:36.000 entitled
00:25:36.400 and feel
00:25:37.080 very comfortable
00:25:37.780 to have
00:25:38.420 that second
00:25:38.880 opinion?
00:25:39.340 But this
00:25:40.160 doesn't
00:25:41.660 fit in
00:25:42.680 any longer
00:25:43.180 because
00:25:43.680 there's not
00:25:44.420 a relationship.
00:25:45.640 It's eroded
00:25:46.580 so seriously
00:25:47.420 over the last
00:25:48.760 three or four
00:25:49.440 decades.
00:25:51.620 It's interesting,
00:25:52.520 despite how much
00:25:53.920 we're spending
00:25:54.500 on medicine
00:25:55.140 each year,
00:25:56.660 and again,
00:25:57.080 the costs
00:25:58.060 are just going
00:25:58.620 up and up,
00:25:59.620 the field
00:26:00.200 is actually
00:26:00.920 very slow
00:26:02.240 to adopt
00:26:03.040 new technology.
00:26:04.680 And this is
00:26:05.320 something that
00:26:05.880 we've all
00:26:06.980 noticed the
00:26:07.760 transition to
00:26:08.520 electronic
00:26:09.520 health records,
00:26:11.040 which has
00:26:11.840 seemed somewhat
00:26:13.300 dysfunctional
00:26:14.100 and somewhat
00:26:15.420 haphazard.
00:26:16.920 I mean,
00:26:17.200 that just
00:26:18.680 feels like
00:26:19.140 as far as
00:26:20.160 this adoption
00:26:20.640 of tech
00:26:21.260 medicine is,
00:26:23.040 apart from
00:26:23.940 the introduction
00:26:25.240 of some
00:26:25.660 new scanner
00:26:26.500 from time
00:26:27.060 to time,
00:26:27.920 it seems
00:26:28.640 more like
00:26:29.260 the FAA
00:26:29.960 dealing with
00:26:31.000 old equipment
00:26:31.660 than it
00:26:33.380 looks like
00:26:34.280 Silicon Valley
00:26:34.960 dealing with
00:26:35.540 the latest
00:26:36.320 breakthrough
00:26:37.400 in consumer
00:26:38.580 tech.
00:26:39.120 How do you
00:26:39.920 view medicine
00:26:41.140 and tech
00:26:41.740 in general?
00:26:43.460 Yeah,
00:26:43.680 it's a pretty
00:26:44.140 sad story.
00:26:45.840 A lot of
00:26:46.220 people think
00:26:47.240 digital medicine
00:26:48.420 arrived with
00:26:49.240 the electronic
00:26:49.700 health record,
00:26:51.040 and that was
00:26:51.860 an abject
00:26:52.900 failure,
00:26:53.980 a disaster,
00:26:54.660 because when
00:26:55.860 those were
00:26:56.260 introduced,
00:26:57.600 they were set
00:26:58.520 up for
00:26:59.000 billing purposes
00:26:59.920 without any
00:27:01.320 consideration
00:27:01.880 of how
00:27:02.840 that would
00:27:03.160 affect either
00:27:04.100 patients or
00:27:04.820 doctors or
00:27:05.780 other clinicians.
00:27:07.180 So really,
00:27:07.820 that was actually
00:27:08.440 the motive?
00:27:09.020 It wasn't to
00:27:09.840 be able to
00:27:10.700 aggregate
00:27:11.500 information better?
00:27:13.720 No,
00:27:14.240 no,
00:27:14.500 it was just
00:27:14.820 to have
00:27:15.160 really good
00:27:15.680 billing,
00:27:16.100 to not miss
00:27:16.620 things.
00:27:17.640 It's amazing,
00:27:19.040 and it's not
00:27:19.960 really ever
00:27:20.500 improved.
00:27:21.060 It's the most
00:27:21.520 clunky,
00:27:22.760 pathetic software,
00:27:24.400 and across all
00:27:25.140 the different
00:27:25.480 companies that
00:27:26.460 are in this
00:27:27.560 business.
00:27:28.780 And that
00:27:29.740 had led to
00:27:30.860 doctors becoming
00:27:31.780 data clerks,
00:27:32.940 and has been
00:27:34.580 one of the
00:27:35.860 most important
00:27:36.620 aspects of why
00:27:37.580 there's such
00:27:38.340 profound burnout
00:27:39.340 in the medical
00:27:40.840 field,
00:27:41.700 with more than
00:27:42.260 half having
00:27:43.620 expressed that
00:27:45.020 they are
00:27:45.560 burned out,
00:27:46.020 but also over
00:27:46.740 20%,
00:27:47.480 even with
00:27:48.360 clinical depression
00:27:49.280 and the highest
00:27:50.020 numbers of
00:27:51.060 suicides ever
00:27:52.060 in the medical
00:27:52.900 profession.
00:27:54.800 And is there
00:27:55.540 anyone tracking
00:27:56.280 just the
00:27:56.740 actual use
00:27:58.040 of doctors'
00:27:58.940 time with
00:27:59.500 respect to
00:28:00.340 this new
00:28:01.020 technology?
00:28:01.960 Has the
00:28:02.500 experience of
00:28:03.360 being a
00:28:03.640 doctor been
00:28:04.620 more of
00:28:05.360 one dealing
00:28:06.400 with records
00:28:07.740 and insurance
00:28:08.420 and all the
00:28:09.620 rest,
00:28:10.000 and year by
00:28:10.840 year?
00:28:11.700 Exactly.
00:28:12.500 So what's
00:28:12.900 happened,
00:28:13.400 I mean,
00:28:13.620 the most
00:28:13.880 recent study
00:28:14.640 was that
00:28:15.760 80% of
00:28:17.180 the time
00:28:17.900 that medical
00:28:19.500 residents
00:28:20.180 were spending
00:28:22.600 without any
00:28:23.980 contact to
00:28:24.640 patients because
00:28:25.360 they were
00:28:25.580 working on
00:28:26.160 electronic health
00:28:27.220 records and
00:28:27.960 administrative
00:28:28.460 tasks.
00:28:29.500 And all the
00:28:29.940 recent time
00:28:30.600 studies that
00:28:31.320 have really
00:28:31.640 delved into
00:28:32.240 this show
00:28:32.740 a two-to-one
00:28:33.940 or greater
00:28:34.500 ratio of
00:28:35.520 time away
00:28:36.180 from patients.
00:28:37.780 So this
00:28:39.080 electronic health
00:28:39.900 record, which
00:28:40.540 is unfortunately
00:28:41.900 the precursor
00:28:43.020 of bringing
00:28:43.700 the digital
00:28:44.680 world into
00:28:45.480 the medical
00:28:46.560 profession,
00:28:47.880 has backfired.
00:28:49.080 It's really
00:28:49.760 been a
00:28:50.800 serious hit
00:28:51.660 to the
00:28:52.640 care of
00:28:54.020 patients.
00:28:55.900 And what
00:28:56.580 about other
00:28:57.200 technology like
00:28:58.820 diagnostic
00:29:00.080 imaging?
00:29:01.160 And I
00:29:02.120 remember,
00:29:02.800 you know,
00:29:03.900 I've had a
00:29:04.680 few adventures
00:29:05.520 in cardiology,
00:29:06.440 which is your
00:29:06.920 wheelhouse,
00:29:08.060 you know,
00:29:08.300 like a CT
00:29:09.520 scan, you
00:29:09.940 know, calcium
00:29:10.320 score scan,
00:29:11.560 and it's,
00:29:13.300 again, I
00:29:14.100 have found the
00:29:14.940 way in which
00:29:15.460 this imaging
00:29:16.040 has been
00:29:16.800 dispensed to
00:29:17.460 me.
00:29:17.660 I mean,
00:29:17.760 you know,
00:29:18.000 I've done it
00:29:19.280 and, you
00:29:20.340 know, happily,
00:29:21.200 I guess, you
00:29:22.020 know, I would
00:29:23.020 probably be
00:29:23.440 telling a
00:29:23.860 different story
00:29:24.440 if something
00:29:25.140 scary and
00:29:26.280 actionable were
00:29:26.960 found and I
00:29:27.680 had felt my
00:29:28.360 life was saved
00:29:29.140 by it.
00:29:29.620 But the way
00:29:30.920 this was
00:29:32.760 dispensed to
00:29:33.400 me was
00:29:34.780 cavalier enough
00:29:36.980 and it was
00:29:37.280 just like, we
00:29:37.640 now have this
00:29:38.180 new tool, let's
00:29:38.820 use it.
00:29:39.700 And there was
00:29:40.880 nothing, and I
00:29:41.700 got to the end
00:29:42.280 of the process
00:29:43.240 and it was
00:29:43.540 really, there
00:29:44.260 was just, it
00:29:45.260 was pretty clear
00:29:45.880 that it just
00:29:47.160 didn't make
00:29:47.480 sense in my
00:29:48.380 case
00:29:49.280 to have done
00:29:50.080 this.
00:29:50.500 And so, how
00:29:51.700 do you view
00:29:52.160 just these
00:29:52.840 intrusions of
00:29:53.620 new machines
00:29:54.340 which could
00:29:55.260 be very
00:29:56.540 useful but
00:29:57.400 are either
00:29:58.600 used in
00:29:59.840 cases where
00:30:00.820 there's just
00:30:01.820 no reason
00:30:02.820 to use them
00:30:03.440 and I guess
00:30:04.440 we should also
00:30:04.920 talk about the
00:30:05.540 prospect of
00:30:06.260 type 1 errors
00:30:07.860 here where
00:30:08.260 people get
00:30:08.840 false positives
00:30:09.940 which then
00:30:10.620 they go chasing
00:30:11.440 with yet more
00:30:12.300 intrusive
00:30:13.260 procedures and
00:30:14.780 incur other
00:30:15.880 risks.
00:30:16.300 Exactly, for
00:30:18.160 that too.
00:30:18.980 The problem
00:30:19.820 here is we've
00:30:21.040 got a lot of
00:30:21.600 good technologies
00:30:22.680 but they're
00:30:24.080 misused, they're
00:30:25.240 overused.
00:30:26.280 So, the
00:30:26.820 example you
00:30:27.360 gave of a
00:30:29.200 calcium score
00:30:30.240 with a CT
00:30:31.280 scan to see
00:30:32.520 whether or not
00:30:33.020 you may have
00:30:33.680 coronary disease,
00:30:35.200 that test is
00:30:36.580 terribly overused.
00:30:37.880 I have never
00:30:38.680 ordered that test.
00:30:40.640 And mine was
00:30:41.120 worse, I had
00:30:41.560 an angiogram, I
00:30:42.400 didn't just have
00:30:42.880 the ordinary CT.
00:30:44.360 Yeah, so that
00:30:46.660 likely fits into
00:30:48.860 the so many
00:30:50.800 patients that I've
00:30:51.540 seen for second
00:30:52.260 opinions who
00:30:53.760 have become
00:30:54.760 disabled, who
00:30:56.740 have become
00:30:57.500 adversely affected
00:30:58.940 by the results
00:30:59.760 of their calcium
00:31:00.760 score even though
00:31:01.620 they have no
00:31:02.140 symptoms or others
00:31:03.880 that have been
00:31:04.260 told their lives
00:31:04.940 have been saved
00:31:05.580 because they are
00:31:06.200 whisked away from
00:31:07.220 the CAT scan to
00:31:08.500 then have an
00:31:08.980 angiogram and
00:31:09.760 stents or even
00:31:10.840 a bypass operation.
00:31:12.680 So, you know,
00:31:13.560 cardiac cripples
00:31:14.340 have been a result
00:31:15.820 of some of these
00:31:16.520 scans with patients
00:31:18.100 without any
00:31:18.820 symptoms and it's
00:31:19.920 really unsettling.
00:31:20.880 So, this is
00:31:22.020 an exemplar of
00:31:23.540 so many tests
00:31:24.680 that we have
00:31:25.160 today that they
00:31:26.040 can be helpful
00:31:26.660 in certain
00:31:27.260 individuals but
00:31:28.540 they can be
00:31:29.080 very harmful
00:31:30.300 as well.
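
[Editor's note: To make the false-positive concern raised here concrete, the following is a minimal sketch of the base-rate arithmetic behind it. The prevalence, sensitivity, and specificity figures are illustrative assumptions, not numbers from the conversation; the point is only that when a screening test is applied to low-risk, asymptomatic people, most positive results can be false alarms, each of which can invite riskier follow-up.]

```python
# Illustrative base-rate arithmetic for screening an asymptomatic, low-risk population.
# All numbers are assumptions chosen for demonstration, not properties of any real test.

def positive_predictive_value(prevalence: float, sensitivity: float, specificity: float) -> float:
    """Probability that a person who tests positive actually has the disease (Bayes' rule)."""
    true_positives = prevalence * sensitivity
    false_positives = (1.0 - prevalence) * (1.0 - specificity)
    return true_positives / (true_positives + false_positives)

if __name__ == "__main__":
    prevalence = 0.02     # assume 2% of the screened group truly has significant disease
    sensitivity = 0.90    # assume the test catches 90% of true cases
    specificity = 0.85    # assume 15% of healthy people still test "positive"

    ppv = positive_predictive_value(prevalence, sensitivity, specificity)
    print(f"Chance a positive result is a true positive: {ppv:.1%}")
    # With these assumptions, roughly 9 out of 10 positives are false alarms,
    # and each false alarm can trigger more invasive follow-up procedures.
```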
00:31:32.080 And these
00:31:33.000 particular harms,
00:31:33.860 so I guess there's
00:31:34.460 two problems here.
00:31:35.380 We have the
00:31:36.260 underuse or lack
00:31:37.780 of availability
00:31:38.540 of medicine
00:31:39.860 to people who
00:31:40.660 really need it
00:31:41.480 and who have
00:31:41.840 substandard care
00:31:43.080 in a first world
00:31:44.520 society, our own,
00:31:46.380 that doesn't compare
00:31:47.220 favorably to the
00:31:48.680 rest of the
00:31:49.400 developed world.
00:31:50.860 But then here
00:31:51.680 we're talking about
00:31:52.560 the high class
00:31:54.140 problem of
00:31:55.160 having a more
00:31:57.380 consumer relationship
00:31:58.680 to advanced
00:32:00.400 medicine where
00:32:01.120 you have access
00:32:02.180 to what are
00:32:04.620 ostensibly the
00:32:05.240 best doctors,
00:32:06.120 the best hospitals,
00:32:07.000 the best
00:32:07.940 information,
00:32:08.900 the new
00:32:09.240 scanners,
00:32:10.540 and although
00:32:11.880 even there,
00:32:12.640 I mean,
00:32:12.760 just to give you
00:32:13.740 a reference
00:32:14.240 point for this
00:32:14.980 angiogram,
00:32:15.820 so I went to
00:32:16.860 a highly
00:32:18.260 regarded
00:32:18.760 cardiologist
00:32:19.760 on the
00:32:20.700 assumption that
00:32:21.420 whatever scanner
00:32:22.440 he would be
00:32:22.900 putting me in
00:32:23.500 would be the
00:32:24.880 latest and
00:32:25.660 lowest dose
00:32:26.440 of radiation
00:32:27.500 scanner.
00:32:28.720 And then I
00:32:29.080 get the scan
00:32:30.620 and I see
00:32:31.080 the amount
00:32:32.060 of radiation
00:32:32.540 delivered and
00:32:33.620 I just kind
00:32:34.180 of check this
00:32:34.760 with a friend
00:32:35.820 who's a
00:32:36.120 physician who
00:32:37.000 has access
00:32:38.100 to similar
00:32:39.300 doctors and
00:32:40.120 he said,
00:32:41.160 yeah,
00:32:41.360 if I had
00:32:42.340 ordered the
00:32:42.700 scan,
00:32:43.080 you would
00:32:43.440 have gotten
00:32:43.860 one-third
00:32:44.800 the amount
00:32:45.660 of dosage
00:32:47.380 there,
00:32:48.300 so it's like
00:32:48.640 I'm not quite
00:32:49.080 sure why
00:32:49.760 you got put
00:32:50.940 in that
00:32:51.200 scanner.
00:32:52.140 And just the
00:32:52.500 fact that
00:32:52.760 there's that
00:32:53.180 kind of
00:32:53.480 variance,
00:32:53.940 I'm not
00:32:55.040 especially
00:32:55.840 paranoid about
00:32:56.500 this,
00:32:56.800 I understand
00:32:57.440 that this
00:32:58.220 doesn't raise
00:32:58.700 my cancer
00:32:59.460 risk all
00:32:59.920 that much,
00:33:00.360 but the
00:33:00.980 fact that
00:33:01.620 in the
00:33:02.340 most
00:33:02.580 prestigious
00:33:03.220 networked
00:33:04.380 circles,
00:33:05.280 there could
00:33:05.600 be that
00:33:05.900 kind of
00:33:06.200 variance,
00:33:06.700 it's just
00:33:07.860 bizarre to
00:33:08.360 me.
00:33:09.400 Well,
00:33:09.780 you've just
00:33:10.360 touched on
00:33:10.940 something as
00:33:11.540 a pet peeve
00:33:12.180 of mine,
00:33:12.700 which is why
00:33:13.780 don't we tell
00:33:14.400 patients when
00:33:15.140 we order a
00:33:16.540 test or say
00:33:17.540 they should
00:33:17.880 have such a
00:33:18.500 test that
00:33:19.320 uses ionizing
00:33:20.280 radiation about
00:33:21.760 how much
00:33:22.700 radiation they
00:33:23.420 would be
00:33:23.620 exposed to.
00:33:24.340 That is,
00:33:24.920 we don't have
00:33:25.520 to use the
00:33:26.100 millisieverts
00:33:26.820 units,
00:33:27.980 we could say
00:33:28.740 it's equivalent
00:33:29.380 to how many
00:33:30.020 chest x-rays.
00:33:31.600 All right,
00:33:31.780 so this physician
00:33:33.160 who I will not
00:33:33.700 name,
00:33:34.400 but whose name
00:33:35.120 would be known
00:33:35.780 to you,
00:33:36.840 as part of
00:33:37.500 this pattern,
00:33:38.100 I asked the
00:33:39.020 perfunctory
00:33:39.980 skeptical questions
00:33:41.400 about whether
00:33:41.840 this scan was
00:33:42.720 necessary and
00:33:43.380 what my
00:33:44.820 dosage would
00:33:45.880 be,
00:33:46.480 and he said,
00:33:46.920 well,
00:33:47.080 it's analogous
00:33:47.940 to you taking
00:33:49.220 10 flights to
00:33:50.420 Hong Kong this
00:33:51.460 year.
00:33:52.300 Has someone told
00:33:53.480 you that you
00:33:54.100 shouldn't go to
00:33:54.920 Hong Kong 10
00:33:55.680 times this year?
00:33:57.060 And I said,
00:33:58.580 no, no,
00:33:59.220 that sounds fine.
00:34:00.120 I mean,
00:34:00.280 it's a lot of
00:34:01.000 Hong Kong,
00:34:01.540 but I can do
00:34:02.560 that.
00:34:02.800 But then when
00:34:03.460 I actually saw
00:34:04.920 my dosage and
00:34:05.780 did a little
00:34:06.400 arithmetic,
00:34:07.120 it was more
00:34:07.860 like 150 to
00:34:10.160 200 flights to
00:34:11.160 Hong Kong this
00:34:11.940 year.
00:34:12.420 Right.
00:34:12.660 Right.
00:34:12.920 So it's just,
00:34:14.520 again, I guess I
00:34:16.180 could be an
00:34:16.580 airline pilot this
00:34:17.420 year and it's
00:34:17.840 okay, but still
00:34:19.260 it's just to have
00:34:20.260 that wrong by
00:34:21.800 orders of
00:34:22.960 magnitude,
00:34:23.380 it's just
00:34:24.580 bizarre.
00:34:25.700 Well, and also
00:34:26.620 if you take it
00:34:28.480 by number of
00:34:29.880 chest x-rays,
00:34:30.500 when you tell a
00:34:31.100 patient that's
00:34:31.700 like 2,000
00:34:32.660 chest x-rays,
00:34:33.700 they say,
00:34:34.300 no, no,
00:34:34.760 I'm not doing
00:34:35.600 that.
00:34:36.240 Right.
00:34:36.440 So if we just
00:34:37.620 were real about,
00:34:39.580 and the other
00:34:40.060 thing you mentioned
00:34:41.020 I think has to be
00:34:42.220 underscored as well
00:34:43.400 is that there's so
00:34:44.640 much variability
00:34:45.440 in the exposure
00:34:46.880 of the radiation.
00:34:48.680 So we have,
00:34:50.480 again, this is out
00:34:51.220 of paternalism,
00:34:52.200 there's,
00:34:53.180 you're rare because
00:34:54.400 you actually ask
00:34:55.340 your doctor,
00:34:56.420 but most patients
00:34:57.440 just go and have
00:34:58.600 the scan.
00:34:59.720 Right.
00:35:00.480 And so this is
00:35:01.760 something that's
00:35:02.300 just not right
00:35:03.100 because this is
00:35:04.280 information that
00:35:05.180 everybody should be
00:35:06.080 entitled to and
00:35:07.140 they should be
00:35:07.780 part of the
00:35:08.480 decision of whether
00:35:09.300 they want to
00:35:09.960 accept that type
00:35:10.940 of exposure to
00:35:11.800 radiation.
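
[Editor's note: As an aside on the arithmetic in this exchange, converting an imaging dose into "equivalent chest x-rays" or "equivalent long-haul flights" is just a ratio of effective doses. The sketch below uses rough reference values, on the order of 0.1 mSv per chest x-ray and 0.07 mSv of cosmic radiation per long-haul flight, as illustrative assumptions rather than figures from the conversation; the scan dose itself is a hypothetical input, and real doses vary widely by scanner and protocol.]

```python
# Rough dose-equivalence arithmetic of the kind discussed above.
# Reference doses are illustrative assumptions; actual values vary by scanner, protocol, and route.

CHEST_XRAY_MSV = 0.1         # assumed effective dose of one chest x-ray, in millisieverts
LONG_HAUL_FLIGHT_MSV = 0.07  # assumed cosmic-ray dose of one long-haul flight, in millisieverts

def dose_equivalents(scan_dose_msv: float) -> dict:
    """Express a scan's effective dose as counts of familiar exposures."""
    return {
        "chest_xrays": scan_dose_msv / CHEST_XRAY_MSV,
        "long_haul_flights": scan_dose_msv / LONG_HAUL_FLIGHT_MSV,
    }

if __name__ == "__main__":
    scan_dose_msv = 12.0  # hypothetical coronary CT angiogram dose, for illustration only
    eq = dose_equivalents(scan_dose_msv)
    print(f"{scan_dose_msv} mSv is roughly {eq['chest_xrays']:.0f} chest x-rays "
          f"or {eq['long_haul_flights']:.0f} long-haul flights.")
```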
00:35:12.900 Okay, so let's
00:35:13.860 bring in the
00:35:14.540 robots.
00:35:16.540 How did you get
00:35:17.940 interested in AI?
00:35:19.240 When do you date
00:35:20.300 your awareness of
00:35:21.940 it as a possibly
00:35:23.240 relevant technology?
00:35:24.400 Well, you know,
00:35:26.580 I had been working
00:35:27.360 in the prior
00:35:28.300 times on
00:35:29.560 digital medicine.
00:35:31.480 That was The
00:35:31.880 Creative Destruction
00:35:32.540 of Medicine.
00:35:33.340 If you'd like to
00:35:37.140 continue listening
00:35:37.740 to this conversation,
00:35:39.080 you'll need to
00:35:39.580 subscribe at
00:35:40.260 samharris.org.
00:35:41.840 Once you do,
00:35:42.480 you'll get access
00:35:42.980 to all full-length
00:35:43.880 episodes of the
00:35:44.540 Making Sense podcast,
00:35:45.920 along with other
00:35:46.500 subscriber-only content,
00:35:48.260 including bonus
00:35:48.960 episodes and AMAs
00:35:50.640 and the conversations
00:35:51.680 I've been having
00:35:52.240 on the Waking Up app.
00:35:53.120 The Making Sense
00:35:54.320 podcast is ad-free
00:35:55.560 and relies entirely
00:35:57.000 on listener support.
00:35:58.380 And you can
00:35:58.840 subscribe now
00:35:59.620 at samharris.org.