The Saad Truth with Dr. Saad - October 29, 2025


Cardiologist Dr. Aseem Malhotra - On Evidence-Based Medicine (The Saad Truth with Dr. Saad_906)


Episode Stats

Length

1 hour and 3 minutes

Words per Minute

190.8

Word Count

12,027

Sentence Count

644

Misogynist Sentences

1

Hate Speech Sentences

7


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Dr. Aseem Malhotra, academic, clinician, and close ally of Robert F. Kennedy Jr., is back on the show to talk about his work with the MAHA movement. He talks about the movement's early days, his first visit to Washington, D.C., and what he's been up to since then.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Everybody, this is Gad Saad.
00:00:01.900 Rarely do I have repeat guests that can pass the threshold of excellence to be re-invited again.
00:00:09.260 But my current guest, Dr. Aseem Malhotra, academic, clinician, close ally of the MAHA movement,
00:00:16.580 and Robert F. Kennedy Jr., is back on the show.
00:00:19.700 How are you doing, Aseem?
00:00:21.040 Gad, it's always a pleasure and honor to be back.
00:00:23.980 Oh, you're very kind.
00:00:24.820 All right, I went back just to do my homework to check when was that first time that you came.
00:00:31.100 I don't know if you checked also, but if you haven't, just off the top of your head to see if you've got temporal accuracy.
00:00:38.980 Do you roughly know when it was?
00:00:40.620 I think it was April, May 2023.
00:00:42.920 So it was after Joe Rogan, and I think I was in Santa Eulalia, Ibiza.
00:00:47.280 Okay, you're exactly right.
00:00:49.300 May 15th, 2023, which actually happens to be my dad's birthday, May 15th, and he's still around at 95.
00:00:56.920 Oh, wow.
00:00:57.820 Yeah.
00:00:58.460 So, all right.
00:00:59.860 By the way, I've never done a podcast on my birthday, so this is the first podcast I'm doing on my birthday.
00:01:04.820 This is your birthday?
00:01:05.920 Yeah.
00:01:07.300 Of all the things you could be doing, you're sitting with a foreign professor on your birthday?
00:01:14.120 No, this is a pleasure for me.
00:01:16.720 Thank you.
00:01:17.220 Happy birthday.
00:01:17.940 In Arabic, let me say it in Arabic, and then I'll translate.
00:01:21.020 In Arabic, when it's someone's birthday, you say, which means, like, may you reach to 100, looking to 100.
00:01:29.920 That's the plan.
00:01:30.680 Thank you.
00:01:31.020 I appreciate that.
00:01:31.860 All right.
00:01:33.000 So, May 2023, we talked about all kinds of things.
00:01:36.560 We talked about vaccines.
00:01:37.840 We talked about should we use statins or not.
00:01:39.960 I went back to check what we discussed.
00:01:41.680 We discussed all sorts of diets and what are the good ones, the bad ones.
00:01:45.720 Since then, till today, fill us in on what's been going on in the world of Dr. Malhotra.
00:01:52.500 Wow.
00:01:53.000 Quite a lot, Gad, to be honest.
00:01:54.940 I think the most significant, how should I put it, events in my life are probably very much linked to the MAHA movement.
00:02:02.820 Because, of course, when we last spoke, at that stage, you know, there was no discussion of President Trump and Robert Kennedy Jr. joining forces.
00:02:12.880 And then, you know, since we spoke, of course, I initially then helped when Robert Kennedy Jr. was in his presidential campaign.
00:02:20.400 I was very much helping him with that.
00:02:23.240 Went to organize an event for him in San Jose, California, towards the end of 2023.
00:02:29.520 And, yeah, and then obviously so many things have happened, haven't they?
00:02:35.860 We've been pushing this, you know, this obsession of mine and obsession of a lot of people because the major issue is around getting more transparency on the COVID vaccine, but in particular asking for it to be halted.
00:02:46.640 And it culminated a couple of months ago with Robert Kennedy Jr. through HHS basically stopping half a billion dollars worth of investments in the mRNA technology for vaccines specifically, which I think is a big victory and I think sends a message.
00:03:03.020 So I think there's been a lot of progress on that front.
00:03:04.940 I still think there is a big issue with a lack of sort of accountability in particular to address all the people who have been vaccine injured.
00:03:13.380 And also, I think, Gad, the other thing to mention as well, I don't want this whole discussion to be about the COVID vaccine with lots to talk about, but it's a symptom of a bigger system failure, really, is that the discussions I've had with very senior people in the CDC, and this is my concern as well, is that we are now in a situation where more evidence is emerging of the long-term harms of these COVID vaccines.
00:03:41.260 And there are question marks, certainly around whether or not it may be a driver of cancer.
00:03:47.860 And if it is, then what we want to do is how do we identify people at risk, and then what can we do about it?
00:03:53.480 So these are the discussions going on behind the scenes right now with very, very senior people in the Trump administration.
00:04:00.260 So two points.
00:04:01.680 Number one, for those of you who don't know, I'm not explaining this to you, but for our listeners, a meta-analysis is a statistical technique that allows us to take a bunch of studies that were conducted by different people, put them together because they are passing certain inclusion metrics,
00:04:19.040 and then imagine them as sort of one mega study to kind of get a bottom line, you know, snapshot picture of where we are in a particular area of research.
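[Editor's illustration] The pooling Saad describes, combining per-study estimates into one bottom-line number, is most commonly done with inverse-variance weighting. Below is a minimal fixed-effect sketch with entirely made-up numbers; it is a generic illustration of the technique, not an analysis of any vaccine data discussed in this episode.

```python
# Fixed-effect (inverse-variance) meta-analysis sketch.
# Each study contributes an effect estimate and its standard error;
# the pooled estimate weights each study by 1 / SE^2.

def pooled_effect(studies):
    """studies: list of (effect, standard_error) tuples."""
    weights = [1.0 / se ** 2 for _, se in studies]
    pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5  # SE of the pooled estimate
    return pooled, pooled_se

# Three hypothetical studies: log risk ratios with standard errors
# (values invented purely for illustration).
studies = [(0.40, 0.20), (0.25, 0.10), (0.10, 0.30)]
effect, se = pooled_effect(studies)
```

Studies with smaller standard errors get proportionally more weight, which is why a single large trial can dominate the pooled "snapshot" a meta-analysis produces.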
00:04:29.840 So since we last spoke till today, have there been one or more meta-analyses that would support the claim that the vaccines are harmful, or is it still debatable?
That's a great question. So they haven't done a specific meta-analysis, Gad, but Steven Hatfill, who's one of the chief medical officers within HHS, who was involved in the analyses that they did to make the decision of stopping the investments moving forward in mRNA technology for vaccines,
00:05:00.300 he described something called a mega, which I'm not familiar with, but he said that we did a mega-analysis.
00:05:05.320 But in other words, I think it's another way of saying we looked at the totality of data to date, and they concluded that very clearly it was more harmful than beneficial.
00:05:13.900 And they said that, you know, they discovered, you know, through hundreds of studies of harm, but also one of the mechanisms that concern them is that the COVID mRNA vaccines were interfering with tumor suppressor genes.
00:05:24.880 So that means there is biological plausibility that they could, let me be very careful with my wording here, precipitate cancer.
00:05:32.160 And of course, some data is emerging, observational studies, one from Italy, one from South Korea, suggesting a strong association with vaccinated people and many different types of cancers, in particular in the young.
00:05:44.160 And that's kind of what we are seeing anecdotally as clinicians.
00:05:47.460 We're seeing, you know, I've had friends, people I know in my extended social circle, getting things like colon cancer in the early 40s, which is just very unusual, people with no real family history.
00:05:57.900 So I think that in my view, the weight of the evidence suggests that it probably has some carcinogenic potential, but just how strong that association is, and what the severity of that risk is, we haven't fully understood yet, Gad.
00:06:17.640 But I think that in terms of just to break it down very simply without overcomplicating it, and people can look at this, I had a letter published in the BMJ recently, and that was after a very interesting series of events that happened in the media in relation to me speaking at the Reform UK Party Conference, right?
00:06:34.940 And I'll come back to that. But in that, the BMJ, after much pressure and their own internal vetting process, and it was quite extraordinary, they published, you know, a 600-word piece for me as a letter.
00:06:47.140 And in that, I wrote a few key points, which is also something coming to your expertise, which I want to ask you about.
00:06:53.460 But what I said was, if you go back to square one, which was the original randomized trials, right, which for people listening is the highest quality level evidence you can have, right?
00:07:01.980 This is how drug approval occurs. That reanalysis of Pfizer's and Moderna's trials by Joseph Fraiman, Peter Doshi, Sander Greenland, very eminent data scientists, independent of drug industry influence,
00:07:14.100 they found from the beginning, from square one, you were two to four times more likely to suffer a serious adverse event from the COVID mRNA vaccines, Pfizer and Moderna in particular, they looked at both those trials,
00:07:26.800 than you were to be hospitalized with COVID. And that serious adverse event rate was about 1 in 800. And that's just in the short term.
00:07:33.300 Yeah, this is just at two months. And of course, since then, we've got a multitude of studies, whether it's observational studies, whether it's autopsy studies, whether it's, you know, clinical data,
00:07:42.520 whether it's plausible biological mechanisms of harm, you know, basically all going in the wrong direction, saying this is actually quite, quite bad.
00:07:51.160 So for me, as someone who has had to U-turn, having taken the vaccine, having promoted it on TV for high-risk people, you know, now looking back with hindsight, looking at was it going to be overall harmful compared to beneficial?
00:08:04.800 When you look at the all age groups that were coerced to have it, if you like, it's a slam dunk, as far as I'm concerned, right?
00:08:12.080 And also, if you look at the issue of what are the benefits, and you break down the benefits in similar way as a 1 in 800 harm, for the highest risk groups, Gad, right now, which is people over 90, who are going to get the best benefit,
00:08:25.200 and the only country in the world that has released this data, comparing vaccinated versus unvaccinated by age group, not looking at confounders, of course, right?
00:08:33.040 But let's just look at those, that data alone, you have to vaccinate 7,000 people over 90, by the end of 2024, for six months protection, because that's the longest it will probably give you protection, right?
00:08:45.780 For people over 90.
00:08:47.340 So you've got, and then once you get to any younger age groups, Gad, people under 80, 70, you're talking about tens of thousands, hundreds of thousands, and millions of people, especially young, healthy adults, who need to be vaccinated to prevent one person being hospitalized with COVID, right?
00:08:59.980 So it's really a no-brainer, as far as I'm concerned. So the question is, why is this not being accepted? And this is where I was going to ask you exactly that, because I'm not familiar with all of the intricacies of those particular data sets.
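[Editor's illustration] The benefit figures quoted above ("vaccinate 7,000 people over 90 to prevent one hospitalization") are number-needed-to-vaccinate calculations: the reciprocal of the absolute risk reduction between unvaccinated and vaccinated groups. The sketch below uses purely hypothetical rates, not the dataset Malhotra cites.

```python
# Number needed to treat (or vaccinate): NNT = 1 / absolute risk reduction.

def number_needed_to_treat(event_rate_untreated, event_rate_treated):
    arr = event_rate_untreated - event_rate_treated  # absolute risk reduction
    if arr <= 0:
        raise ValueError("no absolute benefit: NNT is undefined")
    return 1.0 / arr

# Hypothetical: hospitalization risk falls from 0.50% to 0.25% over the
# follow-up window, so 400 people are treated to prevent one hospitalization.
nnv = number_needed_to_treat(0.0050, 0.0025)
```

The same reciprocal logic applies to harms: a serious-adverse-event rate of 1 in 800 is a "number needed to harm" of 800, which is how the two sides of a risk-benefit comparison end up on a common scale.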
00:09:16.040 So if I hear you, and I presume that everything you're saying has 100% veracity, I'm saying, okay, well, that sounds good to me.
00:09:22.920 So if someone who's sitting instead of me, but who is a detractor of your position is listening to this, what are they rebutting?
00:09:32.520 Well, they're not really rebutting. That's the problem, Gad. That's why I think there's a problem. They're ignoring, okay?
00:09:38.180 So there's ignoring, and then this is what's happened in the last few weeks, which I think your listeners will be fascinated by.
00:09:42.900 So, and we'll delve into, so ultimately, for me, and this is what I've learned in my own journey in the last few years, losing both my parents and almost going through my own spiritual journey, is what I've learned more than anything else, Gad, during the pandemic years, is that the barriers to the truth are primarily psychological.
00:10:04.980 They are not intellectual.
00:10:06.000 Now we're entering my world.
00:10:07.560 Right. And you get that. You understand that probably better than anyone. So, so what I've noticed, and it's, it's, sometimes it's, it's, we're humans, we're emotional beings, we react, right?
00:10:07.560 So I'm getting all this abuse and backlash, okay, to the stuff I've been doing over the years, but nothing compared to this: the vitriolic attacks towards me, I think, culminated at their peak so far with me speaking at Reform UK.
00:10:28.960 So Reform UK, just for people listening, is one of the political parties in the UK right now.
00:10:32.620 They are, at the moment, if there was a general election, which isn't going to happen for four years, they would sweep the board, they would be in power, okay?
00:10:40.480 They asked me, a consultant cardiologist, which is very unusual, by the way, for a party conference, right?
00:10:45.300 It's like, like a convention, I think you, in Canada or US, you call it like a democratic convention or republican convention, that kind of thing.
00:10:52.380 They asked me to be a keynote speaker, right, on the main stage.
00:10:54.780 So I had 15 minutes, everyone had 15, 20 minutes max, and I went over a little bit.
00:11:00.440 And Nigel Farage, who is the leader of Reform UK, basically wanted me essentially to give an update of what's been going on in the US and my role there.
00:11:09.240 But the title of my talk is, How Do We Make Britain Healthy Again?
00:11:12.460 You have the, you know, the MAHA movement in the US, so it's how to make Britain healthy again.
00:11:15.720 And of course, I address what I feel is the root cause of the problem, which is commercial distortions of the scientific evidence, okay?
00:11:23.820 That was, that for me, it is a, if not the major root cause of the problem.
00:11:28.520 And of course, to illustrate that fact was to present the data I've just told you on the COVID vaccine.
00:11:35.820 But also what I did, and of course, you know, one can predict this could potentially have caused an uproar, and it did, is that I quoted, and this is an interesting, this is where the, you're going to love this bit.
00:11:47.520 I quoted one of Britain's most eminent, most published oncologists, who's also an immunologist and a vaccine developer, an expert in vaccine development.
00:11:57.200 His name is Angus Dalgleish, Professor Angus Dalgleish.
00:12:00.480 Just to give you an idea of just how brilliant this man is, he was actually behind discovering that CD4 cells were involved in HIV and AIDS.
00:12:09.560 Okay, so he has that background, okay?
00:12:12.000 He's not just a rookie.
00:12:13.260 He's a guy with a lot of experience.
00:12:15.520 And he asked me when I was going to give this talk, because people have intuitively, who are kind of a bit more awake, if you like, to this situation with a potential cancer link.
00:12:26.420 And we had four members within a two-year period, four members of one family, probably the most famous family in the world, the royal family of the UK, okay?
00:12:34.380 Getting cancer, right?
00:12:36.420 So it's a little bit strange, first of all.
00:12:39.280 And I think anyone who understands any basic science will think, clearly there's been some kind of exposure, something, right, that has caused within a two-year period of different age groups to all get cancer in a family that doesn't get cancer, okay?
00:12:52.160 So, and then when you know that there is a plausible mechanism, and in fact, Angus Dalgleish is much stronger than I am, he's unequivocal in saying that the COVID vaccine causes cancer, especially if you have boosters, through an immunosuppressive effect.
00:13:06.300 And in fact, there are five or six different plausible mechanisms that can do this.
00:13:09.860 So he asked me to say, and I was very careful with my wording, you're a scientist, you get this.
00:13:13.660 I said on stage that one of Britain's most eminent oncologists, Professor Angus Dalgleish, has asked me to tell you today that he believes that the COVID vaccines have likely played a significant role in cancers of members of the royal family, okay?
00:13:31.820 Now, all hell broke loose at that point, all the British press were there, the BBC, the Guardian, you know, and then suddenly it turns into, you know, cardiologist links, you know, they didn't mention Dalgleish, which is interesting.
00:13:50.400 They put it all on me, okay, first and foremost, right?
00:13:52.320 Well, because they could delegitimize you because you are a cardiologist, you're out of your lane, you know nothing about viruses or about oncology, so shut up and stay in your lane, correct?
00:14:04.640 Yeah, yeah, absolutely, it could be.
00:14:05.860 But again, you know, it's about accurate reporting, right?
00:14:07.840 And the question is, why would they also report in the way that they did, which was basically slamming me?
00:14:11.600 And then it was just hatchet jobs in the media for the next couple of weeks, which even culminated, okay?
00:14:16.680 And this is where things get kind of, I don't know whether it's strange or, again, it comes back to the issue of the psychological barriers to the truth, is where, you know, the situation happens where people are then basically using a straw man argument.
00:14:31.420 So they're painting an extreme version of yourself, picture of yourself, and attacking that and ignoring the nuance, right?
00:14:38.320 And what happens next is, the British Prime Minister, in a full House, in Parliament, like five days later, obviously, I'm now part of a political football because they want to attack reform, they're threatened by Reform UK.
00:14:53.100 So I'm part of that now, even though I'm not a member of reform, even though I didn't write their health policy, I was there as a guest speaker, okay?
00:14:58.820 And Gad, on that point, it's really important, again, to emphasize this, because there's so much polarization of health, and it becomes so politicized.
00:15:07.280 Just, I was, and I think, I hope, maybe, I hope you would agree with this, I think you will, but let's just put it out there, right?
00:15:14.820 My duty, my ethical code as a doctor, okay, is to treat people to the best of my ability, regardless of their race, their religion, their sex, okay, their political affiliation, right?
00:15:29.180 So if people were attacking me for, I said, why are you speaking?
00:15:31.080 I said, listen, you know, Malcolm X said, I'm for the truth no matter who tells it.
00:15:35.380 I remember before I went and even Tucker Carlson, you know, at the end of 2022, somebody I respect, very, very famous, and I respect this guy, I'm not going to name him, right?
00:15:45.840 He's a Jewish doctor, very famous in America, I've been friends with him for a long time, and he messaged me beforehand, said, Aseem, you know, I'm disappointed that you are going to be interviewed by a white supremacist, okay?
00:15:56.260 I have a story about that, finish your story, but I have a story about Tucker.
00:15:59.760 But the point is, I went there, because I'm a doctor, I feel this is a big issue, they asked me to speak, and anyway, so I said what I said, Prime Minister gets up in Parliament, now, there's one thing, taking what I said, and arguing against it, and forming an opinion, but they didn't even do that.
00:16:14.020 But he got up in Parliament and basically said, to attack Reform, that you had a man, essentially, who linked, right, in brackets, all vaccines to cancer.
00:16:24.560 So we've gone from me suggesting there may be a role, with Angus Dalgleish suggesting the COVID vaccine, to now me saying all vaccines cause cancer.
00:16:31.380 This is just so, so it's ridiculous, and he misled Parliament, right, unwittingly, I think.
00:16:35.300 I wrote this in the BMJ, so it's in the BMJ, in my rapid response, and I was like, this is just insane.
00:16:41.140 But I also thought this is great, this is great material, because actually, what's happening in their attacks, Gad, without addressing the nuance, right, which is so important, right, we're talking about trying to be as precise as possible in science, right, as we can, okay, is that the fact that they are doing this, for me, it tells me they don't really have any strong argument against what I'm saying.
00:17:02.640 They can't form a robust argument, and therefore, let's misrepresent a scene, let's call him an anti-vaxxer, but Gad, this is the point I'm going to come on to right now.
00:17:12.080 Psychologically, this has a very powerful negative effect in the media, okay?
00:17:16.160 I don't know if I talked to you about this before, but there's a great documentary called King in the Wilderness about the last two years of Martin Luther King's life, and many people don't know that he became one of the most hated men in America.
00:17:28.940 By the way, this is a guy that already had the Nobel Peace Prize, right?
00:17:33.080 Why did he become one of the most hated men in America?
00:17:35.000 He was one of the first to stand up against the Vietnam War.
00:17:38.160 And I remember there's a scene in the documentary where he gets up and he says, people are damning me for saying we shouldn't be killing little brown children in Vietnam.
00:17:46.260 And the press turned on him, right?
00:17:48.700 And he was a tough guy, he could handle it, right?
00:17:50.640 He could handle the press.
00:17:51.580 And they accused him, they said he bordered on treason, all this kind of stuff.
00:17:54.340 But what sent him to clinical depression, right?
00:17:56.780 He went into clinical depression because the effect of the press was so strong that it affected his friends.
00:18:02.300 And his friends started, his closest friends started turning on him.
00:18:05.320 Now, I'm not going to name this person.
00:18:07.040 I posted on my social media in a very, I felt, very nuanced way, saying, one, I'm not a member of reform.
00:18:13.000 This is a big issue about the COVID vaccine, et cetera, et cetera.
00:18:15.980 This long post.
00:18:16.800 And a family friend, okay, this is going back to childhood, this is an Indian community, he's a doctor, someone who's looked up to me, you know, he's a younger kid, right?
00:18:26.260 Who I've known since I was literally, whatever, five years old or whatever else.
00:18:30.000 And his immediate response, even almost without reading whatever, he basically said, you are, you know, because I said, you know, because part of the story is that this instigated the General Medical Council.
00:18:43.080 There were calls, including from the Secretary for Health, for the General Medical Council to basically strip my medical license, okay?
00:18:49.260 This is how extreme it's got, right?
00:18:50.540 So this, for me, is, you know, potentially a bit stressful, but all on a misrepresentation of what I said.
00:18:55.420 And this guy basically wrote, was to the effect of, you deserve what's coming to you.
00:19:00.180 Yikes.
00:19:01.080 Well, just to, to not, I mean, not that misery loves company, but just to show you that you're hardly the only one to go through this.
00:19:09.980 I want to mention the story about Tucker Carlson, about another friend.
00:19:13.360 So after I had appeared, so I've appeared many times on Tucker Carlson's show when he had a television show with Fox, but then he had invited me to his, maybe that's where you went also, to his long form podcast.
00:19:24.400 Is that what you did as well?
00:19:26.160 It was actually, no, it was before his podcast.
00:19:28.000 It was when he was still with Fox.
00:19:29.240 Ah, okay.
00:19:29.720 And it was a one hour with Fox Nation.
00:19:31.280 Okay.
00:19:31.580 So I had gone to his studio in Florida and, you know, we had met in person.
00:19:39.220 He met my family.
00:19:40.220 He was very warm, very gracious with our young kids and so on.
00:19:42.900 So when I returned back to the hotel, I posted a tweet saying, you know, and tagged Tucker.
00:19:49.940 Hey, Tucker, it was great, you know, seeing you again and thank you for your, you know, gracious warmth with my family.
00:19:56.740 I received a public tweet reply from my cousin.
00:20:04.600 Now, this is not just a Jewish doctor that you were friends with or the kid who you grew up with or whatever.
00:20:10.400 This is a guy who was my closest friend growing up in Lebanon who went through the Lebanese civil war with me.
00:20:17.560 We hid under the bed when the rockets were coming.
00:20:21.380 So you would think there is an unbreakable bond between us.
00:20:25.080 Not only was he my closest childhood friend in Lebanon, but he's my cousin.
00:20:30.400 So he writes to me something to the, I don't have the exact tweet in front of me, but have you no shame?
00:20:36.020 And, you know, like how could you associate with such a person publicly, not even privately?
00:20:42.360 So that's story one.
00:20:43.420 Story two, I was speaking in Savannah, Georgia this past February, and I knew that an old friend
00:20:52.680 from our days at Cornell, a friend that I've loved.
We haven't spent much time together, but, you know, one of the loveliest, warmest, kindest
00:21:00.820 guys you could ever hope to meet.
00:21:02.260 I knew that he lived in the area at this point.
00:21:04.680 So I reached out to him and I said, hey, I'd love to get together with you if you're around.
00:21:10.080 I'm in Savannah.
00:21:11.040 Maybe we could hook up and hang out.
00:21:13.080 He writes me a long thing, basically saying, in light of the fact that you seem to be harboring
00:21:19.400 any inkling of a positive response to Donald Trump, I'm afraid that I could no longer pursue
00:21:26.940 my friendship with you.
00:21:28.820 So a friendship of 30 plus years is gone.
00:21:33.260 By the way, I'm Canadian, so it's not as though I voted for Donald Trump.
00:21:36.680 But yes, I don't have Trump derangement syndrome.
00:21:38.800 I don't think he's the greatest existential threat ever, that I lost that friendship.
00:21:43.500 A cousin with whom I grew up and went through the Lebanese Civil War.
00:21:47.400 So basically, I don't know if it makes you feel any better.
00:21:50.660 Yes, it's very painful what you went through, but that's regrettably part of the feature
00:21:54.940 of the architecture of the human mind.
00:21:56.600 It's horrible.
00:21:57.400 No, absolutely.
00:21:58.060 And I think one of the books I read very recently, which you may be familiar with,
00:22:02.260 Matthias Desmet, who's a clinical psychologist.
00:22:04.840 He wrote The Psychology of Totalitarianism.
00:22:06.680 Yes, yes.
00:22:07.160 And that was fascinating.
00:22:08.700 And that really, I think, for me, is another piece of the jigsaw.
00:22:12.160 Because when I give my lectures, I tell people that the greatest barrier, I talk about the
00:22:16.800 fact that what's worse than ignorance is the illusion of knowledge.
00:22:20.820 And the three components I feel that people can probably relate to during the pandemic is,
00:22:27.440 one is obviously fear and what fear does, right?
00:22:30.280 It inhibits your ability to engage in critical thinking.
00:22:32.620 Then you've got willful blindness, which is linked to that.
00:22:36.540 And then the third part, I think, is this sort of mass hypnosis that Matthias describes, right,
00:22:41.760 that can happen to the extent where even families will surrender their own relatives to the state, right,
00:22:51.780 who are basically rebelling.
00:22:52.960 I mean, that is absolutely extraordinary.
00:22:56.340 And then you and I have experienced that ourselves, right?
00:22:58.420 And we're like, and of course, with your background, I'm sure it's still hurtful,
00:23:01.920 but you can probably rationalize it because you're an expert in this area.
00:23:04.680 Well, since we're talking about pathogens and viruses and so on, I think I have an innate
00:23:12.060 inoculation against it to some extent in that I have a personality that allows me to not wilt
00:23:18.420 away.
00:23:19.080 If you rattle me, I'll come back and fight more.
00:23:22.300 So I have that inoculation.
00:23:26.080 But as you said, I think earlier, we're all human and it does bother you once in a while.
00:23:30.960 So, for example, the fact that I have a lot of family in Israel, and hence when October 7th happened,
00:23:39.300 I was concerned about their safety, as would any human being be.
00:23:45.660 Suddenly, so my caring about the safety of my Israeli family became, I condone the genocide of Gazan children.
00:23:57.940 So, exactly to your point where, you know, you were making that point during your speech
00:24:04.480 that was very delineated, that was very specific, it turned into, don't take any vaccines,
00:24:10.880 all vaccines cause cancer, and then Dr. Malhotra is a quack.
00:24:15.280 Well, guess what?
00:24:16.180 I am the head of the Mossad, apparently, who is orchestrating the daily genocide of Gazan babies
00:24:23.520 because I care about my Israeli family's safety.
00:24:27.080 So the world is insane, but that's why I get up every day, by the way, you know, with sort of
00:24:32.360 anticipation because I'd like to think, with my optimistic nature, that we could develop the
00:24:38.260 mind vaccine to fight against all this nonsense.
00:24:40.880 Yeah, absolutely, Gad.
00:24:42.080 And actually, there's a great, for people listening, there's a video I've watched and shared with so
00:24:46.280 many people called, Why Do Intelligent People Believe Stupid Things?
I don't know if it's After Skool, on YouTube, and essentially, you know, the crux
of all of it is that it's not so much about intelligence, it's about
00:25:00.000 your character that allows you to get to a greater truth.
00:25:02.640 And the most important characteristic is, I know it's not easy because, you know, I think,
00:25:07.020 I don't know whether you can, maybe Jordan Peterson talked about whether, I don't know
00:25:10.440 if there's a correlation between intelligence and narcissism, but, you know, it's about being
00:25:15.780 humble, right?
00:25:16.800 And that's part of your personal spiritual growth as one, is that, you know, go into
00:25:22.020 a conversation, I mean, Charlie Kirk mentioned this as well, right?
00:25:24.660 Go into a conversation thinking you could be wrong, right?
00:25:27.580 How do we encourage a culture, Gad, right?
00:25:31.060 How do we encourage a culture where we teach kids about humility, about the fact that things
00:25:36.560 may change, information emerges, even coming back to the basics of just realizing there's
7 billion of us on this planet, and it's self-evident that there is
00:25:46.300 more that we don't know than we actually do know.
00:25:50.600 That's where you should start from, shouldn't it?
00:25:53.480 Yeah, absolutely.
00:25:54.220 So there is actually, by the way, a psychometric scale that measures intellectual and epistemic
00:26:01.780 humility.
00:26:02.240 It came out a few years ago.
00:26:03.560 So you can actually administer this to people to get their scores on how they score.
00:26:09.280 In the same way that you can get on a scale to measure your weight, you can find out whether
00:26:13.200 Asim scores here or here on intellectual humility, but to build further on your excellent point.
00:26:20.260 So if you look at someone like Joe Rogan, or someone like Elon Musk, or if I dare say in
00:26:28.340 however successful I've been in my own career and in hosting people like yourself, it takes
00:26:35.080 humility to be able to say, you know what, when it comes to cardiology, it is perfectly
00:26:41.740 reasonable that I probably know 0.01% of what Asim Malhotra knows.
00:26:48.760 And that doesn't in any way diminish me.
00:26:51.240 As a matter of fact, that's why I'm inviting him for a conversation on my show, because I
00:26:56.700 concede the fact that the guest that I'm inviting has something interesting to say that I probably
00:27:03.320 know very little about.
00:27:04.820 So why did Joe Rogan develop the platform that he did?
00:27:09.100 Because he sits with people and he doesn't presume that every single guest that's there,
00:27:13.940 he knows more than them.
00:27:15.280 I sat with Elon Musk for four plus hours at his house.
00:27:20.380 I've never seen someone so humble.
00:27:22.160 You could think, I mean, he's going to talk over me, he's going to, exact opposite.
00:27:29.580 We were locked in, two guys having a conversation, zero ego.
00:27:36.120 We all, we each know that we bring something to the table, but we are not in any way diminished
00:27:41.440 by the fact that the other, the interlocutor might know a million more things than me on
00:27:46.960 a million things.
00:27:47.840 That's what makes life beautiful.
00:27:49.380 On the other hand, I could tell you, I have family members who, if Dr. Malhotra, the cardiologist
00:27:56.780 sat with them and they asked you about the heart, you would never be able to finish a
00:28:03.620 sentence because they would be teaching you about heart surgery, even though they never
00:28:08.920 finished elementary school.
00:28:10.460 That's the narcissism.
00:28:11.940 And by the way, many of those family members I'm estranged with because I simply could not
00:28:17.280 stomach that anymore.
00:28:18.360 So, yeah.
00:28:19.720 Wow.
00:28:20.280 Interesting.
00:28:20.780 So interesting.
00:28:21.700 So I think, I just wonder what you, how you feel.
00:28:24.420 Obviously we're in this, I don't know, I think revolutionary time.
00:28:28.620 There's a lot of polarization.
00:28:30.040 There's tribalism.
00:28:31.200 Okay.
00:28:31.480 We see it.
00:28:32.020 We see on social, it's exacerbated probably by the social media algorithms as well.
00:28:36.000 You know, and Kennedy talks about healing the divide.
00:28:38.000 And I think we need to do that, but I don't know whether we figured out exactly how to do
00:28:42.380 that yet, Gad.
00:28:42.960 I mean, I sometimes see myself as a bit of a Trojan horse because I come from traditional
00:28:48.880 left-leaning politics.
00:28:50.600 Okay.
00:28:50.720 Traditional left, right?
00:28:52.000 Not woke left.
00:28:53.140 Okay.
00:28:53.860 Traditional left.
00:28:54.780 Right.
00:28:55.000 But I have tried, but my friendships are across the board, right?
00:29:00.260 Because I speak to everybody.
00:29:01.480 And now I'm much more now, I wouldn't say that I align myself with the left or the right.
00:29:08.140 You know, I'm just for evidence-based medicine.
00:29:10.160 I'm for the truth.
00:29:11.160 I'm for improving patient outcomes.
00:29:12.460 I'm for understanding how do we get better psychological health.
00:29:15.880 It's a learning, you know, a learning process for me, but, uh, but actually what was interesting
00:29:21.000 is the concepts I talk about when I talk about, um, you know, what is it that builds a healthy
00:29:26.140 society?
00:29:26.720 And there's something called the social, socioeconomic or biopsychosocial determinants
00:29:30.600 of health at the base, right?
00:29:31.920 Basic needs being met are the most important, the first stage, you know, have you got enough
00:29:35.340 food to eat?
00:29:36.300 We've got housing, right?
00:29:37.260 If these are not met, then you can't really advance.
00:29:40.600 Um, and, and I talk about concepts that are traditionally probably thought to be very left.
00:29:44.680 And I'm going to these places, Gad, and this is over the last few years, all across America,
00:29:49.500 where most of the most receptive audience to me are people from the ideological right.
00:29:55.700 And I'm getting standing ovations and I'm thinking this is, there's some, it's so interesting,
00:30:00.460 right?
00:30:00.700 And, and I'm, I'm, I'm telling them concepts.
00:30:03.280 I'm talking about things that normally are very, very strongly embraced by the left,
00:30:06.740 right?
00:30:07.080 But I'm just for evidence-based, okay?
00:30:08.760 I'm for evidence-based medicine.
00:30:10.220 So I just find that really interesting, but it gives me hope that
00:30:14.680 it's about enough people almost in the middle, having humility, having compassion, and it's
00:30:21.600 not always easy because it's, it's almost like, you know, um, how does one try and engage
00:30:27.600 in a conversation with someone that has verbally abused you, right?
00:30:32.120 I see that as a challenge.
00:30:33.040 Actually, I see it as a challenge for myself, right?
00:30:36.320 And I think we need more people who are doing that.
00:30:39.720 And I think we can then hopefully bridge these very, um, unhelpful
00:30:46.120 and, I think, detrimental divisions in society.
00:30:51.240 I agree.
00:30:51.580 I, so just yesterday, do you know who Mehdi Hasan is?
00:30:55.140 I do.
00:30:55.920 Yeah.
00:30:56.120 So just yesterday, uh, I was exchanging, and it wasn't very pleasant,
00:31:00.960 but not due to any of my doing, uh, back and forth with him, but
00:31:06.880 right away, I was, you know, a genocide supporter.
00:31:09.600 I was committing genocide.
00:31:10.940 I was, you know, and so on.
00:31:12.300 Uh, I would love the opportunity.
00:31:14.820 I mean, whether it be Mehdi or someone else, he's just an exemplar of the point that he's
00:31:18.780 making.
00:31:19.060 I would love to be able to invite on my show people that are perfectly antithetical to
00:31:26.900 anything that I'm saying, right?
00:31:28.120 They love Islam.
00:31:29.680 Islam is peace.
00:31:31.220 You're an Islamophobe, God, you're a hater.
00:31:33.780 Okay.
00:31:33.960 Let's sit down and have a conversation.
00:31:36.060 I can't have a conversation with them.
00:31:38.000 I genuinely cannot, not, not because I'm not willing to, but because it right away, you
00:31:44.620 know, heads into the ad hominems and so on.
00:31:48.700 There's a whole bunch of these kinds of characters.
00:31:50.860 I mean, it's probably not a good idea for me to have a conversation with Muhammad Hijab.
00:31:54.680 I don't know if you know who he is.
00:31:56.080 He's a sort of British extremist guy, but I would love to invite these guys.
00:32:00.440 I even had, by the way, on the show many years ago, a former radical kind of ISIS type
00:32:07.280 of guy who had kind of de-radicalized and he came on the show and we had a very productive
00:32:14.460 conversation.
00:32:15.020 I've had Imam Tawheedi on, although he's hardly radical.
00:32:19.600 He's actually quite liberal in his views for Islamic Imam.
00:32:24.000 So I'm willing to have conversations with everyone, but you're right.
00:32:27.520 The reflex is for people to demonize you and then other you and then I don't want to talk
00:32:31.700 to you.
00:32:32.240 And by the way, that's what made Charlie Kirk so great, right?
00:32:35.180 Because he did it as his primary job to go into the dens, into the vipers' den and have
00:32:42.620 these incredible conversations and be gracious and be smiley.
00:32:46.540 Yeah, sometimes he was a bit punchy and spicy, but that's the great loss of Charlie Kirk because
00:32:51.500 so few of us can do what he was able to do.
00:32:54.800 Yeah.
00:32:54.900 And Gad, I want to ask you about this as well, what you think.
00:32:58.780 My understanding is that one of the reasons we have these very polarized positions and views
00:33:06.560 and vitriolic so-called arguments or debates is, and just bear with me with this, a broadly
00:33:14.120 used term, I think, is colonial thinking.
00:33:16.560 And what I mean by that is people who actually believe that other human beings are less worthy,
00:33:24.520 right?
00:33:24.960 This is, I mean, to take an extreme example, and please forgive me, but you know this yourself,
00:33:30.300 probably, you know, it's something that I'm sure close to your heart, is how obviously
00:33:34.320 did the Holocaust happen?
00:33:35.780 And it was basically dehumanizing Jews, right?
00:33:39.320 But I think that this is something all inherent potentially, maybe this is part of our primitive
00:33:43.840 brain, right?
00:33:44.720 Animalistic brain or whatever else.
00:33:46.560 I think that's where the problem is, where it really is at the root of the problem.
00:33:51.520 I think when you have, I mean, listen, I have embraced Buddhism, right?
00:33:56.780 It's given me a lot of strength in the last few years.
00:33:58.460 I'm a Hindu.
00:33:58.780 I was born a Hindu, but I've embraced Buddhism.
00:34:00.400 Of course, you know, the Buddhist philosophy thinking is that essentially, you know, there
00:34:04.040 are three stages of compassion.
00:34:05.460 The first stage is you accept everybody else is equal to you.
00:34:08.540 The second stage is that you put other people's needs ahead of your own, okay?
00:34:12.720 And I'm not saying, you know, this is something that everybody can do all the time,
00:34:16.320 but sometimes we do obviously do this.
00:34:18.040 And then the third, which is true altruism, is taking on the burden of others.
00:34:21.980 And the one situation in life where that happens, you know, unequivocally is a mother's love
00:34:27.820 for their child.
00:34:28.780 Right.
00:34:29.240 Right?
00:34:29.560 So that's like all the most extreme form of compassion.
00:34:31.960 Whether that can be translated to us, you know, to the broader community is a challenge.
00:34:36.680 But for me, I think that's the problem, Gad.
00:34:39.420 That's the real issue.
00:34:40.740 Why these debates and these discussions are happening is that people start off from a position thinking,
00:34:46.340 I am a superior being, or I have more value.
00:34:49.840 My life is more valuable than yours.
00:34:52.040 I'm just wondering what you think about that, whether from a psychological perspective,
00:34:54.900 is that what's driving a lot of this?
00:34:56.900 Yeah.
00:34:57.040 Thank you for the question.
00:34:57.780 I mean, certainly that could be part of it.
00:34:59.980 So two points I want to make to address your point.
00:35:05.460 Number one, I'm glad you mentioned Buddhism, because in my forthcoming book, Suicidal Empathy,
00:35:10.760 when I'm talking about various ways by which empathy can go wrong, actually it was, and
00:35:16.960 I credit him, because I'm very, very assiduous to credit people, even in personal conversations.
00:35:23.160 Someone wrote to me and said, hey, Professor Saad, I don't know if you know about this, but
00:35:28.120 in Tibetan Buddhist philosophy, there is the concept of idiot compassion, which in a sense
00:35:36.960 links, it's not my framework of Suicidal Empathy, but I ended up citing that principle
00:35:43.840 from a Tibetan Buddhist master, so that even in the context of Buddhist philosophy, where
00:35:51.280 sort of the highest ideal is to be infinitely compassionate, there is a point of diminishing
00:35:58.240 return where your infinite compassion, to use their term, becomes idiot compassion.
00:36:03.760 So that's point number one.
00:36:04.920 Point number two, and I think I might have sent this to you privately.
00:36:09.080 At one point, you had reached out to me to ask me about, you know, why is it that people
00:36:12.960 don't change their minds in light of incoming information?
00:36:15.560 So I don't know if I had mentioned this then to you, and if I have, apologies if it's repetitive,
00:36:20.640 but for our audience, it may not be repetitive.
00:36:23.460 There is this great book, and I want to bring it out and show it to you.
00:36:29.980 So this book, I think I might have shared it with you.
00:36:33.080 It's called The Enigma of Reason.
00:36:36.200 Yeah.
00:36:36.640 And every time I share this book, it's probably the fifth, sixth time on my show that I've
00:36:42.080 shared this book.
00:36:42.740 No, I'm not getting royalties for plugging their book, although I think I should start
00:36:47.560 getting royalties, but it's written by two French psychologists, both very much evolutionarily
00:36:54.020 minded, and I'll explain in a second why that's important.
00:36:57.360 So the faculty of reasoning that humans have, we can study it from an evolutionary
00:37:03.600 perspective to say, well, why did human beings evolve the capacity to reason?
00:37:09.080 What evolutionary problem does it solve?
00:37:12.680 And now you're going to see how I'm going to link it with some of the stuff you're talking
00:37:15.520 about.
00:37:15.920 And what they say in their book, which regrettably I agree with, is that we didn't evolve the
00:37:22.640 capacity to reason to achieve some objective truth, but rather to win arguments.
00:37:30.240 Yes.
00:37:30.680 Right?
00:37:30.960 So that speaks to the ego.
00:37:32.940 I'm superior to you.
00:37:34.480 I am in group A, you're in group B.
00:37:36.460 I don't care if my position is veridical.
00:37:40.360 That's not the point.
00:37:41.420 The point is that Dr. Malhotra will never say anything that invalidates my position.
00:37:49.160 I will die before I concede to him.
00:37:52.640 Now, that's, and by the way, I take that point, and in chapter seven of The Parasitic
00:37:58.060 Mind, where the chapter is titled How to Seek Truth.
00:38:01.540 So I am about to propose an epistemological procedure to seek truth, but yet I start with
00:38:08.440 their story, which says people don't want the truth.
00:38:12.180 So therein lies the conundrum, because we've got the tools to seek truth, but most people
00:38:18.500 don't care about the truth.
00:38:19.880 They care about their team winning.
00:38:21.560 Yeah, no, absolutely.
00:38:24.020 Absolutely.
00:38:24.700 And in fact, on that as well, because it's a big discussion going in the UK right now,
00:38:27.740 there's a push by the prime minister to push these digital IDs.
00:38:31.820 And of course, part of that, people are, you know, understandably skeptical about it being
00:38:36.060 misused, is around, and it's a topic that comes up again and again and again, is around
00:38:41.320 free speech, freedom of speech.
00:38:43.040 And I think because we mentioned earlier the fact that you can't get to a greater truth,
00:38:48.000 well, basically, you can't get to a greater truth unless you listen to all sides, you
00:38:52.020 know, and Kennedy said, Robert Kennedy said at a rally I attended before the election,
00:38:57.880 last election called Rescue the Republic, which I love this line.
00:39:00.060 He said, the solution to bad speech is not censorship, it's more speech.
00:39:04.260 Of course.
00:39:05.140 Right.
00:39:05.520 And I think, again, that's something we, I don't know, Gad, whether it's something that,
00:39:10.960 you know, I claim to go, I've been to the best school in the world, right?
00:39:13.880 I'm very proud of my school.
00:39:14.980 It's called Manchester Grammar School.
00:39:16.000 Um, it's over 500 years old.
00:39:18.540 Our motto at Manchester Grammar was Sapere aude, which in Latin means dare to be wise.
00:39:24.060 It comes from, I think, Immanuel Kant, the Enlightenment, you know, the, the, the, right?
00:39:28.260 Do not let fear subdue your ability to reason.
00:39:30.640 But I don't remember, interestingly, being, maybe it was intuitive, but it wasn't necessarily
00:39:37.600 something I remember being taught in school about the importance of freedom of speech and,
00:39:41.240 and how we've got to where we are, the, the, the evolution of Western society and how
00:39:45.360 important it is.
00:39:46.640 And it's all, it all seems to be getting undermined, um, by people, I think, I believe, and I'd
00:39:51.500 love to hear your view on this, by a so-called oligarchy of people that have got to their
00:39:55.860 positions of power where they believe they know what's best for us.
00:40:00.280 And they actually believe it, Gad.
00:40:02.040 I don't think there's any conspiracy.
00:40:03.360 I don't think some evil cabal, you know, these, again, these are psychological barriers
00:40:06.760 here.
00:40:07.060 I agree.
00:40:08.060 So, uh, in, in, in suicidal empathy, I have a chapter where I talk about forbidden knowledge,
00:40:15.000 right?
00:40:15.860 And so forbidden knowledge has a very specific term in the context of academia, right?
00:40:21.840 Don't do research on race differences, because then if the results come out in a way that might
00:40:28.320 be problematic from a politically correct narrative, then that's going to further marginalize the
00:40:33.240 group in question.
00:40:34.320 So therefore that should be filed away on the forbidden knowledge.
00:40:37.660 But I argue in, in the book that the, the reflex to your point, the reflex for whomever is in
00:40:46.300 power to then say that, which I disagree with, and then fill in the blank.
00:40:52.660 We can call it dangerous.
00:40:54.040 We could call it blasphemous.
00:40:55.980 We can call it hate speech.
00:40:57.420 We could call it disinformation.
00:40:58.840 We can call it misinformation.
00:41:00.140 So the reflex to stifle those who are below us is an indelible feature of the human condition,
00:41:10.600 right?
00:41:10.880 I mean, the inquisition, right?
00:41:12.980 Was exactly that.
00:41:14.480 There is an official church doctrine, set of doctrines.
00:41:18.380 And if you go against it, even if you're called Galileo, then we might come after you because
00:41:25.360 how dare you say that it's not the earth that's at the center of the universe.
00:41:29.480 And so when you get someone like Joe Biden, I don't know if you remember, I can't remember
00:41:34.680 her name.
00:41:35.160 I cite her in the book when he appointed someone, and it was very short-lived, to be a
00:41:41.140 czar of disinformation.
00:41:42.880 Her name was Nina something.
00:41:44.860 Do you remember her?
00:41:45.360 She did a viral clip where she was singing.
00:41:48.060 Do you remember that?
00:41:48.960 Yeah.
00:41:49.460 That in the 21st century, the president of the United States thought that it was a good
00:41:56.160 idea to have a czar of disinformation tells you that Orwell was right.
00:42:04.180 Yes.
00:42:05.000 No, absolutely.
00:42:05.980 And one other thing that I discovered as well, like I kind of was aware
00:42:10.200 of this over my last 10, 15 years as a public health campaigner and, you know, activist, where
00:42:15.560 I've had access to politicians and leaders across different parties, is that there's, I
00:42:21.400 think people don't understand the level of ignorance and ineptitude amongst people in
00:42:27.600 leadership positions. And linked to that as well is when you have, you know,
00:42:33.800 I, in my view, one of the big issues we have right now, I call it, you know, a big pharma
00:42:37.700 tyranny or corporate totalitarianism, whatever you want to, you know, there are different problems
00:42:42.040 going on in the world, right.
00:42:42.820 But, or different factors that are contributing, sort of, to the detriment
00:42:48.060 of our mental, physical and social wellbeing.
00:42:51.140 But one of the things that's really interesting is, and this again, got a lot of attention
00:42:54.740 on social media recently is that President Trump had his, I think, his six-monthly, you
00:43:01.300 know, health check.
00:43:02.420 Okay.
00:43:02.940 And there was a post from the white house where they mentioned that he had had the COVID-19
00:43:07.520 booster.
00:43:08.300 Oh, I know where you're going with this.
00:43:09.600 Yes.
00:43:09.820 I saw your thing.
00:43:10.580 And people are automatically thinking, no, he didn't have it.
00:43:14.280 It's fake.
00:43:14.720 And I was like, no, I actually believe he has it.
00:43:16.960 I believe that President Trump believes in this COVID vaccine.
00:43:20.260 But also the other interesting thing is I was, I was given his, you know, it's public
00:43:24.100 knowledge anyway.
00:43:25.360 There have been observations.
00:43:26.440 And I think President Trump, as a 79-year-old president, is remarkable in terms
00:43:30.660 of his, his energy and what he does.
00:43:32.440 Right.
00:43:32.640 And he only sleeps four hours a night, but there have been some, you know, there's some press
00:43:37.560 reports and some observations, even from people close to the president who contacted
00:43:40.560 me, whom we'll not name, who have said, yeah, there are moments where, you know, he's kind
00:43:46.420 of falling asleep or whatever.
00:43:47.420 And that could be just a natural thing, but it could also be the fact that he's taking
00:43:51.760 a statin.
00:43:52.720 Right.
00:43:53.100 So I'm again, a precise doctor.
00:43:54.740 I look at his whole cardiovascular risk profile.
00:43:57.280 We know what his cardiac status is.
00:43:59.060 I know what the benefit of a statin in this situation is: it's close to zero. You
00:44:02.880 know, you have to give a statin to 446 people to prevent one
00:44:06.840 heart attack or stroke, right.
00:44:08.180 With no prolongation of life.
00:44:09.440 So no real significant benefit.
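The "446 people" figure is a number needed to treat (NNT): the reciprocal of the absolute risk reduction between untreated and treated groups. A minimal sketch; the event rates below are hypothetical, chosen only to reproduce the figure quoted in the conversation.

```python
def nnt(control_event_rate, treated_event_rate):
    """Number needed to treat = 1 / absolute risk reduction (ARR)."""
    arr = control_event_rate - treated_event_rate
    if arr <= 0:
        raise ValueError("no absolute risk reduction, so NNT is undefined")
    return 1.0 / arr

# Hypothetical rates: 3.000% of untreated vs 2.776% of treated patients
# have a heart attack or stroke over the trial period, an ARR of about
# 0.224 percentage points, i.e. roughly 1 event prevented per 446 treated.
print(round(nnt(0.03000, 0.02776)))  # → 446
```

The point being made follows directly from the arithmetic: when the absolute risk reduction is a fraction of a percentage point, the NNT runs into the hundreds.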
00:44:10.780 And it comes with, with side effects, which are more prevalent as you get older.
00:44:14.460 One of them being brain fog, fatigue, all right.
00:44:16.900 And muscle pain.
00:44:17.840 And I see this all the time as a clinician.
00:44:19.540 I have patients coming to me, Gad. Sorry,
00:44:23.020 I'm not talking about, um, I'm not, you know, I'm not
00:44:28.900 saying this is President Trump, but for example, there are people who come to me as patients
00:44:32.000 who have literally, you know, have been diagnosed that they may have early dementia.
00:44:36.160 Some of the family thinking, hold on a minute.
00:44:37.420 I don't believe this is the case.
00:44:38.500 You know, the doctor won't stop the statin.
00:44:40.320 They come and have a consultation with me.
00:44:41.720 I go through the whole history and I say, listen, you know what?
00:44:44.420 I, I don't, I don't think this is dementia, but we can do something here.
00:44:48.220 We can try you off your statin for two to three weeks.
00:44:50.040 'Cause if it is that, you'll notice very quickly. And Gad, you know, I've lost count of
00:44:56.260 the amount of patients who have come back later.
00:44:57.540 They cannot believe, you know, how they feel afterwards, that, you know,
00:45:01.220 they're, they're much more alert.
00:45:02.680 They're more energized.
00:45:03.260 They're sleeping better.
00:45:03.980 Right.
00:45:04.420 And it was, all the time,
00:45:06.020 it was the statin.
00:45:07.280 So I went public, obviously with this, I tried to get through because I said, this is,
00:45:11.600 I genuinely want to help the president.
00:45:13.360 I want him to be the best president he can be.
00:45:15.100 And I'm like, also as a doctor, I'm like this, I'm watching this.
00:45:17.660 This is crazy.
00:45:18.340 He's on aspirin as well.
00:45:19.180 He doesn't need to be on aspirin.
00:45:20.200 It's more likely to cause him a bleed than prevent him from a heart attack.
00:45:22.340 Well, I know that because we've, we've seen his, his cardiovascular status.
00:45:25.260 We know he's okay.
00:45:26.520 And this is insane.
00:45:28.080 But, but what it also represents for me is the, you know, I, I said this in my talks
00:45:33.580 in terms of the medical misinformation mess.
00:45:36.720 And this is a, um, a phrase that came from Professor John Ioannidis.
00:45:41.460 You may be familiar with him, a professor at Stanford, right?
00:45:43.480 Yeah.
00:45:44.040 The most cited medical researcher in the world.
00:45:45.560 Right.
00:45:46.340 And it's good.
00:45:46.740 And he wrote this paper in 2017, which people can look up and you can read it.
00:45:49.480 It's excellent, called How to Survive the Medical Misinformation Mess.
00:45:51.920 And he talks about the fact that most doctors are unaware of the poor quality research that
00:45:56.500 drives, you know, medical decision-making at one stage.
00:45:59.620 He said most published research findings are false in medicine.
00:46:02.320 Right.
00:46:02.820 But he said, ignorance of this problem at the highest levels of academic and clinical leadership
00:46:07.800 is profound.
00:46:08.920 Yeah.
00:46:09.140 So don't presume that even the, whatever the, you know, the, the, the surgeon general may
00:46:13.560 even know this stuff.
00:46:14.260 So for me, knowing all of that, I have no doubt that President Trump's physician and even
00:46:18.440 President Trump actually believe in this without even breaking the data down
00:46:22.300 into its individual parts.
00:46:23.520 And part of it, I think is because of this system, this, this, this ideological system
00:46:28.760 that has basically become subservient, right?
00:46:32.980 Wittingly or unwittingly to the interests of big pharmaceutical companies and purely their
00:46:37.380 profit motive.
00:46:37.980 Right.
00:46:38.300 That's what it is.
00:46:39.260 So, so, so that's, so my point is, I think there is no conspiracy.
00:46:42.680 I don't think President Trump is pretending.
00:46:44.500 I actually think that no one is immune, even the president, to medical misinformation.
00:46:48.440 Because of the system.
00:46:49.400 So if I asked you to allocate a hundred points to reflect the relative influence of the following
00:46:57.100 two factors, you know, pharmaceutical, profit seeking, nefarious motives versus all of the
00:47:06.980 other psychological barriers that stop people from either getting at the truth or revising
00:47:14.280 their opinions when facing contrary evidence.
00:47:17.440 So one is the various pharmaceutical money stuff to the psychological architecture of
00:47:26.080 the human mind that makes it feeble to truth.
00:47:28.800 How many points do you allocate to those two?
00:47:32.260 Wow.
00:47:32.920 So again, out of a hundred, out of a hundred.
00:47:34.960 So if it were, you know, 90-10, 60-40, or 50-50, they're equal.
00:47:40.020 Wow.
00:47:40.040 That's a great question, Gad.
00:47:42.780 Okay.
00:47:43.680 With humility, I don't, you may have a better idea than I would, but, but my intuitive, intelligent
00:47:50.520 response from my own experience is I would say, and from following on from our conversation,
00:47:56.040 I would say 60, 40, but they're linked anyway, more in favor of the psychological.
00:48:03.180 That, see, I was certainly going to go more psychological than nefarious.
00:48:09.800 Yes.
00:48:10.700 I might even go slightly higher than 60.
00:48:13.260 I might say that in the same way.
00:48:16.040 But can I, yeah, sorry, but can I say, sorry to interrupt you, but can I say, I don't think
00:48:19.700 it's nefarious actually.
00:48:21.560 You mean nefarious even on the pharmaceutical side?
00:48:23.940 Yeah, I think it's like, it's, it's like a, I don't know how to describe it.
00:48:28.020 It's, it's, it's, it's, I think it's an ideology to some degree.
00:48:31.800 I, I, I don't think Albert Bourla, CEO of Pfizer, is sitting there thinking that, you
00:48:36.440 know, I'm going to completely cover up all potential vaccine injuries, whatever else
00:48:39.960 it's, it's, I think there's something that switches on in his brain that he wants to try
00:48:44.620 and do good with his products, but it causes him to have a massive deliberate blind spot,
00:48:49.240 almost maybe subconsciously to the harms.
00:48:51.760 Now, I'm not saying there aren't nefarious CEOs out there.
00:48:54.180 Okay.
00:48:54.500 But then what you just said, I would put that under psychological barrier then.
00:48:58.660 There you go.
00:48:59.340 That wouldn't go under pharmaceutical.
00:49:00.720 So it's primarily psychological then, I, I, I think.
00:49:04.420 Okay.
00:49:05.000 All right.
00:49:05.560 Let's, let's switch to a few personal, I mean, not, not deeply personal, but number one,
00:49:11.000 why did you choose cardiology?
00:49:13.200 And let me, before, before I ask you, before I ask you to answer this, let me give you the
00:49:17.760 background.
00:49:18.120 So my doctoral dissertation was in psychology of decision-making, right?
00:49:22.740 I specifically looked at how much information people look at before they stop collecting
00:49:29.420 additional information and commit to a choice.
00:49:31.460 So those are called stopping rules, right?
00:49:33.520 If I'm choosing between two cars or I'm choosing between two prospective women to marry, I don't
00:49:39.820 look at all of the available information.
00:49:41.780 I look at enough information that it becomes clear to me that I should marry girl A or I
00:49:46.880 should buy car B, right?
00:49:48.300 So my original training was in psychology of decision-making.
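The stopping-rule idea described here can be sketched as code. This is a hypothetical simplification, not a model taken from the dissertation: attributes are compared one at a time, and search halts as soon as one option's lead reaches a threshold, rather than after examining all available information.

```python
def choose_with_stopping_rule(option_a, option_b, threshold=2):
    """
    Compare two options attribute by attribute (higher value wins the
    attribute) and stop as soon as one option leads by `threshold`
    attributes, rather than examining all available information.
    Returns (winner, number_of_attributes_examined).
    """
    lead = 0           # positive favors A, negative favors B
    looked_at = 0
    for a, b in zip(option_a, option_b):
        looked_at += 1
        lead += (a > b) - (a < b)   # +1, 0, or -1 per attribute
        if abs(lead) >= threshold:
            return ("A" if lead > 0 else "B"), looked_at
    # Threshold never reached: fall back to whoever leads overall.
    return ("A" if lead >= 0 else "B"), looked_at

# Car A wins the first two attributes outright, so the decision is made
# after examining only 2 of the 5 available attributes.
print(choose_with_stopping_rule([5, 4, 1, 1, 1], [3, 2, 5, 5, 5]))  # → ('A', 2)
```

The design choice is the one described: the decision-maker commits once the evidence is "clear enough," so later attributes are never inspected at all.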
00:49:52.460 So it, under that, in the spirit of, of that background, you could have chosen 60 different
00:49:59.900 medical specialties, yet somehow you went into cardiology.
00:50:03.760 Why?
00:50:06.200 I think, okay.
00:50:07.080 So at school, I had a lot of variety of interests in terms of subject matter.
00:50:11.400 And I actually, my favorite subject, interestingly, was history.
00:50:15.760 And the reason history was most interesting to me, Gad, and just to answer your question,
00:50:20.120 because I think it may, may help you understand who I am as a complete doctor, if you like,
00:50:26.880 is that we were taught in my school about bias.
00:50:31.180 And I, I came almost top in the year, I came second in the year, and my school was like the
00:50:35.340 top school in the country.
00:50:36.300 And I'm not claiming, I'm not blowing my trumpet here saying I was like, whatever, best
00:50:39.660 of the best, but I was a straight A student.
00:50:40.900 And for me, you know, I was always good at, you know, but the thing I enjoyed the most was
00:50:45.560 history because we weren't just taught about historical facts.
00:50:48.260 We were taught about bias, right?
00:50:50.360 We were taught about critical thinking.
00:50:52.120 And, and specifically, you know, in, in, in, in my second year in what we call high school.
00:50:57.400 Um, in fact, no, I think it was a bit later than that, the, the big topic was, was actually
00:51:02.600 World War II and the Holocaust.
00:51:04.740 And I remember being presented in class of things where they would show these cartoon figures
00:51:09.940 where they were demonizing Jewish people.
00:51:12.040 And, and the history teacher says, what do we, you know, what do you think of this guys?
00:51:15.200 What else is missing, right?
00:51:16.580 What's not there that gives you a more complete truth, right?
00:51:20.000 Of what's going on.
00:51:20.740 So that's where I got, you know, and that was what I really enjoyed.
00:51:23.660 And the reason I mentioned that is up until the last minute, in many ways,
00:51:27.400 I wanted to be a lawyer, but I grew up with loving debate, um, loving, uh, having an
00:51:34.540 intrinsic, um, how should I put it?
00:51:38.120 An intrinsic, uh, aversion to any injustice around me.
00:51:44.380 And some of that, Gad, and you can relate to this as well, may well have been the
00:51:48.020 fact that I grew up in, in, in the North of England, where I had two parents that were
00:51:52.180 doctors, we were Indian background.
00:51:53.640 I had a brother with Down syndrome, but I grew up in a place where, uh, it was not
00:51:59.120 unusual for me to be called names.
00:52:02.680 And we were even attacked once by neo-Nazis.
00:52:05.100 Wow.
00:52:05.540 Okay.
00:52:05.960 So I grew up in that kind of environment where even as a kid, Gad, I sometimes would make
00:52:11.320 excuses that I was sick.
00:52:13.340 So that to stop, like on the weekend, my parents would want to go out shopping, for example,
00:52:17.620 into the city center.
00:52:18.600 I knew there's a very good chance someone's going to call us Paki or something like that,
00:52:21.960 some racial term.
00:52:23.040 And my dad wasn't a shrinking violet type of guy.
00:52:25.880 He was a guy that wouldn't care.
00:52:27.080 He'd be up for a fight.
00:52:28.160 And I didn't, as a kid, I was scared to see something, something would go wrong.
00:52:31.380 Right.
00:52:31.760 So let's give you an example.
00:52:32.640 Like, and I don't know whether that came from that.
00:52:34.240 My dad was also someone that, you know, always stood up against injustice.
00:52:36.780 So I had this thing about initially my experiences of that and seeing, uh, and being very sensitive
00:52:41.860 to racism.
00:52:42.960 Right.
00:52:43.180 Now, of course things evolved and progressed.
00:52:45.400 And I can't remember the last time anyone called me a name such as that.
00:52:48.080 There are different types of biases and racism still there.
00:52:50.360 And I think it's something we're all vulnerable to.
00:52:52.880 Okay.
00:52:53.480 But, um, but I think that sense of injustice started there.
00:52:57.480 Um, and, and then what happened, and then, you know, I think because both my parents were
00:53:01.780 doctors, there was always a little bit of that, you know, um, you have to follow, you
00:53:06.920 have to follow in their footsteps.
00:53:08.160 You're a little bit, but my parents were actually, for, for an Indian family, parents
00:53:12.100 were actually relatively cool.
00:53:13.280 They wanted me to follow my passion.
00:53:14.480 And so they never really pushed me down that route, but there was an intrinsic, so there's
00:53:18.200 a natural influence there.
00:53:19.220 My brother had a heart condition, um, which wasn't going to kill him.
00:53:23.280 He had a hole in his heart.
00:53:24.000 He had Down syndrome, but at the age of 13, when I was 11, he got a viral infection.
00:53:28.220 It was very traumatic for the family because out of the blue within five or six days, unexpectedly,
00:53:32.860 he went into crashing heart failure and died.
00:53:34.920 And, and, and that, I think as an 11 year old child definitely had a profound impact
00:53:38.980 on me.
00:53:39.300 And I remember, you know, my mom crying and my father, and I, you want to console them.
00:53:43.360 And it's like, okay, I'm, I remember saying something, I'm going to be a cardiologist so
00:53:46.540 I can make sure, you know, kids like my, my brother don't die unnecessarily or whatever
00:53:50.920 else.
00:53:51.920 And, um, so that's, there was always that I think in me.
00:53:55.120 And then, the reason I decided not to go into law is I had a family, uh,
00:54:02.240 um, a relative who was a lawyer and they said to me, I'll never forget this.
00:54:06.800 So, Asim, remember one thing: being a lawyer isn't so much about fighting injustice.
00:54:12.260 It's about winning.
00:54:13.500 So go in with open eyes, you know, and it put me off a little bit.
00:54:18.040 Well, it's, it speaks to this book, right?
00:54:20.500 It's about winning the argument, not about getting the truth.
00:54:23.820 Exactly.
00:54:24.760 Right.
00:54:25.220 So I decided in the end to do medicine, you know, I was still good at biology and sciences.
00:54:28.780 I had a bit more of an interest in history or whatever.
00:54:30.600 And, and, uh, so I went to, I went to do medicine.
00:54:33.100 And then once I was in medical school, there was, you know, part of it was, I was fascinated
00:54:38.340 with the heart.
00:54:39.600 Listen, I was also a competitive guy, Gad.
00:54:42.100 I, I, you know, I, I played sports, I captained sports teams and there was a bit of this kind
00:54:46.820 of, there was slightly like, okay, cardiology is really tough.
00:54:49.500 Like it's, it's, it's kind of crazy.
00:54:50.880 It's super competitive.
00:54:51.660 It's a bottleneck, you know, only a certain percentage of the doctors can.
00:54:55.020 Uh, can, can, can hack it, you know, and it's a very, and so I said, you know what?
00:54:58.740 I like that challenge.
00:54:59.680 I like the idea of cardiology.
00:55:01.340 I was fascinated with the heart and that's why I pursued it.
00:55:03.880 And I don't look back, but just to give you an example in terms of the, you know, I, I
00:55:07.660 chose interventional cardiology originally, which is a subset of cardiology, which is keyhole
00:55:11.700 heart surgery, which is even more competitive.
00:55:13.840 And it was literally, and this hardened me up.
00:55:15.840 And I think that's one of the reasons I'm able to do what I do is because it was almost
00:55:19.840 like a military style type of environment, right?
00:55:23.100 There was, I remember there was this amazing cardiologist and really good at what he did,
00:55:27.660 but you know, he had some anger management type of issues.
00:55:31.260 And if you screwed up in the operating theater, I mean, he was throwing scalpels around that
00:55:35.240 kind of stuff for people.
00:55:36.040 He was, he was, he was sending people into tears and I'd come down from Manchester.
00:55:39.820 I'd worked in my, my final training in cardiology and I was kind of the tough, hard type. England
00:55:44.980 has a thing where if you're from the North, you're a bit harder.
00:55:46.840 You're a bit more toughened up.
00:55:47.700 I trained in Scotland.
00:55:48.640 The first hospital I worked in was in one of the most deprived areas of the whole country
00:55:52.680 where you have, you know, teenagers coming on with, with, with liver failure from alcohol,
00:55:56.980 you know, alcoholism.
00:55:57.820 You have, you know, I worked in, in the most busy, um, ER in, in the whole country where
00:56:03.580 we had, you know, gangland shootings and stabbings.
00:56:06.580 So I came, I was really hardened up by the time I got into the interventional cardiology
00:56:11.300 circuit.
00:56:11.880 So for me, I was taking it in my stride and this guy's looking at me and he was making other people
00:56:15.060 cry.
00:56:15.360 And I was just, like, staying very cool, taking it in my stride and just doing my thing.
00:56:18.980 So, um, yeah, that was kind of my journey before I became an activist, just to give
00:56:22.940 you an idea.
00:56:23.520 So, so nothing compares. All this big pharma stuff and people attacking me and the prime
00:56:27.240 minister, it's like, nah, this is nothing compared to my boss in cardiology, right?
00:56:30.560 Trying to scare the shit out of me.
00:56:31.640 I could link what you, what you said about seeking justice as a lawyer.
00:56:38.240 And then what happened to your brother tragically, in a sense, you becoming a cardiologist, you were
00:56:45.740 seeking to resolve the cosmic injustice that your brother faced.
00:56:51.460 So in a sense, you became a cardiologist also to pursue justice.
00:56:56.140 Why?
00:56:57.040 I mean, I don't know if you've seen some of that footage, uh, of young soccer players.
00:57:02.220 I mean, I, I just showed one yesterday to my, to my wife and she said, why are you showing
00:57:06.480 me this?
00:57:06.760 Are you, did you see that famous footage of this guy, soccer player, and he just drops
00:57:12.680 dead of cardiac arrest.
00:57:15.320 And then all the teammates kind of drop and they're like in, in, in literal shock.
00:57:20.180 They're like crying.
00:57:21.340 Right.
00:57:21.660 I mean, that's cosmic injustice, man.
00:57:24.300 I mean, what could be more unjust than a 27 year old, perfectly healthy guy.
00:57:30.260 One minute he is making a dribble.
00:57:32.700 The next minute he's off the earth.
00:57:36.420 Yeah.
00:57:36.920 No, it's fine.
00:57:37.480 No one's ever said that to me.
00:57:38.580 That's so interesting.
00:57:39.500 Well, you just, you've, you've given, you've diagnosed actually my, uh, my, my past.
00:57:43.680 That's a psychologist.
00:57:45.300 It makes, it makes sense.
00:57:46.380 Yeah.
00:57:46.540 No, totally.
00:57:47.420 Oh, good.
00:57:48.020 Well, I'm glad, I'm glad to have been of service.
00:57:50.180 Uh, one last question for you before I let you go.
00:57:53.160 Uh, about a couple of years ago when my happiness book came out, it was a tradition for me to
00:57:57.880 ask it.
00:57:58.300 Then I stopped doing it, but let's, let's revive the tradition.
00:58:01.120 So in one of my last chapters in the happiness book, I talk about sort of a, a well, a life well
00:58:08.060 lived is one where hopefully when you're 85, 90 years old on the proverbial porch and you're
00:58:13.620 looking back at your life, you've got as few regrets as possible.
00:58:16.900 And one of my former, uh, doctoral professors, uh, in psychology is a gentleman by name of,
00:58:24.620 uh, Tom Gilovich, professor Tom Gilovich, who was one of the pioneers of studying the psychology
00:58:30.720 of regret, specifically the regret due to inaction versus regret due to action.
00:58:36.760 Regret due to action would be, I regret that I cheated on my wife and now my marriage is over.
00:58:42.800 Regret due to inaction.
00:58:45.120 I regret the fact that I became a cardiologist because my dad was a cardiologist, but really
00:58:50.080 I always wanted to be a, an artist, an architect.
00:58:53.600 And so I wake up at 60 and I regret the fact that in a sense, I've lived an inauthentic life.
00:58:58.880 I've lived someone else's desire for the life that I should lead.
00:59:03.080 Now, it turns out, as you probably can guess, that the biggest looming regrets in people's
00:59:08.720 lives are those due to inaction rather than due to action.
00:59:13.240 So if I now turn the FBI light on you and say, you're still a young man, so you still have a lot
00:59:20.440 of life to live, what would be the biggest looming regret or regrets that you currently have?
00:59:27.940 You know, as you've been talking, I've been thinking about this, um, as you probably know,
00:59:33.700 Gad, uh, you know, and we, suffering is an inevitable part of life.
00:59:38.980 We all, we're all going to experience it.
00:59:40.320 And of course we want to live a life where we minimize that suffering or at least increase
00:59:44.060 our resilience to it.
00:59:44.940 Right.
00:59:45.140 This is part of personal growth.
00:59:47.840 Uh, for me, I think the most difficult part of the last few years of my life has been the
00:59:53.380 absence of my parents who, who died prematurely, partly due to system failures.
00:59:57.180 So if I was to reflect on it, I'm not one for regrets, you know, I'm, I'm a glass-half-full
01:00:02.800 guy, you know, every, every experience is something for learning and you
01:00:06.180 move forward from it.
01:00:06.980 Right.
01:00:07.400 But if there's one regret I have, Gad, and maybe other people can, maybe it can help them
01:00:11.640 think about it a little bit where they are right now is actually not spending more time
01:00:16.840 with my parents and telling them how much I love them.
01:00:19.780 Wow.
01:00:20.700 Wow.
01:00:21.260 But believe me, by the way, and I won't get into it.
01:00:23.900 It even touches me in a, in a, in a way, uh, my father is 95, my mother is 91.
01:00:30.980 So I, I, I appreciate that very much.
01:00:33.300 Uh, because, because, Gad, one final point is, I think, there's
01:00:37.180 nothing like the unconditional love of your parents.
01:00:42.020 And when it's gone, it never comes back.
01:00:44.520 They're in your heart, but there's nothing like that, Gad.
01:00:47.260 There's nothing like that.
01:00:48.780 I, and, and I'm not, I don't mean to in any way minimize.
01:00:52.000 I hope you'll appreciate it.
01:00:52.900 I think the only place you can find it, though not with bipeds, would be to go for the quadrupeds.
01:01:00.860 Dogs might give you the unconditional love.
01:01:03.460 Yes.
01:01:03.960 Your parents do, but you are correct that if it comes to other human beings, you're probably
01:01:08.820 hard pressed to find someone who's going to love you more than your parents.
01:01:11.560 So point taken.
01:01:12.800 Absolutely.
01:01:13.720 Asim, I can't wait for your threepeat.
01:01:17.660 For now we've only had a repeat.
01:01:19.780 I look forward to your next visit.
01:01:21.000 Anything that you want to plug a project you're working on, a book that you're thinking of
01:01:26.440 writing before we wrap this up, take it away.
01:01:29.520 That's really kind, Gad.
01:01:30.640 One of the things that we're doing, you know, I made this documentary film, independent film
01:01:36.080 called First Do No Pharm.
01:01:37.600 And people can download that by going to nopharmfilm.com, P-H-A-R-M, is that we're now in a stage where
01:01:43.180 we've got some interest from Hollywood to try and make another film for an Oscar campaign,
01:01:48.200 essentially.
01:01:49.020 Really to help change, not for any other reason, but to just help shift, you know, heal this
01:01:53.040 divide, get to the truth around big pharma and people's health.
01:01:55.800 So people can privately reach out to me if they're interested in investing in that movie,
01:02:00.600 because we have all the ingredients of a great documentary story here, a big arc of
01:02:04.980 the story, speaking truth to power, you know, the appointment of Robert Kennedy Jr., and
01:02:10.820 trying to give people the greater truth that's missing.
01:02:15.080 So that's the only thing.
01:02:16.160 And of course, people can direct message me on Instagram on Lifestyle Medicine Doctor.
01:02:20.280 I'm on X as Dr. Asim Malhotra, or even just email me.
01:02:24.340 My email is public anyway.
01:02:25.540 So it's asim underscore malhotra at hotmail.com.
01:02:28.260 Well, I will say this, and I mentioned this before I came on to my wife.
01:02:32.360 I said, you know, the more I know people, the more I love dogs.
01:02:36.540 That's not, it's not me who came up with this thing.
01:02:39.120 But I said, but the next guest that I have on, she's never met you, might be one of the
01:02:45.620 rare violations of that dictum, because you are a true scholar and gentleman.
01:02:51.320 It is a pleasure to call you my friend.
01:02:53.720 So kind.
01:02:54.380 Thank you so much for coming on.
01:02:55.920 Come back again anytime.
01:02:56.940 Stay on the line so we can say goodbye offline.
01:02:59.180 And thanks again for coming on.
01:03:00.600 Cheers.
01:03:01.280 Thank you.