My Chat with Fired Harvard Epidemiologist Martin Kulldorff (The Saad Truth with Dr. Saad_652)
Episode Stats
Length
1 hour and 7 minutes
Words per Minute
174
Summary
Dr. Martin Kulldorff is a biostatistician and epidemiologist, and until very recently, a professor at Harvard Medical School. He was one of the three co-authors of the Great Barrington Declaration. In this conversation, he and Dr. Saad discuss the declaration, his departure from Harvard, the Murthy v. Missouri censorship case before the Supreme Court, and why people get parasitized by bad thinking, bad ideas, and idea pathogens.
Transcript
00:00:00.960
Hi, everybody. This is Gad Saad for The Saad Truth. Today, I have another fantastic guest.
00:00:06.660
Martin Kulldorff is a biostatistician and epidemiologist, and until very recently,
00:00:12.660
a professor at Harvard University School of Medicine. Martin, how are you doing?
00:00:19.760
I'm doing good. How are you? Nice talking to you.
00:00:21.980
Likewise, likewise. I know that you are one of the three co-authors of the Great
00:00:27.860
Barrington Declaration. Maybe you can start us off with that, and then we could talk about your
00:00:32.900
Harvard debacle, and we can also talk about the fact that we're both Cornelians. So go ahead,
00:00:37.980
take it away. Okay, yeah. So in early 2020, the world went mad, and all the principles of public
00:00:48.000
health were thrown out the window. And as a public health scientist, I was in shock. And I tried to
00:00:56.400
publish here in the US, but it was very difficult during the spring and summer of 2020. I had no
00:01:02.840
problems publishing in my native Sweden in the major newspapers there. But here, I couldn't get my
00:01:08.040
voice heard, because I think it was important to tell people that while anybody can get infected,
00:01:14.380
there's more than a thousandfold difference in the risk between the old and the young. So the
00:01:17.840
obvious thing is to do as much as we can to protect older people, because they are high risk
00:01:22.020
while letting kids go to school. But anybody who tried to speak out was sort of
00:01:28.380
dismissed somehow, either they were only one person, or they didn't have the right expertise,
00:01:33.340
or whatever. So I got together with Dr. Jay Bhattacharya from Stanford University, and Dr.
00:01:40.560
Sunetra Gupta from Oxford University, who is the preeminent infectious disease
00:01:44.340
epidemiologist in the world. And we wrote the one-page Great Barrington Declaration, arguing for
00:01:49.640
focused protection. We need to do more to protect older people. And we should let kids go to school,
00:01:56.200
and we should not have all these other lockdown measures, because they are not effective against
00:02:02.240
COVID. And they create a lot of collateral public health damage on cancer, on cardiovascular disease,
00:02:10.400
for mental health, and so on. And what, so, I don't know if you know the answer to this,
00:02:16.840
but as someone who, you know, studied the parasitic mind, hence this book, right? Why do people get
00:02:23.280
parasitized by bad thinking, bad ideas, idea pathogens? What, in your view, explains that reticence
00:02:33.060
to hear your voice? So, for example, when it comes to, let's say, transgender activism, and the,
00:02:39.020
you know, men too, can menstruate, I can offer an analysis as to why there is that departure from
00:02:45.660
reason. But when it's coming from fellow public health officials, whom you're saying were violating
00:02:51.560
all the fundamental tenets of, you know, good public health thinking, what was driving that?
00:03:00.060
That's a very good question. I have asked that myself many, many times. And I have certain theories,
00:03:07.980
I don't have any answers that I know. But if we look, for example, at the leadership, which was
00:03:15.280
NIH Director Francis Collins, and NIAID Director Anthony Fauci, and Deborah Birx, who was the COVID
00:03:26.960
coordinator at the White House during 2020. They're lab scientists. They know about
00:03:35.560
virology, they know about immunology, but they don't know much about public health. And that was
00:03:39.740
obvious. They didn't understand that you can't just focus on one disease, COVID, and ignore all the
00:03:45.740
collateral damage. They don't understand that. I mean, a hurricane will come, and you can hunker down
00:03:52.320
for a while, and then it's over, and you can go back out in the sun. That's not how infectious
00:03:56.880
diseases work. If you hunker down, well, you might not get infected for those two weeks.
00:04:02.200
But when people start going out again, the infection will just take off instead. So you can
00:04:07.760
push it forward a little bit, but you cannot prevent it. You have to go through it. It's like
00:04:13.040
COVID is like influenza. There's no way to stop it. And as soon as we had the outbreaks outside of
00:04:22.740
China, first in Northern Italy and Iran, it was obvious to any clear-thinking
00:04:28.640
epidemiologists that this would hit the whole world. This virus will spread to the whole world
00:04:35.680
So, okay. So I don't know if you're being charitable or if that's really truly your position.
00:04:42.660
You seem to be attributing all of the weight of the bad decisions to ignorance, right? But let me
00:04:49.780
force you to sort of attribute points to two possible scenarios. One is what you suggested,
00:04:57.720
which is, look, they're not public health officials, and so they were imbecilic in their
00:05:02.720
approach to this thing. So that would be 100%. It's only ignorance. Another one, slightly more
00:05:09.280
dark or a lot more dark, is that no, there is willful duplicity, whether it is political, whether
00:05:15.980
who knows, whether it's machinations at Davos. So if I asked you to allocate 100 points to these two
00:05:23.200
factors, you know, 75 points go to ignorance, 25 points go to diabolical machinations. How would you
00:05:30.400
attribute those points to those two differing factors?
00:05:35.040
Maybe I'll give 25 out of 100 to both of those two, and then I leave 50 for something else.
00:05:42.240
That's overconfidence in themselves, the hubris.
00:05:47.280
They thought they were right, and because of that, they censored other people. So the fundamental
00:05:55.120
principle of science, of the Enlightenment, that we have lived with for a few hundred years
00:06:02.080
now is an open exchange of ideas and freedom of speech, and in terms of universities, academic
00:06:14.800
freedom. And that was shut down. So even if they were ignorant about public health, that's an
00:06:21.040
excusable thing, because we don't know about everything. But it's completely inexcusable to
00:06:27.680
shut down the debate. So when we came up with the Great Barrington Declaration, the NIH director at the
00:06:32.320
time, Francis Collins, he asked for a devastating takedown. What he should have done, if he was a true scientist, was
00:06:43.680
invite us and invite some people who are pro-lockdown, and we'd just have an open public discussion. That's
00:06:49.760
what he should have had. So I would put 50 percent to that, and 25 percent to each of the other two, maybe.
00:06:58.640
No, no, I understand. We're speculating. But so to your point about the 50 points, the sort of the
00:07:05.440
ego and the hubris and the inability to, you know, auto-correct in light of new evidence.
00:07:12.400
So I've been mentioning this quite often because it's a question that recently arose
00:07:18.000
on a show that I appeared on hosted by a British psychiatrist, where towards the end of our chat,
00:07:25.200
he asked me, well, you know, Professor Saad, you've been, you know, you've been a behavioral
00:07:29.440
scientist now for, you know, three decades. You've been a professor for three decades.
00:07:32.720
What is the singular phenomenon that has most surprised you about, you know, human nature?
00:07:39.920
Which I thought, on the one hand, it's a very basic question. On the other hand, it's actually
00:07:43.760
quite a profound question because I have studied a lot about human nature. And so I have to kind of
00:07:48.000
pause for a moment to hopefully offer him a good answer. And very quickly, the top answer to your
00:07:54.320
point about your 50 points factor, I said, probably the thing that surprises me most about human nature
00:08:02.640
is the inability of people to change their opinions once they are deeply anchored somewhere.
00:08:09.920
And the reason why I was so, you know, surprised by that is because, you know, in The Parasitic Mind,
00:08:17.040
I'm arguing that there is an objective epistemological way to seek truth. I mean,
00:08:22.080
beyond just the scientific method, I talk about nomological networks of cumulative evidence where
00:08:26.960
I can build you a network of evidence stemming from across species, across time periods, across
00:08:33.280
cultures. So it's like an orgiastic triangulation method that really drowns you in a tsunami of
00:08:39.200
evidence to hopefully prove my point, which is exactly how any serious scientist should think.
00:08:44.160
And yet, I also argue in that chapter that regrettably, despite the fact that I've got this
00:08:50.480
mind vaccine for you, most people are going to go, la la la, I don't want to hear it, including
00:08:55.840
Francis Collins and so on. Which leads me to think that it is a fundamental feature of the architecture
00:09:02.480
of the human mind, not to seek truth, but rather, to your point, to win arguments, which is something
00:09:08.720
that two French psychologists argued in their book on why reasoning evolved. And they specifically
00:09:15.760
said, it's got nothing to do with seeking truth. It's got to do with me winning arguments. So, basically,
00:09:25.760
Yeah, that's depressing. And I mean, if that's the case, that's the death of science.
00:09:30.160
We are then at the end of the scientific period. And I mean, we had the Greek period, and then we had
00:09:40.960
the Arab period, and then the Western European and US period, the three golden ages of scientific
00:09:47.360
discovery. But maybe that's come to an end. And then we'll have to wait a few hundred years for the next one. I don't know.
00:09:56.000
It's sort of a depressing thought, because science cannot survive if the point is to win arguments.
00:10:03.680
Science can only survive if we are sort of humble, and we're seeking the truth. And to me, as a
00:10:10.560
scientist, like when I do studies, it's really exciting because, okay, do this study. I don't
00:10:15.120
know if A is true or B is true. That's so exciting to sort of be the first one to know.
00:10:24.400
To me, that's the fascination with science, that we don't know, and we try to know. And actually,
00:10:32.880
I got my PhD at Cornell, as you know, and my thesis advisor there told me one thing. He said
00:10:39.600
that the foundation of science and scientific research is ignorance.
00:10:45.520
Yes. At first, that seems strange, because you need to know stuff, but it actually is, because
00:10:51.520
we have to be ignorant in order to search for the truth.
00:10:55.040
Yes. And I want to talk about Cornell in a second, because I think we have a lot of
00:11:00.080
intersections that are worth exploring. So all around me now, I'm in my study at home. I have
00:11:09.920
a gigantic personal library of books. To my anxiety and dismay, there are many, many of those books that
00:11:18.800
I've yet to read. And periodically, probably maybe once every week or two weeks, I start calculating,
00:11:24.560
you know, if I read one of those books, you know, per week, you know, do I have enough of a lifespan
00:11:30.880
left? God willing, I have a long life ahead that I can finish those books. So to your point,
00:11:35.520
I think it sounds like a cliche, but I think it's truly true, that the more you know, actually,
00:11:41.840
you're humbled by how little you know, right? So on the one hand, I could walk around saying,
00:11:46.160
my God, I know more than certainly 99.9% of people that are walking the earth. That's not how I look
00:11:53.520
at it. I actually compare myself to the knowledge that is out there. And then I say, I'm an idiot.
00:11:59.600
There are so many things that I don't know. And I think any serious scientist has to have that
00:12:05.200
epistemic humility. Do you think these scientists or others, do they think they have a fear to
00:12:11.680
acknowledge that they don't know something? To just say, I don't know?
00:12:15.680
I absolutely, I think it takes someone who is truly self confident in their personhood
00:12:22.080
to say, I don't know. So I'll give you a personal example to that. So, you know,
00:12:26.640
I've been in the public eye for many years now. And so there are all sorts of ways by which
00:12:32.400
I could have fumbled an answer somewhere where it got me into trouble and so on. And many people have
00:12:38.480
asked me, what is it about you that makes you so uncancellable, knock on wood, so far? And one of
00:12:45.520
the answers I give to the point of what we're talking about now is that I'm very well calibrated
00:12:50.560
about what I know and what I don't know. So that if you ask me a question about, you know,
00:12:56.800
the evolutionary roots of some phenomena that is a human universal, I've already built the requisite
00:13:03.120
nomological network of cumulative evidence. And so if I come to that debate, good luck to you.
00:13:09.280
On the other hand, you can ask me a million questions about things that I don't know too much about,
00:13:14.800
and I'll never try to wing it. I'll never try to phone it in. So I'll say, you know, Martin, that's
00:13:20.640
that's a great question that you're asking about, you know, the pros and cons of legalization of
00:13:25.120
marijuana. But I just don't know enough about it to offer you a really intelligent answer. By me doing
00:13:32.240
that, I actually build trust. So imagine you're sitting with an undergraduate class where they come
00:13:36.880
into the class thinking you're the all knowing professor. And then you tell the 21 year old who just
00:13:41.280
asked you a question. Wow, what a great question. I don't know the answer to that. Send me an email
00:13:45.440
and I'll look into it. That immediately gives you credibility, because the students are saying,
00:13:50.480
well, this guy is not haughty. He actually admits when he doesn't know something. So to your original
00:13:56.000
question, I think, you know, the ego is a very, very nasty beast. And most people are unwilling to
00:14:03.760
admit, and certainly not the Francis Collinses of the world, to say, I don't know. That's like a death
00:14:09.520
knell to their ego. That's very interesting that you say that, because that's kind of my experience,
00:14:15.600
because unlike you, I hadn't been in the public eye before the pandemic.
00:14:22.880
And so I was sort of thrown into this after the Great Barrington Declaration, doing interviews.
00:14:28.960
People will sometimes ask me a question and I would answer it. But then sometimes they ask me
00:14:32.800
something I didn't know the answer to. So I would just say, well, I don't know. And sometimes
00:14:36.960
I'll say, I don't know because I'm not really an expert in that area. Or sometimes I say,
00:14:42.080
I don't know because really nobody knows. And I realized that that actually, as you said,
00:14:47.520
that increases the confidence in the things that I do say. So I didn't do it purposely,
00:14:54.880
but by accident, I sort of just started doing that because I tried to be honest and that was a good
00:15:02.240
thing. Right. So you may or may not have heard the metaphor that I use when I
00:15:10.800
implore people to activate their inner honey badger. And here, what I'm talking about is you may or may
00:15:16.560
not know, Martin, that the honey badger has been ranked as the most ferocious animal in the animal
00:15:22.320
kingdom. The African honey badger is the size of, you know, maybe a small dog. And yet it is so
00:15:29.680
ferocious that, you know, all the animals in that neighborhood are afraid of it. I mean,
00:15:33.440
including adult lions, they don't mess with the honey badger. And so when I ask people to activate
00:15:38.120
their inner honey badger, I'm basically saying, be combative. If you have principles on
00:15:43.220
which you can stand and that you can defend, you know, don't walk away, don't be a coward. Now,
00:15:50.040
clearly, whether you knew it or not before COVID, you've certainly become a honey badger. Maybe
00:15:56.500
that's not what you meant to do, but you said, Hey, you know, I'm, I'm not going along with this.
00:16:01.940
Were there any hints that you were a honey badger earlier in your career, or is this something
00:16:08.440
that you've only discovered since the COVID world about yourself?
00:16:15.340
Well, I don't think I'm the person who sort of attacks people, but maybe I have a honey badger
00:16:21.440
when it comes to protecting the truth, yeah. So defending both the truth,
00:16:27.960
the scientific truth, which I care a lot about and also people's lives and their livelihoods and
00:16:34.560
their wellbeing. So after I got my PhD at Cornell, I actually did not go into academia immediately.
00:16:42.720
I went to work as a human rights observer in Guatemala. So basically,
00:16:49.740
there was a military dictatorship there that was killing people, disappearing people. So these brave
00:16:55.420
Guatemalans, if they had a Westerner with them as protective accompaniment, not with a gun,
00:17:03.160
but with a camera, that sort of kept them safer, because obviously these death
00:17:09.680
squads wanted to operate in the dark. So I guess that's one instance where I sort of took some
00:17:14.820
risk myself to protect others. So maybe that's my inner honey badger in that sense.
00:17:21.220
Well, there were hints of the honey badger mindset early in your career. What then led you
00:17:26.880
to leave that work and say, okay, I'm going back into academia.
00:17:31.560
Well, I was only supposed to do that for a while, and then I had to sort of start my own life.
00:17:40.100
And I love science. I love the curiosity. Basically, every child is a scientist.
00:17:47.180
They're exploring the world and doing experiments and to be a scientist, a true scientist is like
00:17:54.160
being a child your whole life. And that's the wonderful thing.
00:17:56.740
I love that you said this. Forgive me for interrupting you. I'm very, very sorry for
00:18:01.120
interrupting you. So in my latest book on happiness, I have an entire chapter that I titled
00:18:07.160
life as a playground. And in that chapter, I argue that the highest form of play is science
00:18:14.820
because it is the ultimate, excuse me, puzzle making, right? I mean, in a puzzle, you're taking
00:18:20.440
a bunch of pieces and seeing what fits with what. Well, what's science? There's a million variables floating
00:18:24.540
around and now I need to see, does this variable cause this one or is it the other way or they're
00:18:29.660
not right? So I love the science as play metaphor. I love that. Yeah. So, uh, yeah, thank you. I like
00:18:38.620
what you said. Thank you. Okay. So I want to come back to Cornell in a second, because I, in looking at
00:18:45.340
your bio, I think there might be links between your doctoral training and actually my doctoral dissertation.
00:18:52.960
So I'm going to keep you intrigued for a moment or two, but before, so before we come to Cornell,
00:18:58.340
we have to talk about what happened at Harvard. So you joined Harvard, I think in the early 2000s. Is it
00:19:04.940
2001 or 2003? When did you join? I think 2003 probably. Okay. And you were there until very
00:19:13.020
recently. Were you a tenured professor? Yeah, I was a full professor. So they
00:19:19.020
call it "with indefinite duration." That's what they call it. Okay. So indefinite duration is another
00:19:25.580
term for tenure or it's a different process because I'm trying to understand how they can get rid of you
00:19:30.660
if you have the tenure protection. Well, I don't know. So I think it's at least similar to
00:19:39.680
tenure. So, okay. So then you're there. You're obviously, I mean, I checked your Google
00:19:45.020
Scholar metrics. Certainly, you weren't a slouch. It's not as though you weren't being productive.
00:19:51.440
I think, I don't know, an h-index of 80 or something like that. So for the people who don't
00:19:56.440
know what that is, it's a pretty impressive, uh, well, not only publication record, but also in
00:20:01.640
terms of the influence of those publications. So it's certainly not because,
00:20:06.480
you know, you were dead wood that they needed to get rid of. So what happened? Walk us through it. I mean,
00:20:11.860
are you allowed to talk through the story, or is it in litigation? Am I not allowed to ask you?
00:20:16.180
I can say whatever I want. Okay. So go ahead. So tell us what happened.
00:20:20.240
So first of all, I never had any problems with my close colleagues that I worked with at Harvard.
00:20:27.400
They were all very good. And a lot of them agreed with me on the pandemic. So there were never
00:20:32.340
any personal problems with any of my colleagues. The leadership, both at Harvard's Mass General
00:20:39.860
Hospital, as well as Harvard Medical School, they were not too happy with the Great Barrington
00:20:45.500
Declaration. So they were struggling with how to deal with me. And I guess there were two
00:20:54.260
people who tried to set up a debate with me and other Harvard faculty who were in favor of lockdowns
00:20:59.360
and school closures. And I said, yes, of course, but there were no takers on the other side. So there
00:21:04.260
was no debate or discussion about this at Harvard. So one thing that we disagreed about is, uh,
00:21:10.720
immunity. So if you've had an infection, you have immunity. That's how the immune system works. And
00:21:16.740
we've known that for two and a half thousand years, since the Athenian plague in 430 BC.
00:21:23.620
And, for example, at the hospital, you have nurses. They worked at the hospital. They took care
00:21:29.600
of COVID patients. They got infected. They stayed home for a few weeks. They recovered. They go back,
00:21:34.580
taking care of more COVID patients. And then the hospital administration fired them because
00:21:38.760
they didn't take the vaccine, even though they had superior immunity compared to those
00:21:44.860
who took the vaccine. So they instituted a vaccine mandate, and I had already
00:21:53.140
had COVID. I also have a genetic immune deficiency, alpha-1 antitrypsin deficiency. So I think
00:21:58.320
it would be very unwise for me to take the vaccine, but also it was completely unscientific and unethical
00:22:04.660
because I mean, there are problems with these vaccines, but suppose it was the perfect vaccine
00:22:09.760
when there's a vaccine shortage, you shouldn't give it to people who don't need it because they
00:22:13.520
already had COVID. You should give it to like my 87 year old neighbor who hadn't gotten it yet
00:22:18.560
because there was a vaccine shortage or people in, uh, in Asia or Africa or Latin America where they
00:22:24.860
had a shortage of vaccines. So it would have been unethical for me also to take the vaccine
00:22:30.640
for that reason. So they did give exemptions to a number of faculty members, but my medical
00:22:40.260
exemption was denied. And I also did a religious exemption, which I didn't expect to be
00:22:46.440
granted, but that was also denied, but I think they should have accepted my medical exemption request.
00:22:51.700
Okay. So then that happened, but that happened a few years ago, right?
00:22:55.700
So that happened at Mass General Brigham. They fired me, and then Harvard
00:23:04.260
University put me on leave for two years. And then a few months ago, they ended the leave. So the firing
00:23:10.760
actually happened two years ago, and then a few months ago, the leave ended. And then I thought,
00:23:17.140
well, now I'm not affiliated with Harvard anymore, so I can sort of go
00:23:20.900
public with what happened. Wow. And so, again, I don't know if you're comfortable
00:23:27.000
answering this. Is there a pathway for a lawsuit that's forthcoming in the future?
00:23:34.520
Well, I haven't really decided. I mean, I'm from Sweden and we don't really do those things.
00:23:39.080
I do have another lawsuit going, which is called Missouri versus Biden,
00:23:45.460
or now Murthy versus Missouri. So I'm one of five private co-plaintiffs. So there are two states,
00:23:52.480
Missouri and Louisiana, and then we are five private co-plaintiffs who were censored
00:23:58.200
by social media at the behest of the government. So the temporary injunction was up at the
00:24:07.280
Supreme Court, actually, on Monday, the oral arguments, and they will decide probably by June sometime.
00:24:13.800
Well, now, forgive the ignorance of the legal question here. When it goes up in front of the
00:24:21.100
Supreme Court, do you actually serve as a witness, or is it only the lawyers who take on the
00:24:30.660
case that are your voice? Or do you actually address the court? No. So I wrote
00:24:38.880
written testimony under oath of what happened to me. Then the lawyers, they write the briefs,
00:24:46.900
because I don't know constitutional law. I strongly believe in the First Amendment, but I
00:24:51.480
can't write those briefs. And then there's one of the attorneys who is arguing in front of the
00:24:55.100
Supreme Court. So I don't have to argue in front of the judges, which I would be
00:25:00.280
completely incompetent at. Wow. So how long does that process take to unfold? I mean,
00:25:07.000
when will there be a bottom line to that process? So the interesting thing is that
00:25:14.100
we had the discovery and the depositions, and we deposed Anthony Fauci, for example,
00:25:19.300
and asked him many questions for seven hours. And during those seven hours, I think there
00:25:25.160
were 170 times that his answer was, I don't remember. So he didn't remember much of what happened during
00:25:29.480
the pandemic, poor fellow. But the discovery and depositions were actually quite
00:25:36.320
useful, because that's what unearthed the magnitude of the federal government's
00:25:42.280
involvement in the censoring by the social media companies. And then later on, we also got the
00:25:46.580
Twitter Files, which added to that. And of course, that was more public. And we won. So we then
00:25:53.380
filed a request for a temporary injunction, which means while this goes on in the court,
00:25:57.560
it will take several years. We don't want the government to coerce social media companies.
00:26:03.580
They should stop now. So we won that case in the district court in Louisiana. They issued
00:26:11.040
their verdict on July 4th last year. I'm not quite sure why they picked July 4th, but I
00:26:17.260
don't think it was an accident. And then, early fall or late summer, the federal government,
00:26:25.600
they appealed it. So we won that case in the Fifth Circuit Court of Appeals. It was unanimous among the
00:26:31.720
judges. And then the federal government appealed it to the Supreme court. And the name was then
00:26:38.180
changed from Missouri versus Biden to Murthy versus Missouri. And that's where the oral arguments
00:26:45.000
were heard just a while ago. Why is it Missouri and Louisiana? What's the link
00:26:51.380
with those two states? Why those two versus others? How does that work? It was the attorneys
00:26:56.700
general of those two states that filed it, on behalf of their citizens.
00:27:03.620
When you violate the First Amendment, you violate speech. So when I'm
00:27:11.480
censored, okay, I'm a victim, but actually other people are more victims, because they don't get
00:27:17.620
to hear all the information that they could hear. So I want to hear what you have. I want to hear
00:27:22.540
what you have to say. So if they censor you, well, that's bad for you, but it's actually bad for me
00:27:26.700
also, because I can't hear what you're saying. So by them censoring you in Boston, you're not
00:27:32.180
in Missouri, you're not in Louisiana, but there is some prospective audience member in Louisiana or
00:27:38.260
Missouri that could have heard the voice of Martin that was suppressed. And therefore I can come in as
00:27:44.320
the attorney general of those states and sue on their behalf. Correct. So they are suing on behalf
00:27:49.560
of the citizens, but then we are five co-plaintiffs who were censored, including one from Louisiana
00:27:55.800
and Jill Hines. But it's myself, it's Dr. Aaron Kheriaty, who was fired by UC Irvine, and then Dr.
00:28:04.120
Jay Bhattacharya at Stanford, and then one journalist, Jim Hoft. So those are the five private plaintiffs.
00:28:08.960
Wow. Well, you know, it gives me great pride. So Kheriaty and Jay have become
00:28:16.760
good friends. They actually served as blurbers, endorsers on my last book on happiness. And so I am
00:28:24.560
in great company here. Speaking of suppression, of course, on social media platforms, you know,
00:28:31.320
over the past while, Elon Musk and I have developed a very nice bromance. And
00:28:38.240
I've often said that, uh, of all of Elon's, you know, astonishing achievements in his life,
00:28:46.260
in my view, none will be historically as important as him having bought Twitter. Do you agree with
00:28:53.920
that? Or do you think it's hyperbolic on my part? Uh, no, I agree with that. Right. I mean,
00:28:59.540
because some people said, really, do you think that's the bigger one? You don't think Mars? You
00:29:02.700
don't think, you know, saving the climate with Tesla? I said, no, nothing.
00:29:08.460
I mean, you know, it sounds like a cliche at this point, but nothing is as important as
00:29:12.580
freedom of speech. And certainly as someone who comes from the Middle East, I can really
00:29:15.740
appreciate that. Okay. Let's switch. I think that's very true. And during the pandemic, I
00:29:21.420
was kind of in an interesting position, because in the US, I was on the fringe, according to
00:29:25.780
Francis Collins. I was a fringe epidemiologist, and there had to be a
00:29:29.540
devastating takedown of me and my colleagues. But I'm a native of Sweden, and
00:29:34.860
Sweden was the only major Western country who did not lock down, kept schools open and
00:29:40.300
so on. So there I was actually in the mainstream. And Sweden was
00:29:46.380
heavily criticized internationally, but also by a group of 22 scientists. Only one
00:29:52.720
of them was an infectious disease immunologist, but there was also, like, a mathematician, an
00:29:55.900
oncologist, a climate scientist, and so on, but they were very harshly pushing against
00:30:01.400
the Swedish system. So I was writing, uh, in support of what Sweden was doing, against
00:30:07.840
them. Now, I didn't like what they said; I was against what they said. But not
00:30:15.420
only do I think that they should be able to say it, I'm really happy that they did. Right. Because since
00:30:22.320
they were thinking these things, there were obviously other people in Sweden who were thinking
00:30:25.880
similarly or asking questions. So by actually taking the effort and time to write it down
00:30:31.780
and send it to the major newspapers, they sort of voiced those concerns that were there.
00:30:37.500
And then I and others had an opportunity to respond to them in the newspapers. So I completely
00:30:45.540
disagree with them, but I'm very, very grateful that they actually spoke up and that we could
00:30:50.500
have that discussion and debate because it allowed me and others to sort of counter it and explain
00:30:56.840
why Sweden was doing it the right way. So I'd like to thank them for doing that. And I think that's
00:31:04.160
the attitude we should have. If somebody has a different viewpoint on whatever, we should thank
00:31:09.700
them for expressing it so that we can have that discussion and debate.
00:31:12.640
Has anyone who was vehemently against your position... So, to my earlier point that the most surprising
00:31:23.000
thing about human nature is people not being willing to, you know, revise their
00:31:28.300
positions in light of new incoming evidence: has anyone who was very much your detractor come back to you
00:31:36.760
privately or publicly and said, dear Martin, boy, you were right, apologies? Or is the orgiastic hubris still undefeated?
00:31:49.320
Among the prominent advocates for lockdowns? No, no, nobody has done that. Amazing.
00:31:55.240
Among other people, there are people who, sort of privately or within their own group, were in
00:32:00.720
favor of lockdowns, and they have acknowledged, well, I was wrong
00:32:05.540
and you were right. So there's been quite a few of those, and some did it privately and some actually
00:32:11.800
did it publicly. So, I mean, that's very honest, and I'm glad they did that.
00:32:16.880
But, but that speaks to the fact that it is very much ego driven because the more prominent I am,
00:32:22.800
the more public I was, the more the Francis Collins and Fauci I am, the less likely I am to
00:32:30.060
deviate from my original position because of the hubris, because of the ego. So that I think
00:32:35.300
perfectly fits with what we're talking about. So I think I've heard, sort of through the grapevine,
00:32:40.040
that Francis Collins regretted calling us fringe epidemiologists, uh, but he hasn't reached out
00:32:45.280
personally. And I also know that he publicly at one point stated only a few months ago saying that,
00:32:51.080
well, we made a mistake by only focusing on COVID. We should have looked at the collateral damage
00:32:55.840
to other public health. So he sort of has acknowledged that he didn't get everything
00:32:59.920
right. Uh, but there's been no personal contact. And
00:33:08.860
I think that in the back of their minds, they realized that things didn't go very well.
00:33:16.420
So that that's a perfect segue. I mean, both we'll talk about Cornell, but let's talk about
00:33:21.440
operations research, which, uh, because you said, Oh, now we realized that we shouldn't have
00:33:26.620
only focused on deaths and COVID, which is a very nice way to set up the idea of when you are trying
00:33:33.980
to maximize or minimize an objective function, you could be trying to maximize or minimize a singular
00:33:41.020
variable, or you understand that there are multiple variables that need to be minimized or
00:33:46.560
maximized subject to certain constraints. So I think for any intelligent person, that should be
00:33:52.020
obvious, but I think if you have been trained in the framework of operations research, where that is a
00:33:58.640
fundamental framework by which you set up problems, it makes you less likely to have succumbed to the
00:34:06.220
singular variable myopic focus. Do you think that that's true? Do you think it's just,
00:34:12.240
in other words, do you think that your training as an operations research specialist inoculated you
00:34:19.020
against a single variable bias? Yes. Operations research is how you use mathematics to optimize
00:34:25.380
various functions of many variables. I had never thought of that, but I think that's very
00:34:30.180
interesting, and I have to agree with you that that might've been quite helpful. I mean,
00:34:35.140
I always thought of it in terms of the foundations of public health: you don't just
00:34:40.320
optimize one thing, you have to look at it all. So I always thought of it as that, but I think
00:34:46.100
that's quite interesting. And maybe that was very useful education for the pandemic, back in the
00:34:53.040
80s at Cornell. So yeah. Well, I mean, I like that analogy. Yeah. Oh, thank you. Look,
00:35:02.760
I mean, there were two things that I was really interested in in my life. You may not know my
00:35:06.960
personal history, so I'll mention it for you and for some of the viewers who may not know. I was a very
00:35:11.280
good soccer player, and so I wanted to continue in my soccer career. And always, from childhood,
00:35:17.200
I said, I want to become a professor. I wasn't sure in exactly what, but the thought of,
00:35:23.020
you know, professors seemed like superheroes to me, and that's what I wanted to do. So I always knew
00:35:27.540
that that's what I was going to become. And so as I entered my undergraduate, uh, I thought, okay,
00:35:34.780
well, if I'm going to spend my entire career solving problems (I didn't know what field it was
00:35:41.280
going to be in; there were several possibilities), what is the discipline that I should study in my
00:35:48.100
undergrad as the foundation for anything else, irrespective of whether I go on in those
00:35:53.480
fields or not? And I decided to do an undergrad in mathematics and computer
00:35:59.240
science. And then even all the way up: I did an MBA, and then I went to
00:36:05.020
Cornell to do my MS and PhD. But in my MBA, I did a mini thesis in operations research on something
00:36:10.740
called the two-dimensional cutting stock problem. When I got to Cornell, one of my doctoral
00:36:15.400
committee members, I wonder if you know him, was a very famous statistician at Cornell named Ali Hadi.
00:36:21.180
Do you know him? He wasn't in the department where I was, but I know who he is. Yeah.
00:36:26.740
Right. Okay. And, and I can't imagine that I would have gotten the same cerebral training,
00:36:35.940
right? I mean, it has been a very long time since I solved a differential equation,
00:36:40.720
and it's been a very long time since I did a triple integral. Exactly. But that training,
00:36:48.680
right? I mean, when, when you're training as a soccer player, you also do sit-ups. Well,
00:36:53.140
it's not because your abs are going to kick the ball. It's because the body has to be fully
00:36:58.520
in shape. And therefore having that training, whether it be the OR part or, more generally, the
00:37:04.960
super technical analytic part, has to have helped me, irrespective of what my innate intelligence
00:37:12.860
was. And so when I now tell my children (they're still teenagers), I just
00:37:19.140
keep saying math, math, math, math, math, math all day long. It doesn't matter what you become later.
00:37:24.420
Some of the most famous academic psychologists all have the exact same background. They all did
00:37:29.300
undergrads in mathematics. Amos Tversky, Daniel Kahneman, all those Nobel prize winners all had
00:37:35.100
math degrees. So having said all that, and obviously you have a PhD in an applied
00:37:40.200
mathematics field: do you think I'm overselling the math angle, or is this good
00:37:44.840
advice to give to our youngsters? No, I think you're underselling it, because all the things that you
00:37:49.840
said is true, but there's also one other thing. Math is lots of fun. It's a fantastic thing to do.
00:37:55.640
Yeah. Yeah. Okay. Let's talk about Cornell. So, I'm trying to think if
00:38:03.520
we overlap. I don't think so. You were there from when to when? From 84 to 88. Ooh, I just missed
00:38:10.960
you. I got there in 1990, and I finished everything in 94. Now here's where, remember earlier
00:38:18.640
I said that there might be some overlap, even in some of the technical stuff that we did, or maybe not.
00:38:24.540
I saw, I think in your, either in your dissertation or in some of the publications post your
00:38:30.860
dissertation, the term sequential sampling and so on. Was that right? Yeah,
00:38:37.280
I've done work on sequential analysis, which is a branch of mathematics that is
00:38:43.660
actually used a lot in medicine. So it's a very practical area of mathematics and
00:38:53.120
statistics. Very, very important in the health sciences. The reason why I'm smiling
00:38:58.620
because now, now my inner geek is coming out because my doctoral dissertation, I think you will
00:39:04.680
know the specific individual, maybe some of our listeners won't,
00:39:09.160
so we can explain it to them. I'm assuming you're familiar with Abraham Wald? Oh yeah.
00:39:14.960
He laid the foundation for sequential analysis. Oh my God. Okay. Geek alert, everyone. We're
00:39:21.300
about to enter into a geek zone. So let me explain to you, um, Martin, what my doctoral dissertation
00:39:28.240
was on, and then we'll see where that takes us. So my doctoral dissertation was on
00:39:36.160
binary sequential choice, where the genesis of the framework comes from Abraham Wald
00:39:43.600
with his sequential analysis, right? The sample size itself can be a random variable: you
00:39:51.560
reach a threshold of evidence where you no longer need to sample additional information.
00:39:57.080
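The stopping rule being described here is Wald's sequential probability ratio test (SPRT). As a minimal illustrative sketch (the two hypotheses, a success probability of 0.3 versus 0.7, and the error rates below are hypothetical choices, not anything from the conversation):

```python
import math
import random

def sprt_bernoulli(observations, p0=0.3, p1=0.7, alpha=0.05, beta=0.05):
    """Wald's SPRT: accumulate the log-likelihood ratio over a stream of
    0/1 observations and stop as soon as it crosses either threshold."""
    upper = math.log((1 - beta) / alpha)   # accept H1 (p = p1) above this
    lower = math.log(beta / (1 - alpha))   # accept H0 (p = p0) below this
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        # log-likelihood ratio contribution of a single observation
        if x == 1:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "no decision", len(observations)

random.seed(0)
# Data truly generated with p = 0.7; the test typically stops long
# before all 1000 observations have been examined.
data = [1 if random.random() < 0.7 else 0 for _ in range(1000)]
decision, n_used = sprt_bernoulli(data)
print(decision, n_used)
```

The key property is exactly the one mentioned above: the sample size is itself random. With clear-cut data the test often stops after a handful of observations, whereas a classical fixed-sample test would examine all of them.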
It's used in statistical decision theory: how much evidence do I need to acquire
00:40:03.380
to decide whether a manufacturing system is defective or not? So I took all that stuff,
00:40:10.140
right? Again, I'm not explaining this to you; I'm explaining it
00:40:15.500
to our audience, and then you can fill in. And so I took this idea that in a lot of contexts where
00:40:22.220
decisions are being made, you're trying to reach a stopping threshold that allows you to say,
00:40:28.780
I have now seen enough evidence to stop acquiring additional information. And so I took that framework
00:40:35.280
and I did a doctoral dissertation demonstrating how, whenever you're choosing between two final
00:40:42.760
alternatives, it could be choosing between two political candidates, between two prospective
00:40:48.400
suitors to marry, between two cars to purchase. It doesn't matter. The process is one where it's a
00:40:55.800
race to either reach the stopping threshold of alternative A or B. And once you reach that
00:41:03.080
threshold, you stop. And that goes against the classical economic paradigm, which basically said
00:41:08.120
you should have looked at all of the available and relevant information. Whereas a stopping threshold
00:41:13.580
model says, no, you only look until you hit that threshold. How does that fit within the stuff
00:41:19.840
that you did? Does it resonate with you? No, that's very similar to work I've done
00:41:25.540
in the medical field. I figured. And Cornell has a very famous professor in sequential analysis called Bruce
00:41:30.760
Turnbull. And so what he and others have been doing a lot is developing this for clinical trials. You have
00:41:38.980
randomized trials where you look at whether a drug should be approved or not, or a vaccine
00:41:45.960
or a particular treatment; it could be a surgery. So you're randomizing some patients to A
00:41:51.140
and some patients to B, but you maybe have decided, okay, I'm going to recruit a thousand people in
00:41:56.040
this group. But you don't want to just do that, because you want to do interim analyses.
00:42:01.440
So after you have 100 in each group, you might look at the data. And if it's very clear that this
00:42:07.120
drug saves lives, you want to stop it and give everybody the drug. Or if it's very clear that
00:42:13.220
maybe this drug has some adverse reaction and people are getting heart attacks and dying,
00:42:17.720
you also want to stop it, because it's not a safe drug. So you do these interim analyses
00:42:23.120
continuously during the trial, and sometimes you stop it early. And one of the things that
00:42:30.640
I have done, working with the CDC and others, is to use this concept in drug and vaccine
00:42:40.560
safety. The CDC has a program called the Vaccine Safety Datalink. I helped develop the
00:42:47.480
statistical and epidemiological methods for how to monitor the safety of vaccines. So
00:42:52.800
after a vaccine has been approved, you still want to monitor safety, because there could be some rare
00:42:57.920
but serious adverse reactions that didn't show up. So we got weekly data from
00:43:07.560
insurance, from health plans. So we know all the kids who got the vaccine, and then we see:
00:43:12.780
what do they have? Do they have a seizure on any days after it? And then we monitor every week. We get
00:43:18.380
more data, we monitor it. So if there's a problem, we can detect it as soon as possible. And we use
00:43:24.180
these methods to sort of say, okay, we have to stop now, because there's an alert that something might
00:43:30.220
be wrong. Now, that could be that the vaccine is bad; it could also be that there were some data errors,
00:43:36.480
because statistics cannot distinguish between those two. But okay, we have to look
00:43:41.000
into it now. And for example, we use this method when there was a new, uh, vaccine against measles.
00:43:47.560
So there used to be a measles, mumps, rubella vaccine, and then kids got a separate shot for
00:43:52.340
varicella, chickenpox. And Merck decided to have a combined one that combined all four in one needle
00:43:58.860
because kids don't like needles. So they get one needle instead of two. Uh, but it turns out that
00:44:04.800
this new vaccine, and when you gave it to one year old, there was an excess risk of febrile seizures
00:44:10.100
seven to 10 days after the vaccine. And it's not a life threatening thing, but you want to avoid it.
00:44:15.540
So after 25,000 doses, we found this alert, and soon after, the CDC
00:44:24.620
changed the recommendations to not recommend this combined vaccine for one-year-olds. So that's a very practical
00:44:28.920
example where public health benefits from what Abraham Wald started, I guess,
00:44:38.460
70 years ago now. Yeah, the 1940s. Yeah, 80 years ago. Yeah.
00:44:45.320
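The weekly vaccine-safety monitoring described above can be illustrated with a toy sequential surveillance loop. This is a simplified sketch, not the actual Vaccine Safety Datalink methodology; the weekly counts, the expected rate, the relative risk under the alternative, and the signaling threshold are all hypothetical numbers chosen for illustration:

```python
import math

def monitor_weekly(counts, expected, rr=2.0, threshold=3.0):
    """Toy sequential surveillance: each week, add the Poisson
    log-likelihood ratio comparing an elevated-risk alternative
    (relative risk rr) against the null of no excess risk, and
    signal as soon as the cumulative LLR crosses the threshold."""
    llr = 0.0
    for week, (obs, exp) in enumerate(zip(counts, expected), start=1):
        # Poisson LLR for mean rr*exp vs mean exp: obs*log(rr) - exp*(rr - 1)
        llr += obs * math.log(rr) - exp * (rr - 1)
        if llr >= threshold:
            return week  # signal: look into the data now
    return None  # no signal during the monitoring period

# Hypothetical numbers: about 2 expected adverse events per week,
# with observed counts drifting above expectation.
expected = [2.0] * 10
observed = [2, 3, 2, 5, 4, 6, 5, 4, 6, 5]
print(monitor_weekly(observed, expected))
```

In practice the threshold would be calibrated to control the overall chance of a false alert across the whole surveillance period (Kulldorff's actual approach, the maximized SPRT, handles this formally); here it is just a fixed constant to show the shape of the weekly monitoring loop.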
Well, you know, it's going to be another jump here from sequential analysis to, I mean, I don't know
00:44:52.180
if he was directly your colleague, but E.O. Wilson, who was a Harvard professor. I've often,
00:44:57.440
did you ever meet E.O. Wilson, by the way? I don't think I met him, no. I was just a simple
00:45:03.020
graduate student when I was at Cornell, so no, I didn't meet him.
00:45:07.900
But he was a Harvard biology professor. Oh, okay. Sorry. No, I haven't met him. Yeah. E.O. Wilson is a
00:45:13.780
Harvard biologist who actually was injected into the culture wars, you know, when we were, you know,
00:45:20.660
young children, because he, he is someone who introduced the ideas of sociobiology to study
00:45:28.080
human affairs. And of course, as you probably know, Martin, most social scientists till today
00:45:33.020
refuse to accept that human beings are biological creatures shaped by evolutionary forces, just like
00:45:39.640
any other species. Much of my scientific career has been spent trying to incorporate evolutionary
00:45:44.360
psychology in studying economic decision-making, consumer decision-making. The reason why I'm
00:45:48.880
raising E.O. Wilson is because he wrote a book, which maybe my viewers are tired of me promoting it,
00:45:55.480
but he wrote a book in the late nineties called Consilience. And consilience is a beautiful word,
00:46:02.120
which refers to unity of knowledge. So for example, you know, physics is more consilient than
00:46:08.560
sociology, not because physicists are inherently smarter than sociologists, but they have an
00:46:14.000
organized, coherent tree of knowledge. And consilience is also about building bridges across
00:46:19.940
disciplines. So in E.O. Wilson's case, he was talking about how you could use evolutionary theory
00:46:24.700
to understand phenomena in the humanities and the social sciences and the natural sciences.
00:46:30.260
And the reason I'm mentioning all this is because my mind works at that synthetic level. I'm always
00:46:35.620
trying to draw links between pieces where you wouldn't expect there to be a link. And so coming
00:46:42.940
back to our discussion about sequential sampling, one of the things that has always interested me,
00:46:48.500
I mean, I did write about it in several of my academic papers and so on, but I've thought about
00:46:53.580
one day, maybe this is something we could work on together. I thought about writing one of those big
00:46:57.880
meta papers demonstrating that that sequential sampling process applies across a bewildering
00:47:07.020
number of areas that people typically wouldn't associate, right? So you hit me now with the
00:47:11.360
medical trial example. I can show you how men and women use it to stop acquiring additional information
00:47:18.500
and choose a mate or reject a mate. Now that's a very, very different process than the medical sampling
00:47:24.840
that you're talking about, but the fundamental stopping threshold is exactly the same. So what
00:47:31.100
do you think about that? Do you think that it would be a cool paper, you know, one of those behavioral and
00:47:35.280
brain sciences paper, those meta target papers where you demonstrate the ubiquity of sequential sampling
00:47:41.740
across a bewildering number of human affairs? I think that's fascinating. And I have been thinking mostly
00:47:48.920
in the health field, but you're right, those are everyday decisions. Well, it's not an everyday decision
00:47:54.120
to choose a spouse, but sort of common decisions. And in some cases, like a clinical trial, we put
00:48:00.440
mathematical numbers on it. But even if we don't do that with other things, the principles are the
00:48:05.560
same. Exactly. Exactly. So I think that's fascinating to sort of draw those parallels. And by the way,
00:48:12.100
that's what good scientists do. Exactly. I've always thought I would love to be able to have a machine
00:48:19.640
where I could put a scientist in and his consilience score comes out, right? Because in the same way
00:48:26.380
that we now, okay, we can debate whether IQ captures intelligence and so on. But I mean, just prima
00:48:26.380
facie, of course there are innate differences across people in terms of intelligence; whether IQ is the
00:48:38.600
best measure or not, we can debate. I think it has done very well. So in the same way that we have an
00:48:43.560
IQ score, I would love to calculate a consilience score. And I'm willing to bet, as an open empirical
00:48:51.180
question, that a higher consilience score would correlate with higher IQ. Because you wouldn't be
00:48:59.900
able to draw those disparate links between these areas that most people had not thought about if you
00:49:06.320
don't have that higher meta intelligence. Does that make sense to you?
00:49:12.560
Yeah, but maybe there are different types of intelligence. Okay, fair enough. Because I think I
00:49:18.420
am reasonably intelligent on certain things. But there are also some parts of life that I'm certainly
00:49:25.220
quite stupid at. Which reminds me of a joke. So although I'm, I think, pretty technologically
00:49:33.840
savvy, my brain shuts down when I have to use certain apps on my iPhone, like the
00:49:40.160
parking app. So my wife will look at me and she says, how is it that you can be so fancy
00:49:46.200
in all your professorial career, but when it comes to just entering the app, which an average seven
00:49:51.460
year old could do, you suddenly turn into a complete imbecile? I go, those are the
00:49:56.080
mysteries of life, sweetie. Well, I'm smart in the way that when I buy a phone, I always get the same
00:50:02.740
brand as my son, because that way I can ask him how it works. So my smartest move is the choice of what
00:50:10.800
phone to buy, and then I rely on him. Gotcha. All right, let's talk about: what is your
00:50:17.680
future trajectory. So now you're out of Harvard. Do you have any interest in re-entering academia
00:50:24.880
or is it in your rear view mirror and we're moving on to bigger and better things?
00:50:30.780
Well, I'm still doing science. I have colleagues that enjoy working with me.
00:50:35.200
I have never accepted money from the pharmaceutical industry, and I will never do that. So
00:50:39.760
that door is closed. So you're not going to industry? Okay. No, I'm not going to do that.
00:50:43.740
But I work with colleagues who are grant funded and who like to work with me.
00:50:51.240
I'm also working with Scott Atlas and Jay Bhattacharya. I mean, the pandemic is now over
00:50:57.960
and COVID is endemic. It will be with us forever, but the pandemic is over. So that work is sort of
00:51:03.340
done. But academia is in trouble, and I don't know if it will survive or not.
00:51:11.940
I mean, there will always be universities and so on, but if we don't have academic freedom and
00:51:16.920
freedom of speech, science is not going to survive. Or rather, the science will
00:51:22.340
always be there, but the scientific community and the scientific process will slowly
00:51:28.460
crumble and die. So I think it's important for us to try to put the scientific enterprise
00:51:37.820
back on its track again. I think the problem with how funding is done is that it's far too
00:51:41.980
centralized. One of the problems during the pandemic was that Francis Collins sat
00:51:49.700
on the biggest pile of research money in the world. And, uh, Anthony Fauci sat on the biggest
00:51:54.420
pile of infectious disease research money in the world. People did not want to contradict what
00:51:59.280
they said, because it would put their livelihood at stake. So you could have a more decentralized
00:52:03.680
system: instead of one NIAID, we have six, and if one is run by Fauci, we have at least five
00:52:11.360
others that are operating. We also need to change how the scientific publishing process is done.
00:52:16.480
So in a sense, if I can analogize to how, you know, there are three branches of government
00:52:23.840
in the U S that serve as checks and balances against each other, in a sense, you're arguing
00:52:28.760
for something similar where you can't have a singular judge and executioner that sits on
00:52:33.920
the pile of money. And if he's not happy with you, you're basically out of luck. That's, that's
00:52:38.440
basically what you're saying. Yeah. As Sunetra Gupta has sort of
00:52:43.480
said, we have a cartel system, not in science as a whole, but each little field of science
00:52:48.040
is sort of a cartel with the leading people controlling the money and the publications
00:52:52.440
and so on. And I think we saw during the pandemic, because there was some very important questions
00:52:57.920
that we had, you know: do masks work or not? How does transmission work, do children transmit?
00:53:02.760
Which is better, infection-acquired immunity or vaccine-acquired immunity?
00:53:09.640
How long does the efficacy of the vaccine last? And those who came up with the
00:53:17.320
first information about these critical questions were not the scientific powerhouses of the United States
00:53:22.760
and the UK; it came from places like Denmark, Qatar, Israel, Sweden, Iceland. There were excellent
00:53:31.800
papers from the pandemic coming from these countries. And these are countries with very good,
00:53:37.080
high-quality scientists, but they are small and they have their own sort of
00:53:41.400
funding mechanisms. They are still in the same publication realm, so they are not completely
00:53:46.040
isolated from this cartel system, but they are more independent. They are more on the fringe.
00:53:51.640
So I think that's the reason why a lot of the good scientific papers came from these
00:53:59.400
fringe countries, so to speak. They are fringe in the academic sense:
00:54:05.480
they're not part of the US-UK powerhouses. Interesting. So, the answer that you
00:54:12.200
gave, am I to read that, you know, one of your next big projects is to contribute to the amelioration
00:54:20.840
of the university ecosystem? Is that, is that where you're going to throw your hat into the ring?
00:54:25.960
Yeah. We want to decentralize how academia operates, both in how the funding is
00:54:32.040
awarded and with scientific journals. We now have a few dominating scientific journals, and they failed
00:54:37.560
us during the pandemic. Whether it was Science or the New England Journal of Medicine or the British Medical Journal,
00:54:43.800
they did not perform well during the pandemic. So we need a new system for scientific
00:54:50.040
publications. And I don't know if it's going to work, but we have to try to put
00:54:57.880
the academic ship back on the right track again. But, to build on what you said,
00:55:02.600
you largely focused on the funding element, which of course is an
00:55:07.720
important element, but I think there's an even more fundamental problem, and you'll tell me if you
00:55:13.000
agree. So one of the things that I've noticed, and I mean, I'm a bit younger than you, but let's
00:55:18.760
let's call us both at least 30 plus years as professors. What I've noticed to my great chagrin and
00:55:25.560
disappointment is that many academics are actually not intellectuals in the true sense of the term.
00:55:32.920
In other words, they are professional academics, they're careerists. So that to our earlier point
00:55:38.920
of science as play, the true scientist, irrespective of which discipline he's in, he really is a child.
00:55:47.240
He's, he's looking, as you said, he's excited about the data. He, you know, right. I mean, I can sit
00:55:52.200
and I mean, one of the reasons why my show has resonated so well with so many people
00:55:56.280
is because, you know, I'm able to sit down with a biostatistician and epidemiologist and
00:56:01.560
hopefully have a good conversation, but I could sit down with a classicist and have a good
00:56:05.720
conversation. And I could sit down with a cosmologist and have a good conversation.
00:56:09.400
Because again, to our earlier conversation point, I have a synthetic mind that allows me
00:56:14.280
to navigate. I'm never going to know more about epidemiology than Martin, but I know enough
00:56:20.520
about it that I can extract a lot of good stuff and we can have a conversation. And that
00:56:25.960
comes from the fact that I am a full blown orgiastic intellectual from A to Z. That's what I live all
00:56:33.320
day long, whether I'm in my thoughts or I'm reading. You're curious and you're open-minded.
00:56:38.680
I'm curious. Exactly. So, on ameliorating the university, I'm going to come to
00:56:45.720
that point. It seems as though the selection process by which we pick academics does well on many
00:56:53.080
dimensions. Yes, they're smart, they got good grades, and they got into a
00:56:57.560
good graduate program. But it would be nice if we could pick them in such a way that we truly know that
00:57:04.440
they are intellectually pure in their curiosity and their open-mindedness, that they're true honey badgers.
00:57:10.280
I want my academics to be intellectual Navy SEALs, right? The Navy SEALs, we pick them on the fact
00:57:16.440
that they're courageous and heroic and they will physically fight for us. I want my academic to do
00:57:21.960
exactly that within the intellectual realm. But the reality, as I often say, and forgive me if those
00:57:27.800
words offend you, most academics are part of a novel species, which I call the invertebrate
00:57:33.480
castrati. Not only do they not have a spine, they don't have testicles. So when you end up with
00:57:39.960
a species called academia, where most people are spineless and are testicle-less, uh, that doesn't
00:57:48.760
result in the openness and the debate and the confidence because everybody is worried about
00:57:54.840
their little myopic career thing. Everybody's cowardly. Does that resonate with you? Is that
00:57:59.640
what you've also picked up in your long academic career? Well, I think what you're saying is
00:58:05.080
unfortunately true. I mean, I have colleagues who have that curiosity, but I also see
00:58:12.840
scientists for whom, I think, it's a career. They worked hard in school, they got good
00:58:18.200
grades, and they know how to play the game, not rock the boat, and sort
00:58:26.920
of climb up. And I think that's especially true among people who go into administrative
00:58:31.240
positions. Exactly. Among the rank and file scientists, I think maybe it's half and half. I don't know
00:58:37.240
exactly, but there are certainly both types. But we also have to change the promotion criteria
00:58:42.840
and so on. It shouldn't be about publishing as many papers as you can in the fancy journals.
00:58:48.600
It should be about new ideas, going out on a limb, and trying new things. So we also have to
00:58:55.640
change how promotions work and what we value in scientists.
00:59:01.960
And I'd like to say that, I mean, I, I appreciate your pessimism about academia in general, which of
00:59:07.720
course I've been speaking about for many years, but I like to hopefully always also have some hope,
00:59:13.240
some optimism. I do think that there are ways by which many of these perks
00:59:18.360
and reward mechanisms could be changed rather rapidly. And some of these, I've actually seen
00:59:23.000
them within the lifespan of my own career. So for example, when I first started going out
00:59:29.480
into the public, right, I'm probably the first professor to set up a long form, you know,
00:59:35.480
podcast or certainly one of the first, it's now been 10 years or more, you know, I was probably the
00:59:40.600
first or one of the first guys to go on Joe Rogan and so on. Well, all of these firsts,
00:59:47.720
all my academic colleagues, including the suits, looked down on. You know, I tell the story in The Parasitic Mind,
00:59:53.480
in this book, about how, when I was invited to speak at the Stanford business school, my host at
00:59:58.840
Stanford, you know, looked down on the fact that I would talk to people such as Joe Rogan. And I thought
01:00:04.440
that was, you know, baffling; not only elitist, but suboptimal. What do you mean? You think
01:00:10.840
it's not a good idea to talk about ideas in front of 20 million people? You don't think that that's
01:00:16.040
worthwhile? No, no, no, it's a lot more worthwhile to publish in a peer-reviewed journal that's going
01:00:20.920
to be read by seven people, including your mom and girlfriend. Right. But now a lot of
01:00:26.840
the people who looked down on, you know, me becoming famous in the public eye and so on
01:00:32.920
will now write to me and say, Oh, come to our famous university and teach us how to become popular
01:00:38.920
like you are. So within a few short years, within 10 or 15 years, many of the people who used
01:00:45.480
to look down on this public engagement and public outreach are now saying, no, wait a minute. And
01:00:51.160
you even see it in some of the metrics, where some of those portals will tell you how
01:00:57.160
often this academic paper has been cited and discussed on Twitter. So people are waking up to these
01:01:04.360
things. So I'd like to think that, yes, there are some endemic and infrastructural
01:01:10.920
problems in academia, but hopefully, with a lot of hard work, we can change it. Are you as optimistic
01:01:16.280
as I am? No, I'm not quite as optimistic, but I'm happy that you are, and I hope
01:01:23.320
you're right. I think we have to try, whether it actually works or not. But there are
01:01:29.880
positive things. I think it's always been one of the
01:01:35.960
roles of scientists to actually engage with the public. That's obvious in public health, but I
01:01:41.400
think it's true for all sciences. So that should be one of our roles, and some people might do
01:01:48.120
it more than others, but that's fine. But certainly that should be one of our roles. And I
01:01:54.360
think one thing that happened during the pandemic is that people saw that they couldn't rely on the
01:01:58.760
scientists, because they knew that something was wrong. They thought, well, why should I have
01:02:05.080
the vaccine if I already have immunity? I learned about immunity a long time ago. So they knew that
01:02:10.600
something was wrong. So a lot of regular people started to engage at a very high level with the
01:02:17.720
science. Yeah. And sometimes on Twitter, you will see a super-smart non-scientist
01:02:25.480
putting holes in the arguments of scientists. And of course, some of the scientists got very
01:02:32.360
uncomfortable about that. Yes. And how dare the great unwashed deal with us in the ivory tower.
01:02:42.440
So, I know you're from Canada, and I have a lot of respect for you as a scientist,
01:02:47.400
but I don't have the same respect for all scientists. And I think in Canada, I would actually say that the
01:02:52.120
truckers are probably smarter than the scientists. I think I fully endorse that statement.
01:03:00.280
Yeah. Wow. All right. So you mentioned
01:03:05.480
earlier Scott Atlas, whom I've also gotten to know very well, and Jay. Is this going to be through the
01:03:10.680
auspices you're talking about, through the Global Liberty Institute, or is this through the Stanford
01:03:16.600
Hoover Institution, or what? Is there a mechanism by which you guys are trying to do this?
01:03:20.440
Yeah. So we're collaborating with Hillsdale in Michigan,
01:03:25.000
with the Academy for Science and Freedom. Yeah, the Liberty Institute is something else
01:03:29.320
that Scott is doing. That's a great thing. Oh, so that's sort of separate.
01:03:33.720
Oh, so do you have some kind of position with Hillsdale? The reason I'm asking
01:03:39.960
is because I was invited to speak at one of their functions in Naples, Florida, in
01:03:46.200
February 2022, where I got to meet the president and so on. And it just blew
01:03:52.360
me away; their whole approach to things was just unbelievable. So maybe you could
01:03:57.080
tell us a bit about your Hillsdale connection. Well, one thing that happened was that
01:04:02.520
Hillsdale was one of the few colleges that didn't close down.
01:04:10.360
So that was a good thing; they sort of stood against the wind, and the wind was blowing pretty hard.
01:04:15.640
And of course they never had any vaccine mandates either. So they got it right.
01:04:21.160
they're not a research university; they're a liberal arts college. Sure. They have students
01:04:26.600
and faculty who are super smart and who sort of like to do things in different ways.
01:04:34.200
So we're trying to set this up. I don't have a faculty appointment or
01:04:40.120
anything like that with them; that's not what this is about. This is sort of an academy
01:04:45.800
that they are very kindly helping us to launch.
01:04:49.560
Oh, that's amazing. Well, there is another school here, and to the president of that school,
01:04:55.640
you're welcome for the promotion I'm giving you right now: New College of Florida. Have you
01:05:01.160
heard of that school? Yes, I know about that school. That's sort of a
01:05:05.960
small honors liberal arts college in Florida. It's a state college, I think.
01:05:11.000
Exactly. And as you probably know, that school had started exactly,
01:05:16.520
as you said, as an honors college, and then through the years it got super woke-ified.
01:05:22.760
You know, if you're transgender and you have purple hair, you need to go to New College
01:05:26.680
of Florida and so on. And then Ron DeSantis came in, tried to clean it up, de-woke-ified it,
01:05:31.640
brought in a new president. And so now they're trying to be, I mean, I don't think they use those
01:05:37.000
words, I'm using them on their behalf, but I think they're becoming, or trying to become,
01:05:42.680
kind of the Hillsdale of the South: a super classical liberal, serious place. You can
01:05:49.480
study whatever you want, as long as it's wedded to reality and common sense and science and reason
01:05:54.280
and so on. So that might be something worthwhile for you to explore. I know that I've
01:05:59.080
spoken to them; there might be some interesting partnerships we do there. Anything else
01:06:04.040
you want to promote, Martin, before we wrap this up? Of course, I could keep you here for
01:06:09.000
another four hours, but I want to be mindful of your time. Anything you want to tell us about
01:06:12.440
in terms of future projects? No. Thank you for a very pleasant conversation.
01:06:18.520
Oh, thank you. That was not just pleasant, but fun and exciting. It's fun to talk about
01:06:24.920
science, and I have done a number of interviews during this pandemic, but this is the first
01:06:29.800
one where I got to talk about sequential analysis. Well, I feel very good about that. Thank you so
01:06:35.320
much. It'll probably be up, I'm hoping, later today, both on my YouTube channel and on
01:06:44.360
my podcast. And I'll also post it directly on X as a means of trying to support Elon's platform.
01:06:51.640
Not that he needs my small voice, but every small voice helps. Thank you so much. What a
01:06:57.160
pleasure to meet you. I hope I get to meet you in person soon. Please stay in touch,
01:07:01.000
stay on the line so we can say goodbye offline and come back anytime you'd like.