Elon Musk Should Win A Nobel Peace Prize - Interview on Al Arabiya - Part I (The Saad Truth with Dr. Saad_803)
Episode Stats
Words per Minute
176.6
Summary
In this episode, Riz Khan sits down with evolutionary behavioral scientist and best-selling author Dr. Gad Saad to discuss his new book, The Parasitic Mind. Dr. Saad explains the dangers of the parasitic mind, his abhorrence of what he calls social justice warriors, and how he justifies his position.
Transcript
00:00:00.000
Hello and welcome. I'm Riz Khan. My guest today is someone who's turned controversial
00:00:14.620
conversations on issues such as the woke culture into an art form. The renowned evolutionary
00:00:20.380
behavioral scientist Dr. Gad Saad pulls no punches as he fights for what he believes should be universal
00:00:26.340
truths and real freedom of speech. His best-selling book The Parasitic Mind explores how the human
00:00:32.740
thought process can be hijacked by irrational ideas and how these parasitic beliefs shape much
00:00:39.040
of modern society. Professor Saad's upcoming book is called Suicidal Empathy, exploring how excessive
00:00:45.620
compassion for harmful ideologies is threatening much of Western society. Well his impact and
00:00:52.000
presence on social media has earned him the friendship and support of people such as the
00:00:56.120
world's richest man. I found him to be incredibly engaging, very very humble, zero ego, just two
00:01:08.060
guys. I mean we literally hugged, sat there for four hours completely locked in. You know he was
00:01:14.280
confiding in me in a very trusting way, in a very intimate way. So I thought he was just absolutely
00:01:19.780
lovely. I mean he is an eccentric guy but I've got nothing but positive affection for him.
00:01:26.120
And Professor Saad has welcomed Donald Trump's second presidency as an opportunity to turn the tide on
00:01:31.880
woke culture and to boost freedom of speech. I caught up with him in his hometown of Montreal
00:01:36.980
in Canada for a candid conversation covering a number of controversial ideas that he's put forward.
00:01:42.960
In this, the first of two episodes, Dr. Saad explains what he sees as the real danger of the parasitic
00:01:48.880
mind, his abhorrence of what he calls social justice warriors, and how he justifies his position.
00:01:55.180
I am putting the epistemological noose around your neck. I don't have to emote louder than you,
00:02:01.320
I don't have to scream louder than you. If I can have this, you know, gargantuan
00:02:06.040
triangulation of evidence, I'll let that do the talking for me.
00:02:14.880
Professor Gad Saad, technically, but Gad Saad, I'm glad you could spend some time with me in this
00:02:20.700
conversation. There's so much to talk about. I guess I should start by asking, which pronouns
00:02:28.560
Happy to do that. Now, you've written so many books, or you've written a number of books,
00:02:32.900
I should say, that really look at the human condition from a number of angles. I'm going
00:02:37.280
to go through a couple of those down the road, but let's start with the parasitic mind because
00:02:41.280
that's the one that really got you out there. And having been through it, I don't know,
00:02:46.480
one sense I got was that you seemed almost angry or frustrated with the way society has gone.
00:02:53.940
I mean, in a sense, yes, because having been in academia for nearly, well, for 31 years now,
00:02:59.920
I always tell people I faced two great wars in my life. The first one was the Lebanese Civil War
00:03:04.820
growing up in Lebanon. The second one was the war on reason, on logic, on common sense,
00:03:10.480
on reality that I saw on university campuses. And I just couldn't understand how supposedly,
00:03:16.480
intelligent people could be parasitized by such nonsense. And so, it was really a result of me
00:03:23.700
having stood at the top of the mountain, screaming into the wind, nobody paying attention to me.
00:03:28.220
Eventually, they paid attention with the parasitic mind.
00:03:30.840
Isn't a best-selling book in its own way an infectious idea?
00:03:34.560
It is. So, in the memetic sense. So, if you remember, Richard Dawkins famously wrote a book in
00:03:40.820
1976 called The Selfish Gene. In that book, he argued that, of course, humans are a biological
00:03:47.040
animal, but also a cultural animal. Our genes spread, but so do our memes, which are packets of
00:03:54.600
information. So, when you're reading my book, I am literally infecting your brain with my ideas.
00:04:00.160
Now, you state in the book that people should proactively judge, not refrain from judging simply to be
00:04:05.380
politically correct. Where does that leave someone like me? I'm an old-school journalist,
00:04:10.020
and it was kind of beaten into me years ago as I trained, that I should not take a position,
00:04:15.220
that I should not be involving myself in the story with adjectives or a commentary or an opinion.
00:04:20.640
All that's changed now in the media, as we know. But I'm kind of stuck in that world. Where does that
00:04:25.080
leave me, where I don't want to be in an argument with you, screaming and shouting that you're right or
00:04:29.120
wrong? I just want to hear your opinions and let people decide. So, in your case, if you're wearing
00:04:33.760
your hat as a journalist, then perhaps not judging is the right ethos to invoke. But in my case, if I
00:04:41.340
am a creator of truth and a defender of truth, then by definition, I have to judge, right? I mean,
00:04:46.840
I belong to the Society for Judgment and Decision Making, as someone who studies the psychology of decision-making.
00:04:52.420
So, it's perfectly natural to judge. And in the book, I talk about the fact that many people are
00:04:57.760
reticent to judge because they view it as gauche. Who am I to judge? Who are we to judge?
00:05:02.640
Now, that's an injunction from the New Testament, but it refers to not being a moral hypocrite,
00:05:07.920
right? Don't judge others if you are committing the sin yourself. You know, people who cast stones
00:05:13.420
from a glass house and so on. But that doesn't mean that we never judge. I judge those who engage
00:05:20.240
in reprehensible acts. It's perfectly part of our repertoire of human behavior.
00:05:24.440
Of course, it makes it very difficult for me, because I'm kind of stuck in this world where
00:05:28.380
conversations like this normally would involve me trying to catch you out or, you know, trying to
00:05:33.200
have a go at you. But what I want to do is really get the most out of where your mind is and how it
00:05:39.400
works. And I wonder, you know, in the sense that you say, you say in the parasitic mind that, you know,
00:05:45.640
we're basically capitulating. We're surrendering our opinions to the woke culture and to political
00:05:50.720
correctness. What's wrong with, what's wrong in your mind with that particular approach of at least
00:05:58.580
Right. So each of those parasitic ideas that I talk about in the book, and maybe it's worth defining a few of them.
00:06:04.740
So postmodernism is the granddaddy of all parasitic ideas, because it purports that there are
00:06:09.500
no universal truths. We are completely shackled by subjectivity. So to speak of an objective truth
00:06:15.680
is completely nonsensical, according to the postmodernists. Cultural relativism is an
00:06:20.480
offshoot of that. It says, who are you to judge absolute moral truths, right? If another culture
00:06:25.840
wants to cut off the clitorises of little girls, that's their business. Don't be a cultural imperialist.
00:06:31.420
Social constructivism is another parasitic idea. It basically says, we're not biological creatures.
00:06:37.340
Everything is due to social construction. So if Bubba can bench press more than Linda,
00:06:41.780
it must be because his parents taught him to play rough and tumble. It can't be because there are
00:06:46.280
physiological, hormonal, morphological differences between men and women. Now, each of these parasitic
00:06:53.220
ideas originally started off from a noble place. So take, for example, trans activism, right? Yes,
00:07:00.740
of course, trans people should live free of bigotry. That doesn't mean in the service of that goal,
00:07:07.000
we end up murdering and raping truth. So that yes, if a six foot four, 280 pound guy decides that
00:07:14.020
he's Linda tomorrow, who are you to judge whether he's not really Linda? So no, I do judge people who
00:07:20.520
in the service of an original noble goal, decide to kill truth. You see these ideas to explain the
00:07:28.760
parasitic mind as basically parallel to biological parasites. Exactly. So let me
00:07:33.980
give you an example. So earlier I mentioned memes. Memes are not the same as the
00:07:40.440
neuroparasitological framework that I took, and let me explain why. A meme could be positively valenced,
00:07:46.040
neutral, or negatively valenced. A parasitic idea is inherently negative, right? And the reason why I
00:07:53.760
chose the neuroparasitological framework comes from the animal kingdom. So in nature, the field of
00:07:59.300
parasitology is the study of host parasite interactions. So for example, you could be
00:08:04.420
parasitized by a tapeworm that takes over your intestinal tract. A neuroparasite is one that seeks
00:08:11.260
to end up in the host's brain, altering its neuronal circuitry to suit its reproductive interest. So for
00:08:17.920
example, a wood cricket abhors water. When it is parasitized by a hairworm, the
00:08:23.640
hairworm needs it to jump into water in order to complete its reproductive cycle. And so that was
00:08:28.700
my epiphany. Aha! I will now use that framework to argue that human beings could be parasitized by
00:08:34.560
actual physical brain worms, but also ideological brain worms, and hence the parasitic mind.
00:08:41.040
Why do you think humans are so vulnerable to these idea pathogens? Well, I love that you asked it that
00:08:47.720
way because I've often been asked, is the parasitic mind something that is specific to the current
00:08:53.580
period? And my answer is no. The capacity for the human mind to be parasitized is an indelible part
00:09:00.700
of the architecture of the human mind. What is specific to the current period are the specific
00:09:05.680
parasitic ideas that are taking over. 300 years ago, it was a perfectly good idea if we thought that
00:09:12.100
our neighbor Linda was a witch, to throw her into a body of water. And if she swims, then we know
00:09:18.140
she's a witch. And if she drowns, oops, I guess we were wrong. That's a parasitic idea. So there are
00:09:23.340
many reasons for that. One of which is, as I said earlier, it helps us get through difficulties in
00:09:29.040
life. For example, some people argue that religion itself, Daniel Dennett, the famous evolutionary
00:09:35.260
philosopher argued that religion is akin to a parasite. So there are a wide variety of reasons
00:09:42.060
why we are susceptible to this parasitic thinking. I'll get on to religion a little bit later because
00:09:46.780
it's something you touch on quite a bit. But I'm guessing from what you say, even a good leader or
00:09:52.880
someone who's a great orator essentially is infecting everyone. Well, I mean, that's Machiavelli,
00:09:59.180
right? Machiavelli said it doesn't matter if the leader is truly an honest person. What matters is that
00:10:05.760
you manage that perception. That's why when you say, for example, someone has Machiavellian intelligence,
00:10:11.080
a salesperson who has high Machiavellian intelligence, what is he or she doing? They are tailoring their
00:10:17.440
sales pitch as a function of the customer that they're standing next to, right? That's what a politician
00:10:23.180
does. That's why when politicians are on the campaign, you always see them with a baby. Why? Because
00:10:28.980
that shows that I am nurturing, I am kind, I am compassionate, right? You never see them with babies
00:10:34.180
unless it's about seven days before the election cycle. So of course, it's all about this kind of
00:10:39.740
impression management. And I guess dating sites, the person always has a puppy. Yeah, a puppy. Or I've
00:10:44.960
actually joked once, but it's based on actual science. If you hold a puppy, if you hold a guitar, and you
00:10:51.800
come out of a Porsche wearing a fireman uniform, sit back and enjoy the ladies. Yeah, fair enough.
00:10:58.980
You know, you've alluded to, you know, the issues that you that you are concerned with in the
00:11:04.900
parasitic mind. What do you think are the three biggest risks facing us because of the parasitic
00:11:10.080
mind? And so the answer to that is also picked up in my forthcoming book called Suicidal Empathy,
00:11:16.400
which we'll touch on. Look, it causes us to take certain public policy decisions that are
00:11:23.180
deeply problematic. On a smaller scale, although an important one, it causes about 900 medals to be
00:11:31.000
given. This was an actual study commissioned by the United Nations, because people say, well, what's
00:11:36.480
the big deal about this transgender thing? Well, there's, there've been almost, and that's an
00:11:40.620
underestimate, 900 medals in major competitions that have been given to biological males, and hence
00:11:47.280
real biological women have lost those places on the podium. So in a small scale, you get this kind
00:11:52.840
of consequence. But how about unchecked illegal immigration? What does that do to the
00:11:59.520
vitality of a society? I mean, economically, it certainly doesn't help. How about if it changes the
00:12:04.600
fabric of that society? How about if you let in millions of people who don't share your foundational
00:12:09.360
values? You're going to irrevocably change the nature of that society. So there are a wide range of
00:12:15.120
consequences, most of them negative, that come about if we allow these parasitic ideas to flourish.
00:12:20.680
By the way, the perfect example of someone who is both a manifestation of a parasitized mind
00:12:26.560
and suicidally empathetic is our current prime minister. Oh, we're here in Canada. In Canada,
00:12:31.800
right. Well, we can touch on the politics in a moment. But the reason I'm asking it in this way is
00:12:36.600
that you talk about, say, immigration, for example. Sure. It sounds like it's a broad sweeping brush
00:12:44.140
when you apply the idea to immigration. People forget the nuances. And that is that there are
00:12:49.340
benefits from immigration. Of course. In the US, they're going crazy in the agricultural sector
00:12:52.840
now because they've lost all the workers. So how do you balance not overgeneralizing in such a way
00:12:59.480
that it's counterproductive? Right. It just has to be common sense, right? I'm an immigrant. So
00:13:04.980
oftentimes people will write to me and say, you're such a hypocrite. You rail against open door
00:13:08.920
immigration, but you're an immigrant yourself. And your buddy Elon Musk is an immigrant. So why are you
00:13:13.320
railing against immigration? That's like arguing that my little cat, house cat Fido is a feline
00:13:19.120
and the lion in the jungle is a feline. Therefore, since they're both feline, you should also go to
00:13:25.320
the jungle and cuddle up with the lion in the jungle. Immigration has to be pursued in such a way that it
00:13:34.560
always leads to a net benefit to the host society. If that criterion is met, then come in, my brother,
00:13:42.080
and let's build a more pluralistic society together. If you allow entry to millions of people
00:13:47.820
who are coming in for no other reason than you being empathetic, it's not going to result in good
00:13:52.940
outcomes. One of the things you wrote in The Parasitic Mind, to quote you, is that progressivism
00:13:59.380
has become the enemy of reason. And you talk about, I love this term, the land of progressive unicorns,
00:14:04.880
referring to unicorns, a mythical but admired creature. Why do you feel that progressivism
00:14:10.680
is the enemy as such? Isn't there room for some understanding? Because it is always rooted in this
00:14:18.280
hopeful ethos. So take, for example, social constructivism, which basically says we're born
00:14:23.700
tabula rasa with equal potentiality. And the only reason that Lionel Messi became Lionel Messi or Albert
00:14:31.040
Einstein became Albert Einstein is because of a particular schedule of reinforcement, some
00:14:35.660
socialization. Maybe his parents hugged him a lot, or maybe they didn't hug him enough, or maybe they
00:14:40.180
took him to McDonald's enough, or maybe not enough. It can't be because there are innate differences in
00:14:46.300
potentiality across people. Because here, what the progressives are mixing up is equality under the
00:14:52.660
law with equal potentiality. Now, it's very hopeful for me to think that all of my children have the
00:14:59.900
potential to become the next Lionel Messi if only I find the right schedule of hugging. But the reality
00:15:05.880
is, my children are unlikely to become NBA players because I'm a short Jewish guy. Okay, so the genes
00:15:13.980
are already working. Now, there are some short guys that ended up in the NBA, but the statistics are
00:15:19.120
against you. So me recognizing that there are certain biological realities is harder to swallow than the
00:15:27.000
progressive unicornia that says, given the right schedule, we will all be beautiful. Socialism is a
00:15:32.840
wonderful idea if you're three years old. Because I teach my kindergarten kid that sharing is caring. It's
00:15:39.720
inherently unfair that Elon Musk has more money than the guy who's homeless, right? Wouldn't it be better
00:15:45.800
if benevolent government comes in and creates equality? That's what Kamala Harris told us. Well, socialism and
00:15:52.980
communism have been tried in endless countries. They've always failed. Why? Because unicornia is
00:15:59.400
inconsistent with human nature. I'll tell you a quick quote that I think you'll love. E.O. Wilson,
00:16:04.860
are you familiar with E.O. Wilson? I don't recognize the name. E.O. Wilson was a Harvard biologist
00:16:09.560
who recently passed away, one of the great scientists of the 20th century. His specialty was social ants. He was an
00:16:16.480
entomologist who studied social ants. Now, in social ant colonies, it actually is communistic. There is a
00:16:23.700
reproductive queen, and then there's a bunch of worker ants and a bunch of warrior ants that are
00:16:28.640
completely indistinguishable from each other. So when he was asked in an interview, Professor Wilson,
00:16:33.600
what are your thoughts on communism? His answer was, great idea, wrong species, right? So what the progressives
00:16:40.660
are doing, they're saying, we will impose our beautiful rosy view of unicornia to a species
00:16:47.800
that hasn't evolved to be consistent with that unicornia. So no, I think it's a bad idea to succumb
00:16:53.480
to these ideas. So essentially, it sounds like you're saying the land of progressive unicornia
00:16:57.240
ignores basic science. Exactly. It ignores the most fundamental features of our evolved human nature.
00:17:05.380
Here's a great example. So, I teach at a business school. Many years ago,
00:17:10.140
there was a company that decided that they wanted to break free of the antiquated notions of toxic
00:17:17.820
masculinity. In what context? Romance novels are almost exclusively read by women around the world,
00:17:24.760
right? I mean, literally almost exclusively. And I'm about to describe to you the exact same male
00:17:31.380
protagonist that's been depicted in every romance novel ever written. You ready? He's a count. He's got
00:17:39.220
some royal title. He's a neurosurgeon. He's tall. He's bold. He's aggressive. He wrestles alligators.
00:17:46.120
He's got six-pack abs. But he could only be tamed by the love of this one woman. I just gave you the plagiarized
00:17:52.320
version of every single romance novel that has ever existed. Why is that? Because women fantasize about
00:17:59.080
a particular type of male archetype. It's part of their innate human nature. Now, this progressive
00:18:05.680
unicornia company came along and said, we want to break free of this toxic masculinity. So they're
00:18:11.080
going to come up with a guy that cries a lot, sucks his thumbs, goes in a fetal position, is pear-shaped,
00:18:16.620
has a whiny voice. He's very sweet and sensitive. Guess what happened to that line? The market said,
00:18:22.360
I don't want it. So you thought that you had the chutzpah to impose what you want consumers to read.
00:18:30.680
And consumers said, no, no, no. I come with an endowed human nature and I decide what I want to
00:18:36.820
fantasize about. So there are real consequences to approaching political systems or consumer products
00:18:42.620
with a lack of an understanding of human nature. You reserve quite a lot of ire in your book
00:18:47.840
for what you call the SJW, social justice warrior. Explain that concept. So the social justice warrior is
00:18:54.360
typically someone who is very much entrenched in sort of progressive unicornia. I even give an example
00:19:00.440
of a type of social justice warrior with a spicy term. Do you know where I'm going with this?
00:19:05.760
Male feminists as sneaky. Do you remember what the last part is? Sneaky effers. Sneaky. Okay. Okay. Now,
00:19:13.820
that term, sneaky effer, actually comes from science. So in the 1970s, the official scientific term
00:19:22.700
is kleptogamy, which is the stealing of mating opportunities. But the colloquial term, even within
00:19:28.440
the scientific literature, is a sneaky effer. So, for example, you have many species where the males,
00:19:35.040
there are two phenotypes of male. There's the traditional male who is, you know, very masculine
00:19:39.620
and so on. There is another type of phenotype of a male that mimics females. So when the big dominant
00:19:46.580
male sees this one, it dupes him into thinking that he is a female and then goes around and
00:19:53.440
surreptitiously engages in mating opportunities. So I took this principle from the animal kingdom
00:19:58.640
and I argued that human male feminists, social justice warriors, are adopting the sneaky effer
00:20:04.960
strategy. I'm very, very kind. I hug the trees. When I put gas into my car, I start crying because I am
00:20:12.320
raping Mother Earth with the devil's juice and so on. And so I'm very kind. You could trust me,
00:20:17.480
Linda. You don't have to worry about me. I'm a kind and compassionate person. I'm a sneaky effer.
00:20:23.000
How do you propose to protect those who are genuinely vulnerable? Because we can't all be
00:20:28.580
that archetypal, you know, super strong alligator wrestling type. There are those who need, to some
00:20:34.080
degree, I guess, support and protection. So how does it not kind of dissolve into this, as you say,
00:20:39.660
unicornia kind of perspective? Look, I'm not suggesting that empathy is something that should
00:20:47.300
be eradicated. We are a multifaceted creature, right? We can be ruthlessly violent and we can be
00:20:55.980
infinitely kind and loving, right? She's not on camera right here, but my daughter is here. I would
00:21:02.280
be willing to jump in front of a truck to save her life, right? So that allows me to have empathy,
00:21:08.580
have love, have unlimited compassion for the right targets, right? But on the other, the problem is
00:21:15.660
when you take these virtues and they misfire. So I don't know if we want to talk about suicidal
00:21:22.300
empathy later, but maybe I can. We'll talk about it. We'll carry on. Yeah. So suicidal empathy is simply
00:21:27.800
the misfiring of an otherwise adaptive emotion. So let me draw an analogy. OCD is a psychiatric
00:21:35.000
disorder, right? But at its root, it is based on something that is quite evolutionarily advantageous.
00:21:42.080
The idea of scanning the environment for potential threats makes perfect evolutionary sense. But for
00:21:48.440
most of us, once we have tended to that threat, it then goes away. So for example, if you and I shake
00:21:54.140
hands, and I noticed that a minute before you shook my hand, you sneezed into your hand, I might later
00:21:59.000
quietly go into the bathroom and wash my hands. I wash my hands once, and then I resume our
00:22:04.860
interaction. The OCD sufferer will be stuck in an infinite loop for the next eight hours, washing
00:22:11.480
their hands until their skin is falling off in scalding hot water. So it is a misfiring of an otherwise
00:22:17.980
adaptive process. So yes, be empathetic, be kind, be generous, be compassionate. Those are all part of
00:22:25.260
the repertoire of human realities. But it has to be properly regulated.
00:22:30.680
What you refer to, though, in the parasitic mind is something that's very interesting. And
00:22:33.900
I think a lot of people recognize this, based on what we see in the media. And that is the faux outrage,
00:22:41.120
Exactly. So and here I can link it, if you'd like, very quickly to an evolutionary principle. So for a signal
00:22:48.980
to be meaningful, it has to be costly. And let me explain what I mean by that. You take a peacock's tail,
00:22:55.860
right? The peacock's tail could not have evolved via natural selection. Natural selection is the mechanism
00:23:03.380
whereby a trait evolves because it confers a survival advantage. Well, when the peacock evolves that big tail,
00:23:10.480
it is burdensome. It makes it more conspicuous to predators. It makes it more difficult to
00:23:16.000
take flight. So why would evolution have resulted in the evolution of such a burdensome tail?
00:23:23.040
Well, there we talk about sexual selection. Some traits evolve because they confer a reproductive
00:23:29.420
advantage. So even though it is handicapping me from a survival perspective, I am communicating to
00:23:35.900
the peahens, the females of that species, despite the fact that I've got this very burdensome tail,
00:23:41.580
I am still here. Shouldn't you choose me as your mate? How am I going to link this to what you just
00:23:46.780
asked? Putting hashtag bring the girls home, putting hashtag refugees welcome, love is love,
00:23:55.820
and all the other nonsense is costless. So it's not an honest signal. It's a faux signal. Any of us can
00:24:03.280
do it. But if I speak openly in some regions of the Middle East, where I know I might be taken into
00:24:12.000
the back corner and a bullet is put through my head, that is a costly signal. That is true signaling and
00:24:18.400
not virtue signaling. I remember the section where you were saying basically choose me when it came to
00:24:22.700
peacocks and peahens. But it made me also wonder, you use this term, it's just an aside that just
00:24:28.180
occurred to me, this term testicular fortitude quite a bit in the book. And that made me think,
00:24:33.280
well, what about the women? Yes. Well, I use it in a sort of hyperbolic sense. Margaret Thatcher
00:24:40.400
has more testicular fortitude than most men alive today in the West. So it applies equally to both
00:24:46.460
sexes. Or, if I were a transgender activist, to the 680 different genders. When it comes to making
00:24:53.560
decisions, I know you say there's a danger in prioritizing feelings over facts. But isn't
00:24:59.240
part of the problem that people don't really know what the facts are anymore, that they're so busy
00:25:04.100
looking for confirmation bias? Yeah, that's true. Let me discuss first the tension between feelings
00:25:09.460
and thought. Please. It's a false dichotomy. It's not that we are an animal of reason rather than an
00:25:17.580
animal of feeling. We are both, right? The challenge is to know when to invoke the right
00:25:23.740
system at the right time. So if I'm walking down in a dark alley because I'm taking a shortcut to get
00:25:29.140
home, and then I notice four young men who are loitering, I will get a fear-based response. My heart
00:25:35.540
will start racing. I will start breathing heavier. My blood pressure will go up. All of these things
00:25:42.340
have triggered my emotional-based system. And it makes perfect evolutionary sense for that to have
00:25:47.180
happened. On the other hand, if I'm trying to do well on a calculus exam, triggering my emotional
00:25:52.720
system is not going to make me do a better job on the calculus exam. Now, I apply that example when
00:25:58.900
it comes to, say, choosing our political leaders. Regrettably, most people end up invoking their
00:26:06.200
affective system rather than their cognitive system. And so here I talk about, actually, an expression from
00:26:12.360
Arabic about getting drunk by simply smelling the cork of the wine bottle. Because what that
00:26:20.180
basically says, I don't need to be, I don't need to actually drink the wine to get drunk. I just take
00:26:26.020
a whiff and I'm already drunk, right? So it refers to someone who has a weak cognitive constitution, right?
00:26:31.620
So for example, if you ask people, why do you love Obama? Well, he's tall, he's lanky, he's got a
00:26:37.480
mellifluous voice, he's got a radiant smile. It could well be that every single syllable he utters
00:26:42.280
is complete rubbish. But I'm already getting drunk by his cork. On the other hand, why do you hate
00:26:47.420
Donald Trump? Well, he's cantankerous. He speaks like a brawler from New York. He's not presidential.
00:26:53.860
But you didn't say anything about whether you agree with his immigration policy or his fiscal policy.
00:26:58.640
So the reason why I discussed that tension is because many of the problems that we face is because we
00:27:05.040
trigger the wrong system at the wrong time. Regarding your issue of how do we know what
00:27:09.840
is true or not? Well, one thing I could tell you is that there shouldn't be a government agency
00:27:14.980
that decides for us what is true or not. I don't know if you remember under the Biden administration,
00:27:19.740
there was a woman, Nina something, that came out who was going to be the first official czar of
00:27:25.400
disinformation. I mean, Orwell couldn't have come up with such a ridiculous notion. Do you remember
00:27:31.040
her? Vaguely. But I was thinking that the issue is partly that with politics, the scene has become
00:27:37.760
distorted anyway. Right. Well, and look, for example, in science, once you utter the words,
00:27:44.380
these two words, settled science, you understand nothing about science. Because science deals in
00:27:51.520
provisional truths. Yes, there is a truth, but that truth can move. What we thought was true,
00:27:57.400
I mean, the Copernican revolution was about the, you know, people thinking one way about the cosmos,
00:28:02.800
and then Copernicus and Galileo came along and said, No, no, that's not how it works. So the idea
00:28:07.860
that it's settled science, and therefore we need to remove your YouTube channel, we need to get you
00:28:13.920
off of Twitter, because you are espousing false beliefs. Is it disinformation to say that men menstruate,
00:28:21.220
or is it veridical information? So this whole idea of we tell you what the facts are,
00:28:26.600
is a terribly dangerous, Orwellian idea. I'll get on to how this has been affected by social media and
00:28:31.700
the mainstream media as we progress. But what I was getting at also is, isn't it now that the case
00:28:36.740
with politics, it is more about smelling the cork than actually listening to what the real policy
00:28:42.140
should be? And I think there is a reason for that. Look, we face a lot of computational complexity
00:28:47.660
in our everyday life, right? Imagine if I sat there agonizing over every single decision by looking at
00:28:54.840
every single piece of information that is relevant before I decide which restaurant to go to or which
00:28:59.700
car to purchase, then I would be stuck in choice paralysis, right? So I use what are called
00:29:04.780
decision rules or heuristics that allow me to cut through the clutter, right? And so for example,
00:29:11.880
when people evoke their affective system, that's a very quick shortcut, right? Because it allows me to
00:29:18.900
say, look, how do I feel about Obama? He's lovely and majestic. How do I feel about Trump? He seems mean
00:29:25.020
and cantankerous, good enough, right? Because it cuts through the clutter. Now, it's a suboptimal way
00:29:30.420
of making decisions, but it is a very quick way to navigate through the computational complexity.
00:29:36.840
You know, when it comes to, in your opinion, which is more important, truth or freedom? And who wins when
00:29:44.100
they clash? For example, if you have people, you know, putting out because of free speech, putting out
00:29:48.540
falsehoods, disinformation, and so on in the media?
00:29:52.560
It's very hard to put an ordinal ranking on which one. One requires the other in an infinite loop,
00:30:00.000
right? I can't pursue truth if I don't have the freedom to pursue truth and vice versa, right? So
00:30:06.540
that's what I, and I'm glad you raised those two because I'm guessing you're referring to the
00:30:10.780
parasitic mind, where in chapter one, I say that the two fundamental ideals that, you know, that inform
00:30:17.820
my life are truth and freedom, right? I don't think you can have one without the other, right? I have to
00:30:24.500
have the complete freedom to be able to pursue truth where it takes me, which, by the way, leads to a
00:30:29.740
very dangerous idea, forbidden knowledge, which I briefly referred to in the parasitic mind, and I pick up
00:30:35.500
again in suicidal empathy. Forbidden knowledge is the idea that there are some inquiries that we should
00:30:44.720
refrain from pursuing because the consequences would be too great to pursue them, right? Now, and if I may,
00:30:51.400
let me explain here the concept of deontological versus consequentialist ethics. Deontological ethics
00:30:57.020
deals in absolute statements. It is never okay to lie would be a deontological statement. A consequentialist
00:31:03.740
statement would be, it's okay to lie if you're trying to spare someone's feelings. And so I often joke that if you
00:31:09.720
want to have a long, happy marriage, right? And your spouse says, do I look fat in those jeans? Put on your
00:31:16.540
consequentialist hat and say, you've never looked more beautiful. You're lying, perhaps, but that is a
00:31:21.880
consequentialist lie. For most things in life, it is perfectly reasonable for us to be consequentialist. For freedom of
00:31:29.680
speech, for the pursuit of truth, for journalistic integrity. Those are by definition, deontological
00:31:36.700
statements. So if you say, yes, pursue truth, but don't hurt people's feelings, the but there is a big
00:31:42.900
problem. Professor, from where do you get your news? How do you make sure you're getting what you would
00:31:47.580
consider accurate? Wow, what a great question. It's really a collection of stuff. Some of it TV, some of it
00:31:53.140
I read, some of it is just the scrolling through X. So I don't have a single one-stop shop place.
00:32:00.980
It's a bit of the... But what I try to do is triangulation of the veracity of information.
00:32:07.740
Do you see what I mean by that? Yeah. I always say that there's a triangle,
00:32:10.960
information plus diversity leads to perspective. Right. There you go. But the thing is, are you
00:32:16.000
constantly aware of the risk of an echo chamber, constantly trying to validate your own thoughts?
00:32:20.060
Because you're obviously quite, you know, forceful in your thoughts. There's a risk you could always
00:32:24.420
be seeking validation. Albeit, if you, for example, will look at some of my followers, I mean, now I
00:32:30.820
have a lot of followers, so I can't give you an empirical estimate. But I can oftentimes have on my
00:32:37.300
feed two people that just responded to something that I posted, and they couldn't be more diametrically
00:32:44.660
opposed from each other. Now, I actually take that with some encouragement, meaning that if every
00:32:51.320
time I post something, everybody always agreed with me, then probably I've created an echo chamber.
00:32:57.940
But if, you know, just with one post, I can have 20 people, 14 of whom agreed with me, six,
00:33:04.660
whoever, then I think I'm building a diversified chamber.
00:33:08.200
So I used to give this example that if I covered the Israeli-Palestinian situation, and I would have
00:33:13.820
people from both sides. But either way, no matter how I did the debate or the conversation, I would
00:33:19.080
get viewer feedback saying, oh, you're so pro-Palestinian, for the same show where they said
00:33:23.520
you're so pro-Israeli. And my producer would say, how can that be? I said, well, my explanation was
00:33:27.840
people bring their own baggage when they're watching. And that's why I ask, in your case, how you kind of
00:33:33.740
put your mind outside that potential echo chamber?
00:33:37.140
Well, so I'll give you an example of confirmation bias that my former doctoral supervisor had told
00:33:43.500
me many years ago. So he had been hired to help some company. He's a cognitive psychologist and,
00:33:49.960
you know, consumer specialist at Cornell. He just recently retired, by the way. So he had been hired
00:33:56.920
by a company where half the executives seemed to be for a product launch and the other half seemed to
00:34:04.680
be deadly set against the product launch. And so what was going to break the tie is they were going
00:34:09.660
to bring a bunch of them, put them in a focus group, and then based on that interaction, either
00:34:15.440
one side would win or the other side would win. After that interaction happened, they polled both
00:34:22.080
sides, and both sides said, you see, we won. Now, how could it be? How could it be that they both saw the
00:34:27.620
exact same interaction and arrived at exactly the opposite conclusion, right? Well, it's exactly to
00:34:34.440
your point, which is I only look for information that supports my position and I filter out perceptually
00:34:41.520
and cognitively any information that doesn't agree with my position and therefore I solidify my
00:34:47.780
anchored position. I was recently asked on a show by a British psychiatrist. This was about a
00:34:52.580
year ago. At the end of the show, maybe the best question I've ever been asked, maybe you'll beat it
00:34:57.160
on this show. He said, what is the singular phenomenon that, you know, you've seen in the human condition
00:35:06.740
in your 30 plus years as a professor that has most surprised you? So I had to pause for a second.
00:35:11.520
And then I said, the inability for most people to change their anchored positions, irrespective of
00:35:19.420
the amount of conflicting evidence that they might be shown, which itself, and that's why, by the way,
00:35:24.780
in chapter seven of the parasitic mind, I try to offer a means by which we can break through that
00:35:30.240
resistance. I talk about nomological networks of cumulative evidence, which we could talk about.
00:35:34.940
So I'm mindful of even my capacity to succumb to this kind of echo chamber. That's why I try
00:35:41.500
to sample from many different sources. How do you regard conspiracy theories? You know,
00:35:46.020
the moon landing wasn't real, or that there were chips and vaccines during COVID, these kind of
00:35:51.760
things that are out there. How do you regard them?
00:35:54.920
I apply the scientific method, right? So I think there is now enough evidence that the Holocaust
00:36:01.800
did happen. So if you're a Holocaust denier who argues that, you know, it was all the Hollywood made
00:36:08.560
and there was no, there were no ovens and so on. I don't have to take it seriously because I know
00:36:12.460
the accumulation of evidence has already established that that thing happened. And so you just apply
00:36:19.240
the scientific method as you navigate between what is true or false. There's no other way to do it.
00:36:23.380
It's difficult though with things like the moon landing. It doesn't have the same empirical
00:36:28.220
That is true. That is true. But by the way, I mean, I don't know if it's the right time to do it, but
00:36:31.580
maybe it is. Do you want me to talk about how you do the nomological networks?
00:36:38.020
So in a sense, it speaks to your general question, how do you seek truth? How do you know what's true,
00:36:41.800
what's not true? And so in chapter seven, the chapter's title, How to Seek Truth,
00:36:46.760
I argue that there is an incredibly powerful, albeit very cognitively taxing method to try to decide
00:36:54.920
whether something is true or not. So let's take, for example, I give several examples in the chapter,
00:36:59.080
but I'll discuss one here. Let's suppose I wanted to counter the idea that social constructivists
00:37:05.400
offer that toy preferences are sex-specific due to socialization, meaning it's because mommy taught
00:37:13.340
Linda to play with the doll and taught Bobby to play with the truck. That's why they have those
00:37:18.000
preferences. Let's suppose I wanted to demonstrate to you that that's not true. There are universal
00:37:23.660
biological and evolutionary-based reasons for those sex-specific toy preferences. How would I go about
00:37:28.960
doing that? So I'm going to build a nomological network of cumulative evidence, meaning I'm going
00:37:34.020
to get you data from across cultures, across time periods, across species, across methodologies
00:37:40.100
that will triangulate to demonstrate that this position is unassailable. And I'll give a few examples.
00:37:46.680
Well, from developmental psychology, we know that young infants who are too young to yet be socialized,
00:37:53.420
by definition, they don't have the cognitive apparatus to be socialized, already exhibit those
00:37:58.640
sex-specific toy preferences. So that already takes out the socialization argument. I can get you data
00:38:03.460
from vervet monkeys, rhesus monkeys, and chimpanzees showing that they have the same sex-specific toy
00:38:08.740
preferences. I can get you data from sub-Saharan Africa, where cultures that are very non-Western,
00:38:15.060
they exhibit the same sex-specific toy preferences. I can go back 2,500 years ago, showing you that in ancient
00:38:21.340
Rome and ancient Greece, on funerary monuments, little boys and girls are depicted playing with
00:38:26.600
exactly the same types of toys as we have today. So look how I am putting the epistemological noose
00:38:32.560
around your neck. I don't have to emote louder than you. I don't have to scream louder than you.
00:38:37.020
If I can have this, you know, gargantuan triangulation of evidence, I'll let that do the talking for me.
00:38:44.600
And so that's why, by the way, I'm epistemologically quite humble, in that I know what I know,
00:38:49.580
and I know what I don't know. That's why I never get caught in interviews. I give
00:38:53.740
thousands of interviews because I'll never try to wing it. If you ask me a question that I think
00:38:59.940
that I haven't done the requisite nomological network for, I'll say, hey, Riz, you know,
00:39:04.500
that's a great question. It's above my pay grade. I don't know. But if I know, good luck to you if you
00:39:08.820
want to debate me. Fair enough. No, because a lot of people obviously want to try to catch you out.
00:39:13.140
But if you know your stuff, that's obviously going to be important. But it was interesting because
00:39:16.760
you did say people who spew falsehoods should be judged. Now, that stuck with me.
00:39:23.540
Judge? That doesn't mean in prison. No, no, I didn't. I realized what you meant. It was the
00:39:27.500
whole chapter on us taking a position and so on, which is why I started asking you about my
00:39:31.720
situation as a journalist. Because, of course, as a human being, I have in my head judgments and
00:39:36.420
opinions. But, difficult as it is, I don't enforce them in terms of my journalistic work.
00:39:42.120
I want to stand away and hear the story. So many people say that politicians in general happily
00:39:47.340
lie. And in the case of Donald Trump, who I know you do like as a politician, he regularly is accused
00:39:55.460
of lying. And people actually, you know, metrically record this. How do you reconcile your belief then
00:40:08.020
So there are lies and then there are lies. So Donald Trump lies about, I've got the biggest
00:40:14.020
in the world. I'm the greatest lover in the history of humanity. The biggest audience was ever at my
00:40:20.000
inauguration, right? They're ego defensive lies, right? So is he fibbing? Undoubtedly. That is a very
00:40:26.600
different lie than if you are a postmodernist saying that there are no objective truths. That eradicates
00:40:34.420
the possibility of truth. And it's that that allows them to say slavery is freedom, war is peace,
00:40:41.160
men are women. So lies don't all come in the same final shape, right? Is he lying in the sense that he's
00:40:49.180
constantly trying to protect his ego because he's a benign narcissist? Yes. But there are guys on the
00:40:55.860
progressive side that are lying. If you say, for example, that socialism is the means by which
00:41:02.640
people are most likely to flourish, that is a much more consequential lie than I am the best lover of
00:41:10.340
all time. Both are lies. One is much bigger than the other. But he does have those consequential ones
00:41:15.140
too. You know, they're eating the cats and dogs in Springfield, Illinois, and so on. People react to
00:41:19.060
this. Yeah. Okay. In that case, I think it's the fog of the rapidity of how information comes about,
00:41:25.460
right? I think though, and Elon Musk recently said it, he said, look, we will put out stuff often,
00:41:31.160
and it'll turn out to be false, in which case we're happy to correct our position. So just have
00:41:37.920
the epistemological humility to accept that sometimes you'll get it wrong. True. No, I heard
00:41:42.060
him say that. He was actually in the White House as he said it. Exactly. Now, the question is this,
00:41:45.680
though, if putting out this information without checking its veracity in the first place,
00:41:49.960
there are consequences, the risks are great. So from your side, how do you rationalize that with
00:41:56.440
saying someone like this is worthy of the leadership position? Because I think all politicians lie,
00:42:01.760
as you said. The question is, which one is the lesser of a liar? I don't think that when I put my
00:42:09.520
imprimatur in supporting a particular candidate, I think that they are interchangeable with a saint,
00:42:14.700
right? I think that given these options, this is probably the better option. So given the types of
00:42:21.060
values that are important to me, it is incontestable to me that Donald Trump would have been a much
00:42:27.240
better choice than Kamala Harris. All right, you're picking between those candidates, but I'm talking
00:42:31.720
in the general sense with politicians. They all lie. Yeah, I was going to say, so in the sense of
00:42:36.800
Donald Trump, you see him as something of a cure. As you said in a recent essay, you wrote about his
00:42:41.140
election victory saying, it's an unequivocal repudiation of the ideological parasites that
00:42:46.320
have wreaked havoc on our societies. And if, as you also say, it took decades of assiduous
00:42:51.740
indoctrination for these parasitic ideas to flourish in every nook and cranny of our institutions,
00:42:56.780
we'll get onto those further. How can four years of a Trump presidency turn them around? You've said
00:43:03.460
it takes decades. That's a fantastic question. It's actually one that I keep hammering now with
00:43:08.500
people that don't be complacent in thinking that just because Donald Trump, you know, he came along,
00:43:15.680
he's got the polio vaccine, he administers it everywhere, and polio ceases to exist. To your
00:43:21.460
point, what you just mentioned, it took, depending on the parasitic idea, it's taken between 50 to 100
00:43:27.240
years for these idea pathogens to develop originally on university campuses and then to break out of the
00:43:34.440
university. So it won't be sufficient to only kind of rest on our laurels to say, hey, Donald Trump is
00:43:40.580
here, we've eradicated this nonsense. That's why even I've had people who say to me, so now that
00:43:46.000
Donald Trump has come to power, will people even read suicidal empathy? I say, of course they will,
00:43:51.980
because these parasitic ideas have not gone away. The reflex to be suicidally empathetic is not going
00:43:57.540
to magically go away because this guy came into office for four years. The next president might be one
00:44:02.680
that is as committed to these parasitic ideas and off we go on the woke train. So
00:44:08.240
there's much more in the fight to be had. I want to get your perspective on this as someone who's
00:44:13.920
actually a couple of years younger than me, but around the same kind of age. For me, it seemed like
00:44:17.720
there was a massive change in the curve where this suddenly all became public. I don't remember these
00:44:22.920
kind of debates for the last, say, 40, 45 years. Because you weren't in academia. So it takes a while
00:44:29.600
for all of these. So for example, take shingles. The virus is within you, it's dormant, and then
00:44:36.280
something will trigger it such that you might get an outbreak and I don't. But it is in there. So
00:44:41.780
what you were reacting to is the fact that you didn't know that shingles was flourishing in the
00:44:47.800
background, whereas I was living in the petri dish of shingles called academia. So, and let me give you
00:44:54.440
the background of how I first was exposed to Houston, we have a problem. So my scientific work
00:45:01.500
originally was in the application of evolutionary biology and evolutionary psychology and studying
00:45:06.580
human behavior in general and consumer behavior and economic behavior in particular. And so as I
00:45:11.800
tried to Darwinize the business school, the idea being that you can't study entrepreneurship or
00:45:17.960
consumer behavior or economic behavior without ever invoking our biological nature. To me, that
00:45:25.800
seemed like a very banal and obvious statement. Well, apparently, it was completely heretical to most
00:45:31.600
of my academic colleagues. And so I was right, that was my first exposure to parasitic ideas. How could
00:45:38.020
these otherwise bright, educated people be questioning the fact that human beings, consumer behavior,
00:45:44.180
consumers are biological? You don't think our hormones affect our behavior? How? Right? And so
00:45:50.820
I saw it, but it took a while for it to then hit other people who were not inhabiting that ecosystem.
00:45:59.820
And that to me is frustrating because had people listened when I stood up on top of the mountain and
00:46:06.820
said we have a problem, then we wouldn't have had 900 medals stolen by
00:46:13.380
trans women. I appeared in 2017. I appeared in front of the Canadian Senate. I was summoned,
00:46:22.820
as was Jordan Peterson, separately to talk about Bill C-16, which at the time was a bill that hadn't
00:46:30.420
passed yet, trying to incorporate gender identity and gender expression under the hateful rubric umbrella.
00:46:39.060
And I said, yeah, of course, I support the idea that no person should face any bigotry, but beware of
00:46:45.700
the proverbial slippery slope. Now, if you go back and you listen to my warnings, I hate to say it,
00:46:52.820
but they were prophetic. How do you regard the way? Well, this also occurred to me, the word political
00:46:59.380
correctness came in, I don't know what, maybe 15, 20 years ago that I could really remember. And
00:47:04.820
suddenly all the parameters changed. I couldn't put on an African accent, for example. I certainly
00:47:09.860
couldn't, if I was an actor, go on, blacken my face and appear as an African. When that all started
00:47:15.460
to emerge, how did you regard that? I was very concerned by it because I saw the kind of stifling
00:47:21.620
environment it created in otherwise banal interactions on campuses. I remember at one point
00:47:28.740
someone asked me, this was probably about 10 years ago, oh, where are you from originally?
00:47:34.180
And I was about to answer, and they said, oh, I'm so sorry, I know that I'm not supposed to ask that.
00:47:37.540
I said, what do you mean you're not supposed to ask that? You recognize that I don't look as though
00:47:41.860
I'm indigenous to Canada, because I have a particular morphological look. What's the problem
00:47:48.100
with you asking where I come from originally? So I've noticed that for a while, look, I compare it,
00:47:53.540
and you might remember this from The Parasitic Mind. The parasitic wasp stings a much bigger
00:48:03.300
spider, rendering it zombified, and then it takes it to its burrow, lays its eggs on it, and then its
00:48:10.580
eggs as they hatch eat the spider in vivo while it's alive. And so I argue that political correctness
00:48:18.420
is akin to the spider wasp sting, because it stultifies me, right? I'm now afraid to say,
00:48:24.980
I recognize that you don't look French-Canadian. So what if I were to say, Mr. Crown, where are you
00:48:29.780
from? How is that an insult? But that's what happens with political correctness. It removes our
00:48:35.300
ability to have natural dynamics. Just an aside, I tend to say, what are your roots? I find it kind
00:48:41.220
of safer. And this is the issue. Aren't there so many landmines now onto which we can easily step?
00:48:47.060
Not if you're called Gad Saad. I mean, I don't give a S-H-I-T. You know, there's a, and I invoke
00:48:55.780
this in chapter eight of the parasitic mind, I say, activate your inner honey badger, right? The
00:49:00.740
honey badger, for the viewers and listeners who may not know, has been literally ranked as the most
00:49:06.340
ferocious and fierce animal in the animal kingdom, which is saying a lot. There are a lot of ferocious
00:49:10.820
animals in the animal kingdom. The reason, it's the size of a small dog. It is so fierce that
00:49:16.980
adult lions will stay clear of it, right? That's why there's the expression, honey badgers don't
00:49:22.820
give an F. I live my life this way, maybe to a fault, and I wish more people would invoke
00:49:29.780
him. You're in the book basically saying evoke your honey badger. Exactly.
00:49:33.220
How do you regard Andrew Tate? I interviewed him in his home in Romania.
00:49:37.940
Yeah, I've never interacted with him. I don't know much about him. I mean, he seems maybe not to my
00:49:44.900
aesthetic taste in terms of his delivery, like kind of the bro stuff. Some of his pictures,
00:49:49.780
and I'm hardly one who can't engage in locker room talk, right? I mean, I was literally a soccer
00:49:55.540
player myself. So it's not as though I'm a dainty, Puritan madam. But he seems a bit too
00:50:02.660
harsh for me, for my taste. He's had a lot of influence, especially on young men.
00:50:06.420
Yes. You know, in the approach to some of the things you talk about, you can't refute facts,
00:50:11.540
like, you know, men are physiologically stronger than women and so on, I think.
00:50:15.540
That's why I was curious in case you'd come across.
00:50:17.060
I mean, he certainly will say some things that are backed by evolutionary science, if not common
00:50:22.580
sense. But I think the style of how he does it is a bit problematic. What did you think of him?
00:50:30.580
Yeah, it was actually interesting talking with him, because again, I think he's used to people
00:50:33.860
engaging him for an argument. And he had to stop and think when I just ask an innocent question.
00:50:40.340
So, you know, I'm going to go back to Donald Trump just for a couple more questions. Joe Rogan,
00:50:46.500
someone with whom you've appeared a number of times now, has been very influential. How much
00:50:51.300
do you think he played a role in tipping the scale in favor of Donald Trump?
00:50:55.460
Huge. As a matter of fact, before he had appeared with Joe Rogan, I was in touch with
00:51:02.660
Trump's people about him coming on my show, which is a much smaller show that, I mean,
00:51:09.780
I'm a popular guy, but I'm not Joe Rogan popular. And when they had asked me, what would be your idea
00:51:15.620
of, you know, what would you want to get out of it? I said, well, you know what I'd love just to
00:51:19.620
sit with him and to humanize him so that the people who are on the fence can actually see he's a
00:51:25.140
funny guy, he's got a great sense of humor, he's a regular, he could, he could, you know, just shoot
00:51:31.620
the blank, right? Well, that's exactly what ended up happening when he went for three hours on Joe
00:51:38.260
Rogan, which of course, at that point, once he's done Joe Rogan, there's no point on coming on my
00:51:42.740
show, he's done the biggest show. But so to your question, I think it really helped to humanize him,
00:51:48.980
because so often his interactions are ones where he's put on the back foot, where he's defensive,
00:51:55.300
whereas here, he's calm, he's relaxed, there's no gotcha stuff with just two guys chatting.
00:52:01.860
I think it probably flipped a few people who were fence sitting.
00:52:05.060
How do you feel he's done so far now, about a month into the presidency as we're talking?
00:52:08.580
I don't think, and I mean, as you said, I'm a bit younger than you. I can't think of a four-week
00:52:15.540
period where a president did more stuff than this guy has done. Is it good stuff?
00:52:21.940
Well, you tell me, is there some bad stuff? I'm asking you really, because I'm essentially just
00:52:27.860
taking the position of getting your perspective on this. So far, I haven't seen anything that has
00:52:32.500
caused me great alarm. He got rid of all the Title IX stuff where
00:52:37.780
biological males can enter female spaces. That's good. He's trying to cut down on open border
00:52:45.460
immigration. I think that's good. He's trying to reduce the excessive waste and corruption
00:52:51.700
through DOGE. That's good. So, so far, I think I'm happy.
00:52:54.900
Okay, because some would say there's been some backfiring, firing all those people in the government
00:52:59.460
who now suddenly have to be re-recruited because it's overdone. The Gaza Riviera, you know?
00:53:04.260
The Gaza Riviera. Yeah, that's a bit of Trump bombast. I'm not sure if I would have approached
00:53:10.340
it exactly the same way. But that, again, is, I think, stylistic, right? He's a great
00:53:15.940
negotiator. It could well be that he says that without literally meaning that because he's trying
00:53:21.460
to angle for a better negotiation position. So I can't pretend to know what's exactly in his mind,
00:53:26.500
but I can see how the optics of such a proposal could irk a lot of people the wrong way.
00:53:32.820
Now, in the parasitic mind, you talked about an analogy where there's an elderly woman being
00:53:38.180
And you could be one of two sorts of people, the one who goes in and helps or the one who kind of
00:53:42.180
walks away and avoids getting involved. And you said, be the first, get involved.
00:53:48.260
Some might say with what's just happened, as we're talking again with the situation with Ukraine,
00:53:52.660
he has now said, I'll go into Ukraine, but if we can have mineral rights, it's like,
00:53:56.420
I'll help you, old lady, but as long as you give me some money from your purse.
00:54:00.180
Except, hasn't there been already a lot of the purse of the United States taxpayer
00:54:05.780
given to Ukraine? So the question becomes, how far do you want to take this? I mean,
00:54:11.140
how much are the American and, frankly, the Canadian taxpayers owing in the protection of Ukraine?
00:54:19.140
If you are a strong isolationist, you'd say, we've already spent way too much money.
00:54:24.660
If you're a globalist, we haven't spent enough money. It depends on your perspective.
00:54:28.900
Talking of those close to the president, Elon Musk is also a fan of yours and your book.
00:54:34.260
And he's certainly, you know, a lot of his decisions. Do you feel that he was shaped to
00:54:39.060
some degree by what you wrote in The Parasitic Mind?
00:54:41.860
Well, there is an article that came out in Wall Street Journal about seven, eight months ago. The
00:54:49.540
title of the article, The Man Who Fuels Elon Musk's Nightmares. Do you know who that man is?
00:54:57.300
So I suspect, I mean, it's hard to answer that question without being immodest, but certainly,
00:55:03.620
I probably played a hand. I missed that in my research. I mean, you said that Elon Musk should
00:55:10.980
Well, probably peace in the sense that, I mean, someone will probably correct me, but I would say
00:55:19.860
probably within a week of him having purchased Twitter, I did a sad truth clip where I said of
00:55:27.460
all the things that Elon Musk has done and anything that he will do in the future, none will be as
00:55:34.340
historically consequential as him having bought X.
00:55:39.380
Exactly. And so, so the fact that he has freed many of us from the oversight of the, in French,
00:55:47.620
you say the bien pensants, the good thinkers, the ones who are better, or as Thomas Sowell would say,
00:55:53.140
the anointed ones. The fact that I could now go on a thing and even use the R word, you know what
00:55:58.260
the R word is? Somebody says something really idiotic. I go, come on, man, are you retarded?
00:56:03.860
So, so that was completely, no, no. I mean, I, the only strike I ever got on former Twitter,
00:56:11.380
I think it was maybe like a seven day violation was because I had used the R word. Real life cannot
00:56:17.700
be sanitized in this way, right? When I was in the Lebanese civil war, I didn't know if the next
00:56:23.140
minute I was going to live or not. And that was my reality growing up. You can handle the R word.
00:56:32.900
So we, we spent a lot of time together. He actually invited me down to meet him at his house.
00:56:39.700
We ended up spending maybe about four hours together. I found him to be incredibly engaging,
00:56:46.180
very, very humble. I mean, we had communicated many times, but that was the first time that we
00:56:51.780
had seen each other in person. Zero ego, just two guys. I mean, we literally hugged,
00:57:00.420
sat there for four hours, completely locked in, sometimes discussing some very personal,
00:57:05.620
intimate things, sometimes discussing about maintaining human consciousness via interplanetary
00:57:12.100
travel. So sometimes it got very, you know, esoteric and philosophical. Other times,
00:57:17.620
as a matter of fact, now I can say it because he's mentioned it himself. He told me that the original,
00:57:22.020
you know, trigger for him that got him into the, the woke mind virus was the reality that he faced
00:57:28.260
with his own son who became his daughter, right? At the time, I didn't know if that was public,
00:57:34.100
so I never had mentioned it, but you know, he, he was confiding in me in a very trusting way,
00:57:39.300
in a very intimate way. So I thought he was just absolutely lovely. I mean, he is an eccentric guy,
00:57:44.580
uh, but I've got nothing but positive affection for him. He got, he got some flak for doing the
Roman salute, which is often compared with the Nazi salute. As someone who's culturally Jewish,
00:57:54.820
how did you feel about that? Well, I, I said, I actually had right away tweeted. I don't know if
00:57:59.300
you saw it. I said, you know, I can't believe that I was able to spend all this time in, in,
00:58:06.740
you know, the grand Nazis house. And as a Jew, I came out alive. Of course, I was mocking the idea
00:58:12.260
that he is, you know, pro Nazi and so on. It's so silly. Come on. A lot of the stuff you write,
00:58:16.900
actually, this was interesting to note, even in your book, that your irony and satire gets lost on
00:58:22.180
a lot of people. Oh my God, it's unbelievable. Uh, so, well, first the power of irony and satire,
00:58:29.220
satire, and sarcasm. So in the book, in The Parasitic Mind, I say that it is akin
00:58:34.420
to the surgeon's scalpel cutting through warm butter, right? The reason why dictators
00:58:41.460
go after the satirists first, they don't go after the guys with the big muscles. Those are easy to
00:58:46.500
handle. But the guy who's got the sharp tongue, the guy who's got the, the sharp pen, hence the pen
00:58:52.420
is mightier than the sword. That's the one that I'm scared of because he's the one who can attack,
00:58:58.020
attack my ideology, my position, right? So he makes me scared. And so, so that's, that's the reason
00:59:04.740
why I use satire and sarcasm. But to your point, what amazes me is how often, usually it's somebody
00:59:10.580
new who doesn't know my style, who I'll, I'll write something that is completely satirical,
00:59:16.580
and then they will just start sending me endless hate mail, not realizing that I was being satirical.
00:59:26.180
Dr. Gad Saad explaining how he believes the parasitic mind is shaping our world today.
00:59:31.540
On the next episode of the conversation I had with him,
00:59:34.100
I challenge his fears that Islam is a danger to the world.
00:59:38.180
Islam, though, is a set of ideas that are codified. And then we could say, does Islam
00:59:44.900
promote greater personal liberties and freedoms? Yes or no? There is only these answers. It's either
00:59:51.140
yes or it's no, right? I would venture that based on historical facts, it doesn't promote greater
00:59:59.460
personal liberties and freedoms. And so I can't be Islamophobic in saying that. It's a nonsensical
01:00:05.380
term. I am an Islamo-realist. Plus, a look at how Suicidal Empathy, also the title of his next book,
01:00:12.740
will drag down the world order if it's not fully resisted now. Join us for that. But in the meantime,
01:00:18.500
don't forget to leave your thoughts and comments, subscribe to the channel, and we'll see you next
01:00:22.660
time. A big thank you from me and the team for watching. I'm Riz Khan. See you soon.