The War on Reason - How Bad Ideas Hijack Our Thinking (The Saad Truth with Dr. Saad_682)
Episode Stats
Words per Minute
170.04
Summary
Dr. Gad Saad is an author, evolutionary behavioral scientist, and professor of marketing who has written over 75 scientific papers and is the host of The Saad Truth, a popular YouTube channel that focuses on issues around religion, freedom of speech, and wokeism. He's also the author of four books, including The Parasitic Mind: How Infectious Ideas Are Killing Common Sense, which explores how the progressive left is successfully moving ideas once considered radical from university campuses into political and corporate arenas.
Transcript
00:00:00.000
Never violate deontological first principles. That defines the beauty of the West. Under no
00:00:10.300
circumstances. There is no "I believe in freedom of speech, but..." The minute that you say "but," you're
00:00:16.160
a cretin, you're a fraud. I came across this quote from you, which maybe sums up almost everything
00:00:21.240
you do. And you said this, any human endeavor rooted in the pursuit of truth must rely on facts
00:00:26.560
and not feelings. For many things in life, we navigate through the world through a consequentialist
00:00:32.120
prism. But when it comes to the pursuit of truth, I argue that you have to be deontological. In other
00:00:37.760
words, you know, we can all be in favor of transgender rights without murdering truth.
00:00:42.480
We don't have to say that men too can menstruate. I could be for individual rights. I could be for
00:00:48.280
fighting for a world free of bigotry without murdering truth. And what neuroparasites do
00:00:53.540
is exactly hijack your neuronal circuitry to the benefit of the parasite. They quietly lead us to
00:01:01.060
the abyss of infinite lunacy so that we say things like, of course men can menstruate. You throw the
00:01:06.080
truth out, then we lose everything. We can still have all of our nobilities. We can have our causes.
00:01:11.480
We can have our empathies. But if it doesn't all come back to some facts, we need to look ourselves
00:01:15.280
in the mirror and say, what are we doing here, guys? Like we're just kidding ourselves. All of these
00:01:19.580
idea pathogens were spawned on university campuses. It's not the fact that there's going to be six
00:01:25.680
foot four guys knocking at my door. It's the fact that in the internal privacy of my mind, I am
00:01:32.000
calculating as a student, should I say something in class or will I be penalized? Or anything else
00:01:38.120
around you. And pretty soon subconsciously you're self-censoring and just numbing yourself. And then
00:01:42.800
now you really are a zombie, which ultimately harms humanity. It harms our truth. It harms our progress.
00:01:48.460
It harms everything. Do we want to live in a world where there's this kind of overlord oversight
00:01:54.140
over our every syllable? I venture not. I am a honey badger. And honey badgers, as you know,
00:02:00.060
the expression is, don't give a F. Live your life that way. And you know what? It's liberating.
00:02:06.160
In Judaism, there's an expression, you never kneel in front of anybody other than God. Now,
00:02:11.160
I'm not a very religious person, but I live my life under that. There is nothing worse for me
00:02:16.720
than to lose a millimeter of my honor or dignity. So in other words, I walk like a 70 foot tall
00:02:26.700
The world is changing. Inspiration is everywhere.
00:02:42.740
It has never been so easy to connect, share, and bring people together.
00:02:52.420
We're learning from others and finding the best in ourselves.
00:03:09.460
Transforming ourselves so we can transform the world.
00:03:22.520
This is London Real. I am Brian Rose. My guest today is...
00:03:35.460
I am Brian Rose. My guest today is Dr. Gad Saad, the author, evolutionary behavioral scientist and
00:03:41.200
professor of marketing. You pioneered the use of evolutionary psychology in consumer behavior
00:03:45.960
and have written over 75 scientific papers. Your hugely popular YouTube channel and podcast,
00:03:51.420
The Saad Truth, deals with issues around religion, freedom of speech, and wokeism.
00:03:56.000
You've written four books, the last of which, The Parasitic Mind: How Infectious Ideas Are Killing
00:04:00.920
Common Sense, explores how the progressive left is successfully moving ideas once considered radical
00:04:06.440
from university campuses into political and corporate arenas. You believe that the West's
00:04:11.580
intellectual freedom is at risk of extinction if we don't stop the spread of dangerous idea
00:04:17.380
pathogens that are killing common sense and our freedom of speech. Dr. Saad, welcome to London Real.
00:04:23.980
Thank you so much for that lovely invitation. I'm so glad that we are finally able to make it. So I
00:04:29.440
really appreciate the opportunity to chat with you. Now, I'm really looking forward to it. And,
00:04:33.960
you know, I came across this quote from you, which maybe sums up almost everything you do. And you
00:04:39.240
said this, any human endeavor rooted in the pursuit of truth must rely on facts and not feelings.
00:04:46.560
It's a very interesting statement. You know, I graduated in 1993 with a degree in engineering
00:04:52.620
from MIT. And I was taught that it all came down to the scientific method to prove things.
00:04:58.280
And now, 30-some years later, it seems like the world has changed dramatically. And I really want
00:05:04.460
to get your thoughts. Also, this is personal because my stepdaughter just started school in Boston
00:05:09.440
at a university there. She just turned 19 today. And so I'm deeply concerned that she's about
00:05:16.700
to go into a world that I don't even understand, that I can't control. So you tell me, where do we
00:05:22.560
start, doctor? Right. Well, I guess maybe we could start with my most recent experience at USC. So last
00:05:32.140
week, I was invited to speak at a one-day event celebrating the 10-year anniversary of a center
00:05:40.680
at USC. The center is called the Center for Economic and Social Research. And it was a day to celebrate the
00:05:47.620
values of the Enlightenment. And so I decided that I would deliver a lecture on something that is very
00:05:57.840
relevant to the quote that you introduced the segment with, you know, the dichotomy between,
00:06:05.360
you know, pursuing truth versus feelings. But I'm going to put it in more formal terms. So
00:06:10.060
I talked at USC about deontological ethics versus consequentialist ethics. And so let me spend a minute
00:06:17.640
or two explaining what that is. A deontological statement would be an absolute statement. So for
00:06:23.740
example, if I say, it is never okay to lie, that would be a deontological statement. On the other
00:06:29.640
hand, if I say, it is okay to lie if I wish to spare someone's feelings, that would be a consequentialist statement. So if your spouse asks you,
00:06:37.320
do I look fat in those jeans, and you wish to have a long marriage, you might wish to put on your
00:06:42.800
consequentialist hat and say, no, you've never looked more beautiful. So for many things in life,
00:06:48.640
we navigate through the world through a consequentialist prism. But when it comes to
00:06:54.340
the pursuit of truth, I argue that you have to be deontological. In other words, there is no
00:07:00.760
freedom of inquiry, but dot, dot, dot. There is no freedom of expression, but dot, dot, dot. In other
00:07:07.380
words, I don't need to worry about what the downstream consequences of my research might be,
00:07:15.440
right? So for example, if I'm studying the difference between religious groups in terms
00:07:20.260
of their proclivity to commit terror attacks, well, some would argue that, well, you shouldn't do this
00:07:26.740
kind of research. Because even if that research were to prove veridical, you would be, quote,
00:07:32.560
marginalizing this group. And therefore, you shouldn't do it on consequentialist grounds. Whereas
00:07:37.920
I argue, no, that's a very dangerous, slippery slope. And so I tried to take that message to the
00:07:42.980
people at USC in speaking about your stepdaughter and the vipers' den that she's about to enter.
00:07:49.820
And you would have thought that, you know, an esteemed place like USC would be very open to
00:07:57.920
what I consider to be very, very non-corrosive, you know, statements that I'm making that defend
00:08:05.160
freedom of inquiry, freedom of speech, individual dignity, personal agency, the scientific method.
00:08:10.860
And you should have seen the hostility that I faced. Now, the hostility was not that they tried to shout me down;
00:08:18.040
they didn't throw tomatoes at me, it wasn't a physical confrontation, but the level of either
00:08:24.220
overt hostility or passive-aggressive hostility was striking. So I was there for the whole day. And usually,
00:08:31.060
you know, people will come up to me, if not to take selfies with me or so on, they at least engage me,
00:08:38.140
it's warm, it's academic. The manner by which people were turning their backs on me, as though I
00:08:44.420
were this, you know, I was basically indistinguishable from Himmler, was really laughable.
00:08:49.900
Two people were very, very angry that I proposed something that was truly, you know, beyond the pale,
00:08:57.120
because I argued that, you know, we can all be in favor of transgender rights without murdering
00:09:02.360
truth. We don't have to say that men too can menstruate, and that men too can bear children
00:09:08.740
in the service of protecting your rights. And people thought that those were very, very,
00:09:15.980
very dangerous statements, so much so that one person argued that that kind of speech should be
00:09:22.240
regulated by the government. So if I am at a USC conference, speaking about the values of the
00:09:29.680
enlightenment, and a statement such as men cannot bear children can cause such a reaction,
00:09:36.940
we're down a pretty, pretty dark abyss. Well, that used to get you canceled on Twitter before Elon bought
00:09:43.020
it. And so I guess this has been going on for a while. A quick question for you. Were you getting
00:09:47.700
that from the students or from the faculty as well? And do you think it being in California,
00:09:52.140
where I was born, and where you spend time as well, played a role in that?
00:09:57.380
Of all of the hostile crowd that I faced in the Q&A, I think only one looked young enough to be a
00:10:05.780
student. And some of them I actually recognized. The first one that spoke is actually a very respected
00:10:11.740
psychologist who was so angered by me that his voice was quivering. I mean, you know, when someone is
00:10:19.060
incredibly angry and nervous, they have a hard time catching their breath. That's how
00:10:25.040
triggered they were. So no, it may or may not surprise your audience that these weren't students who
00:10:33.340
were triggered by me. They were actual faculty members, if not research scientists.
00:10:38.840
Sorry, to answer your question about California: I don't doubt that there is an element
00:10:46.200
of wokeness that is coming from the fact that USC is in California. But believe me, all of the
00:10:52.780
universities throughout these beautiful lands have fallen to these idea pathogens. So I could have
00:10:59.580
been anywhere else. I mean, I've, you know, I've given talks at University of Michigan, where, you know,
00:11:04.920
people were just as hostile. So I don't think it is something that is unique to California.
00:11:09.600
See, I don't think it's actually hit us over here in the UK yet, doctor. You know,
00:11:13.840
I see these things on Twitter and YouTube. And for me, it almost looks like I'm watching a
00:11:18.380
movie. It doesn't seem real. But I'm sure I'd feel it if I was in that classroom with you.
00:11:23.500
Does it feel sometimes that this is almost a different species to you? It's almost like a
00:11:27.920
completely different type of animal that can get, I don't know, agitated, or not listen to reason.
00:11:34.280
Because you seem like the kind of man that comes in and you're like, doesn't everyone see this is
00:11:39.020
logic? This is the scientific method. This is why we came so far in 500 years.
00:11:43.840
And it seems like you're almost speaking a different language to this new species.
00:11:48.720
You know, the incredulity that you're attributing to me is exactly how I feel. So despite the fact
00:11:56.000
that I've been navigating in this ecosystem for close to 30 years as a professor, despite the fact
00:12:01.180
that I've written The Parasitic Mind, you would think that there is nothing that could surprise
00:12:05.180
me. My family, by the way, had joined me at the USC event. So my children
00:12:12.720
and wife were in the audience. I have young children who were uniquely disturbed by the hostility that
00:12:20.680
they saw their father, you know, being exposed to. And so as we left, and we were walking around,
00:12:26.400
I said exactly what you said, without saying that they are a different species. I said, you know,
00:12:30.960
I can't believe that I can trigger the type of hostility that I just, you know, was the recipient
00:12:39.720
of for saying things that should be so banal and obvious, right? I mean, you would think that we
00:12:47.340
had won that debate, you know, hundreds of years ago. We had, you know, the scientific
00:12:52.080
revolution. We had the renaissance. We had the values of the enlightenment. We were coasting along so
00:12:58.960
beautifully. And now to say things like, you know, men can't bear children is something that should be
00:13:04.200
regulated by the government. By the way, I'm really hopeful that USC, they've promised me that they
00:13:09.960
will be sending me the files of the day's event, which of course, when they do, if they do, I will be
00:13:16.920
posting. So hopefully, everybody could see the kind of lunacy. So yes, I am baffled by it. But I think
00:13:23.380
that exactly speaks to the key premise of The Parasitic Mind, which is, you know, we attribute,
00:13:29.540
you know, the faculty of reason to human beings. And yet, just like countless other
00:13:35.920
animals in the animal kingdom, we could be parasitized by brain worms. In this case, the brain
00:13:41.280
worm is not an actual physical brain worm. That's why I call them idea pathogens. But they end up doing
00:13:46.880
the exact same thing as other parasitic infestations that go to the brain, which is they completely
00:13:53.760
zombify you. They take supposedly intelligent people and make them behave in ways that are
00:13:59.520
truly baffling to the human spirit. Now, I wasn't there. And I'm curious, if I was to try to get into
00:14:05.040
their minds, what are they thinking? Like, what are the kind of things they say back to you? I'm sure it's
00:14:10.280
not one thing, but I'm sure it's probably about three different things. But what are the things that
00:14:14.580
are being uttered from their minds? Is it: cultures are being oppressed, and we don't want to upset them,
00:14:18.540
we want to repair things for them, we want to take away our privilege? And what is coming from them?
00:14:23.920
Because obviously, they think they're right. So it's interesting to me to find out why they think
00:14:28.520
they're right. Yeah, that's a beautiful question. I thank you for it. So there are several
00:14:33.120
elements. But, off the top of my head, one that doesn't capture the full phenomenon, though we can
00:14:39.300
drill down further, is that they are driven by a sense of what I call faux empathy, right? And
00:14:46.580
it perfectly speaks to the distinction between what I set up originally in our conversation between
00:14:53.280
deontological ethics and consequentialist ethics. If I am a pursuer and defender of truth
00:15:01.160
in a deontological way, then I'm never encumbered by, as I said, the downstream effects. You
00:15:10.000
know, if I say that men cannot bear children, that is a statement that is fully founded in truth.
00:15:18.000
Therefore, the fact that there might be 0.01% of people who suffer from gender dysphoria, who might
00:15:24.840
be hurt by that statement, is of no consequence to me, not because I'm a mean person, but because my desire
00:15:32.640
to defend truth, and to navigate through reality, supersedes my desire to be fully empathetic toward
00:15:40.660
you. Now, that doesn't mean that I'm not a very empathetic person. I'm sensitive and
00:15:46.240
sentimental to a fault, but my desire to defend truth supersedes that. So I think each of these idea
00:15:53.520
pathogens, which we can, I suspect, get into during our conversation, so postmodernism, cultural
00:15:59.640
relativism, social constructivism, they're all different variants of parasitic idea pathogens,
00:16:07.800
but they share one thing in common. They start off with a noble cause, and then in the service of
00:16:14.820
that noble cause, you're willing to murder and rape truth. And I argue, no, I can chew gum and walk at
00:16:21.360
the same time. I could be for individual rights. I could be for fighting for a world free of bigotry
00:16:27.080
without murdering truth in that service. And so let me give you a concrete example. Equity feminism is a
00:16:33.660
great idea that most reasonable people can support, which is that men and women should be equal under the
00:16:39.500
law. There shouldn't be institutionalized barriers for either sex to, you know, flourish. So
00:16:46.200
that's great. But then militant feminists took this idea and said, well, wait a second, no,
00:16:52.980
in the pursuit of eradicating quote, patriarchal sexist ways, we now need to promulgate the idea
00:17:01.040
that men and women are indistinguishable from each other. Men and women do not have any evolved
00:17:07.360
biological based sex differences, because by promulgating that utter nonsense position,
00:17:12.920
then it will better allow us to fight against the sexist status quo. So you could see
00:17:18.880
again, how it's consequentialist: in the pursuit of an otherwise noble goal, I'm willing to fudge the
00:17:25.180
truth in the pursuit of that goal. So to answer your question, in a hopefully not too long-winded way:
00:17:30.220
they're all coming from what originally seems like a noble place. Why are you marginalizing the noble
00:17:36.860
and peaceful religion of Islam with your facts, Dr. Saad? Why are you marginalizing? Why don't you
00:17:43.700
instead say that all religions have both good and bad people, and all religions could potentially lead
00:17:50.420
to terrorism? Well, in an incredibly banal way, that is potentially true. But you might imagine that
00:17:56.800
extremist Jains, people who practice Jainism, who walk around, you know,
00:18:06.460
sweeping the floor in front of them, lest they might literally crush an ant, that's how committed
00:18:13.040
they are to never harming any biological organism. You might imagine that
00:18:20.240
extremist Jains might be less likely to engage in terrorist acts than other religions. But according
00:18:27.620
to the noble people at USC, that would marginalize people. So shut your mouth and go la la la, I don't see
00:18:34.120
the truth. So I think that's the reason why they do what they do. Right. So I think you explained
00:18:37.900
that really well with the militant feminists. They believe in this concept so much that they're
00:18:42.780
willing to place it above the truth. And coming from the scientific method, or a science
00:18:48.580
system where I was kind of born as well, you're like, but if you throw the truth out, then we lose
00:18:52.960
everything. We lose our sanity. And you're like, so even though it might offend people, we always need
00:18:58.760
to honor the truth first. We can still have all of our nobilities. We can have our causes. We can
00:19:04.100
have our empathies. But if it doesn't all come back to some facts, we need to look ourselves in the
00:19:08.300
mirror and say, well, what are we doing here, guys? Like we're just kidding ourselves.
00:19:12.400
Perfectly right. Exactly right. And that's what frustrates me so much, because
00:19:16.300
the reality is I'm probably more sentimental and sensitive than all of these people. That's why I call it
00:19:23.440
faux empathy, because when you drill down with a lot of these people, they really are
00:19:28.860
authoritarian, but they protect themselves with the cloak of so-called progressivism, but nothing in
00:19:35.600
what they do is actually consistent with truly classical liberal values. And that's why I get so
00:19:41.920
frustrated. So to go back to your earlier question about, you know, do I think that they're aliens?
00:19:45.520
I look at them with incredulity, as though they are aliens, because how could they? So the first
00:19:51.740
gentleman who spoke, as I said, is a very well-known psychologist. I mean, if you look
00:19:56.580
objectively at his scientific output, you'd say, I mean, obviously this person is not, you know,
00:20:02.960
a babbling idiot. I mean, he's a very accomplished psychologist. He used to be at University of
00:20:08.300
Michigan. Now he's at USC. He certainly has had a wonderful esteemed career, and yet he is fully
00:20:14.920
parasitized. He's a complete zombie. And I think, again, that's the power of the metaphor of
00:20:21.620
using this neuroparasitological model, because what neuroparasites do is exactly hijack your
00:20:28.900
neuronal circuitry to the benefit of the parasite, right? And that's what these ideas do. They quietly
00:20:36.520
lead us to the abyss of infinite lunacy so that we say things like, of course, men can menstruate.
00:20:42.460
Of course, you can have six foot four biological males competing with five foot one biological women,
00:20:50.140
and there are no innate differences between these two. Men don't have any morphological, physiological,
00:20:57.180
anatomical, hormonal, behavioral differences from women. This is not me satirizing their positions.
00:21:04.240
These are their official positions. So when we've gotten that far, that far along the abyss,
00:21:11.280
Doctor, can you tell me where all this happened, when it happened, and why? Because
00:21:18.120
I don't remember this being an issue. Okay, I was at MIT. We didn't exactly talk about
00:21:22.380
social causes. We were in the lab, and they were just beating me until I could figure out my
00:21:27.000
differential equations. But I don't remember it in the early 90s. When did this come about? Why did it
00:21:32.840
come about? I have a guy that's been on my show who was a 77-year-old man who's got a castle in
00:21:38.380
Scotland. He's a high-performance coach, extremely, extremely politically incorrect. And he said
00:21:44.600
that political correctness is a manifestation of a lack of self-esteem. He's been saying that for 20
00:21:50.900
years. And maybe that's just one part of it. But where do you think this came from?
00:21:57.220
So again, fantastic question. Because if you wish to understand, I mean, in the exact same way that
00:22:03.460
we, you know, tried to understand where the COVID virus originated. And in the pursuit of that
00:22:09.680
objective, people were tarred and feathered as racists, because you can't even talk about the geographic
00:22:14.980
origin of an actual biological virus anymore, lest it might marginalize people. So to your point,
00:22:22.940
you know, where did these idea pathogens begin? Why did they begin? Well, of course,
00:22:27.200
I'm here to tell you that in a slight rephrasing of George Orwell's famous quip, it takes intellectuals
00:22:36.620
to come up with some uniquely dumb ideas. And so all of these idea pathogens were spawned and
00:22:44.760
began their proliferation on university campuses. So that explains the origin. Now, when did they come
00:22:51.440
about? It depends on the specific idea pathogen. So cultural relativism, the idea that, you know,
00:22:58.160
we can't really judge a culture unless we judge it in its own context. There are
00:23:05.120
no human universals. Each culture is a microcosm of its own reality. That started, you know, almost
00:23:13.340
80, 90 years ago with Franz Boas, a cultural anthropologist at Columbia, and then his students
00:23:19.860
promoted that. Now, the reason why I call it an idea pathogen is because it
00:23:25.740
is perfectly incorrect. All cultures are not equal. Cultures that cut off the clitorises of little
00:23:33.440
girls are not equal to cultures that don't do that. Cultures that throw off gays from rooftops are not
00:23:41.180
equal to cultures that don't. So the idea that all cultures are equal is insanely laughable. Okay, but
00:23:48.300
so let's talk about why that idea pathogen began. Well, in the social sciences, one defines their
00:23:57.200
identity based on their abdication of biology, right? So if you are a sociologist, or an economist, or
00:24:04.280
most psychologists, or an anthropologist, unless you're a biological anthropologist, you share a
00:24:11.500
common disdain for using biology to explain human behavior. And that's something that I have faced
00:24:17.940
from the start of my career when I tried to Darwinize the behavioral sciences in general, and the
00:24:24.360
business school in particular, right? So I came in in 1994 as a fresh young assistant professor, where I
00:24:31.020
was arguing, well, how could we study consumer behavior and economic decision making and employee
00:24:35.760
behavior and employer behavior without ever recognizing the biological and evolutionary mechanisms
00:24:41.280
that drive our behaviors? Surely, as consumers, we don't somehow exist in a plane outside the purview of
00:24:47.560
our biological heritage. And that was viewed as, this is insane. What kind of nonsense are you saying,
00:24:52.960
Professor Saad? So cultural relativism arose precisely because, again, it came from a, quote,
00:24:59.420
noble place, which is, there were all sorts of miscreants and cretins that had misused evolutionary
00:25:07.160
theory. Think about the British social Darwinists; it had nothing to do with Darwin, but they
00:25:13.360
usurped Darwinian theory to say, hey, there's a natural struggle between the classes. We're the
00:25:18.000
upper class. You're the lower class. Hey, that's just nature. So if you die out
00:25:23.500
in your polluted environments without any education, hey, that's just nature. Then the eugenicists come
00:25:28.580
along and say, hey, that's Darwinian. If we sterilize gay people so that they can't reproduce,
00:25:32.640
hey, that's just natural. That's Darwinian. The Nazis came along and said, hey, it's a natural struggle
00:25:37.140
between races. We're the Aryans. We win. Hey, that's just Darwinian. And so the cultural
00:25:41.800
anthropologist said, well, we need to find a way through consequentialist ethics to create new
00:25:48.340
edifices of reason bereft of biology, because then that could, you know, stop future scientists or
00:25:56.660
politicians or, you know, demagogues from misusing biology. So that would be a long-winded answer of how
00:26:04.060
that particular idea pathogen came to be. Postmodernism, on the other hand, is a
00:26:10.340
bit, you know, younger. So about 40, 50 years ago, a whole bunch of French imbeciles, Jacques Derrida,
00:26:18.400
Michel Foucault, Jacques Lacan, and then others, developed the idea that, you know, there is no
00:26:24.480
objective truth. Language creates reality. That's called deconstructionism by Jacques Derrida. To speak
00:26:30.360
of a universal truth is insane, because we're always bound by our idiosyncrasies, by our personal
00:26:36.920
biases, by subjectivity. So there is no capital T truth. So that's why, by the way, I refer to
00:26:42.720
postmodernism as the granddaddy of all idea pathogens, because you can't wake up as a scientist
00:26:48.880
in the morning if you were a postmodernist, because scientists, while they recognize that all truths are
00:26:55.080
provisional, meaning that what was true 50 years ago may no longer be true today and science is
00:27:01.040
self-correcting, do operate under the premise that there is a truth out there to be discovered.
00:27:06.280
Well, postmodernists say, no, no, no, there is no such thing. And so depending on the idea pathogen,
00:27:12.880
the timing of it is slightly different. But I would say between 40 and 80 years ago is when most of them
00:27:19.880
happened. And then it takes a while for them to simmer. And then they all come together and slowly erode
00:27:26.360
our edifices of reason. And that's where we are today.
00:27:29.940
Has social media accelerated it, because of the monkey-see-monkey-do effect, or because everyone can publish
00:27:36.280
ideas? And will it continue to accelerate it?
00:27:41.840
So social media, in and of itself, can be used for good or bad. In the same way that dreadful
00:27:49.840
ideas could spread, to your point, one could easily argue, but then why can't people who've
00:27:55.380
got the good ideas also use social media as a tool to spread better ideas? So I don't think it's
00:28:01.640
necessarily social media as an inherent system that spreads bad ideas. Now, what you could argue
00:28:07.940
is that there are inherently more, you know, blue haired Taliban activist types who use social media,
00:28:15.720
and therefore they end up being overrepresented on that medium, but it's not inherently the medium.
00:28:22.200
Now, the other thing you could also argue is that, until Elon Musk came along and bought
00:28:27.400
Twitter, to the extent that those social media companies are run by supremely woke people who
00:28:34.540
banish others who may not share their views, then to your point, you're exactly right.
00:28:39.940
But inherently, in principle, social media could be used for good or bad. I would never
00:28:47.060
have built the platform that I have had it not been for all of the tools that are afforded to
00:28:53.320
me through social media, right? I mean, you know, who could have thought that when I first started my
00:28:57.900
YouTube channel, right? I mean, you would think I'm, you know, I'm a professor, I publish papers
00:29:01.940
in peer-reviewed journals, you get excited as an academic if your paper gets cited 100 times 10 years after
00:29:09.480
you publish it. Well, I can get on Joe Rogan's show, and there have been 20 million
00:29:15.980
downloads. How does that compare in terms of the speed and rapidity of the information? So again,
00:29:23.100
social media is not inherently nefarious. It's the people who control social media that we have to
00:29:29.360
All right, well, then let's talk about that phenomenon. Probably the best way to do it is to
00:29:32.740
talk about the Twitter files, which were revealed after Elon decided to buy Twitter, which some people
00:29:39.220
said was not a company; he was actually buying a crime scene. And I've watched the Twitter files
00:29:43.900
come out. I kind of have a vested interest in that because we were censored pretty heavily
00:29:48.100
in April of 2020. And I can get into that later. But what do you see being revealed in the Twitter
00:29:54.660
files? Like you said, I think it's been proven that 97% of the staff members there all either voted or
00:30:02.320
donated to the Democratic Party. So I think there have been analytics showing they leaned a certain way,
00:30:07.180
even though everyone kind of knew that. What do you see there, doctor?
00:30:10.920
Yeah, right. Well, actually, in The Parasitic Mind, there's a section where I break down the
00:30:15.720
political donations, you know, Democrat to Republican across not just Twitter, across all of the major,
00:30:22.140
you know, social media players. And it is simply astounding. I mean, it's just, you know,
00:30:26.300
it's not 70-30, right? It's 99.2 to 0.8. It's 97 to 3. It's at that level of bias. So you're exactly
00:30:37.040
right. Everybody already knew it. I think all that you're going to get through the analysis of the
00:30:42.060
Twitter files is that people are going to have new hard data to support what everybody already knew. So
00:30:47.900
it's certainly not a useless exercise to go through it. But I doubt that you will see anything that most
00:30:54.380
sane people could not have already told you: I told you so, I already knew that. Because, I mean, I saw it just,
00:31:00.600
you know, I mean, luckily, I was never banned from Twitter. But I very much noticed that my voice seemed to
00:31:07.620
never get amplified. I mean, I, you know, I was completely stagnant for a very, very long time. Then Elon Musk
00:31:14.300
came along. And suddenly, the explosion of subscribers I had, I mean, literally, from one day to the next was just
00:31:22.060
breathtaking. So I don't think we needed the Twitter files to know what was happening. But it certainly
00:31:28.200
helps in terms of building a case of how dangerous it could be to have these companies who control all
00:31:34.160
of our informational flow being lopsided. Have you gone deep on the Twitter files and looked into some
00:31:39.560
of the things surrounding the FBI and the CIA or some of the other repressions or shadow banning? Have
00:31:45.560
you gone into that? I haven't. The most that I've done is, when they would release,
00:31:50.860
you know, whichever journalist was handling a particular, you know, instantiation of the
00:31:56.000
Twitter files, when they would release a long thread, right, a 30-tweet
00:32:01.760
thread, I would just read those, but I haven't gone into a, you know, a deep dive into the
00:32:08.860
files. Are you asking because you haven't, there's something that you'd like to discuss?
00:32:13.280
Yeah, well, so I've kept an eye on it. I'll just tell you quickly our story. And I want to get your
00:32:16.840
thoughts and maybe what happened with us. So doctor, I started this show in 2011. I was inspired
00:32:21.880
by Joe Rogan. I was an early fan of the UFC and Brazilian jiu-jitsu. So I was watching his
00:32:27.380
show even 13, 14 years ago when it was just on Vimeo, wasn't even on YouTube. So I was a fan and
00:32:33.660
I got jaded with the banking industry and I just decided I was going to start my own version. And we
00:32:39.340
were having long form conversations, unedited, unscripted. And I did this for nine years, got up to
00:32:45.800
2 million subscribers on YouTube, half a billion views, published about 10,000 videos on the YouTube
00:32:51.100
platform. I was a partner. I'd gone down to the headquarters, obviously getting paid ad revenue.
00:32:56.780
And on April 6th of 2020, in the middle of the pandemic, I had a guest on my show that was asking
00:33:03.220
questions about masks, about PCR tests, about if we were doing the right things. It was the second
00:33:09.800
largest YouTube live stream in the world that day. 65,000 concurrent viewers. The largest was
00:33:16.420
President Trump's coronavirus briefing at the White House that night. So he just beat us out
00:33:21.160
just by about 20,000 live viewers. And based on a previous episode I'd had with this guest that
00:33:27.680
had been watched about 10 million times, the live stream was 4X. So we extrapolated this would have
00:33:33.340
been watched 40 million times, this video podcast. Wow. Which at the time would have been the most
00:33:38.800
watched podcast in history because Elon smoking weed on Rogan was about 25, 30 million. So this
00:33:46.200
thing was going to go to Pluto. And it was an amazing conversation. Again, it's me and a 67-year-old man
00:33:52.260
having a calm conversation like this. We didn't incite violence. We didn't incite political overthrow.
00:33:57.020
We were asking questions when, funny enough, nobody else was. 30 minutes later, doctor,
00:34:02.620
YouTube deleted and banned that video. And it was the first time that ever happened to me in nine
00:34:07.340
years. I thought the weirdos got censored, you know, not me. And it started off this massive fight
00:34:14.320
I had with YouTube and all the other technology platforms about what I could publish. And I was
00:34:20.380
right on the cutting edge of it. And we're about to release a documentary film
00:34:24.880
about everything that happened. But just to finish the story, I ended up getting two strikes,
00:34:30.740
almost had my entire channel deleted and disappeared, which by the way, was also nine
00:34:36.360
years of my life. It was my digital identity that was being taken away. I got removed from Dropbox,
00:34:43.100
which is interesting because I didn't know they would watch my content, but they took my account out.
00:34:47.160
PayPal suspended my account with over $100,000 in there. Obviously, shadow banned by Instagram,
00:34:53.120
Facebook, LinkedIn dropped me. Like I said, a bunch of other things systematically blocked me.
00:34:58.900
And when I said, who's looking out for my right for freedom of speech? And I looked at the
00:35:02.620
governments, they all pointed the finger. And I know you talk about this in your book. And they
00:35:06.700
said, no, no, no, that's a private company. And you tick the box on their private policy.
00:35:11.480
So you got to go talk to them. And I was like, well, when you lock us in our houses, this feels like
00:35:16.620
a public utility. And there was actually some precedent here in the UK that when a company acts as a public
00:35:21.480
utility, it might have to follow the law of the land. But when I finally looked at the Twitter
00:35:25.700
files and I saw that the FBI and the CIA were going into Twitter and suppressing the right to
00:35:32.580
free speech, I thought, wow, here I am asking my government to protect me. They point the finger
00:35:37.740
at the private sector. And then meanwhile, they go in the back door and violate my rights by telling
00:35:42.500
the private sector what to do. And I'm not a conspiracy theorist. I leave that to my guests,
00:35:47.120
but it also possibly makes sense. And I'm waiting for the YouTube files for a reason why I might
00:35:53.180
have been systematically de-platformed across a lot of different technology companies at once,
00:35:59.280
if it was something like an FBI, CIA, Senate subcommittee. So that's what happened to me.
00:36:04.340
Final point, we ended up creating our own platform and we pulled off on May 3rd of 2020,
00:36:10.840
the largest live stream of a human conversation on our own platform. And I'll be honest,
00:36:15.520
it was kind of a big fuck you to the tech platforms to show them that we could do it ourselves.
00:36:20.140
And so that's how it kind of ended, didn't really end. So that's what happened to me. What are your
00:36:25.560
thoughts on that, doctor? Wow. Well, thank you for sharing that. And I'm really sorry it happened to
00:36:29.800
you, but I wish I could say I'm surprised. But to me, it's like you just described every Tuesday of
00:36:36.480
anyone who says anything that those overlords don't agree with. Now, again, I've never been kicked
00:36:42.660
off in the way you were and the strikes and so on. But first of all, let me just say this.
00:36:47.680
The story that you just told, had you told it at USC when I was there, I can assure you
00:36:55.060
that you wouldn't have had one empathetic or sympathetic ear. To the contrary, they would be,
00:37:00.920
well, they should have gotten rid of you even earlier. It's amazing that they let you stay there.
00:37:06.120
And again, they would have said so precisely using a consequentialist ethic, which is it doesn't
00:37:13.080
matter whether what you and that gentleman were discussing could potentially be true. As far as
00:37:19.340
they were concerned, it was too dangerous given public policy measures for you to be holding such
00:37:25.200
conversations. So, I mean, the Nazis used the consequentialist ethic for why they needed to get
00:37:30.500
rid of, you know, those vermin Jews, right? So, the idea of using consequentialism to support your
00:37:36.480
otherwise, quote, noble cause is an indelible part of the human spirit. So, it doesn't surprise me.
00:37:42.860
Now, in my case, speaking of which, I think the conversation you were having with that gentleman was about COVID, correct?
00:37:49.420
So, let me tell you, and I mentioned this, by the way, at USC. So, and I don't know if this was the
00:37:55.480
gentleman that you talked to, but do you know who Matt Ridley is? I just had him on my show,
00:38:01.480
actually, about three, four weeks ago. Okay. Well, so Matt and I know each other well. He's been on my
00:38:07.320
show, I think, twice. He actually provided a blurb for The Parasitic Mind. Just a lovely chap,
00:38:17.540
trained in evolutionary biology, and so on and so forth. House of Lords. So, certainly not a quack guy.
00:38:23.600
His team reached out to me when his last book came out, where he, you know, he was discussing
00:38:31.180
the possibility of, you know, the lab leak theory and so on. And I said,
00:38:38.780
I'm ashamed that I'm admitting this, and I said this at USC. At the time that they reached out to me,
00:38:44.980
I told his folks, I can't remember who it was, the publicist or the publisher or whatever. I said,
00:38:50.480
look, I'm perfect. I mean, you know, I'm the ultimate honey badger, right? I mean, I get a
00:38:55.240
million death threats from people that are a lot scarier than, you know, the mask orgy folks. But
00:39:01.460
let's be pragmatic. If we hold the chat, here's what's going to happen. It's going to be removed
00:39:09.800
14 seconds after we post it. In the best case scenario, I get a strike
00:39:16.560
against my channel. In the worst case scenario, I'm taken off. And just like you, you had spent,
00:39:22.560
you said nine years. I started in 2014. I'm going to lose all the years of effort that I've built to
00:39:29.840
build this inventory of content. And so I said, look, it seems like reckless martyrdom to just do
00:39:37.540
it. So again, it's not that I'm afraid to hold difficult conversations, because I've
00:39:42.500
taken positions that are astoundingly more, quote, dangerous than discussing the lab leak
00:39:48.320
theory, but pragmatically, I know it's not going to lead us anywhere. Now, you can't imagine how
00:39:54.000
much that kept me up at night because what was happening is in my pragmatic understanding of how
00:40:00.520
the overlords control us, I was acquiescing to the fact, you know what, let's not have the chat.
00:40:06.820
It's not going to lead anywhere. And I felt dirty. I felt disgusted. I felt cowardly,
00:40:12.160
even though I was being too harsh on myself, because as Aristotle explained to us, you know,
00:40:17.720
being a cowardly soldier is bad, but being a reckless martyr of a soldier is also bad.
00:40:24.620
There is the golden mean. And so I was kind of beating myself up because I was saying, well,
00:40:30.340
why am I being pragmatic in my calculation? But the reality is I know what would have happened.
00:40:36.100
The exact same thing that happened to you. Now, is this the kind of world we want to live in
00:40:40.200
where someone like me or someone like you decides, you know, should I even have this
00:40:45.280
conversation? By the way, that's the biggest danger, and I discussed this in
00:40:49.160
The Parasitic Mind. It's the calculus of self-censorship. That's the most dangerous part. It's not the fact
00:40:55.260
that there are going to be six-foot-four guys knocking at my door. It's the fact that in the
00:41:00.900
internal privacy of my mind, I am calculating as a student, should I say something in class or will
00:41:08.520
I be penalized? Should I host Matt Ridley, or will my YouTube channel be taken down? That's how freedom dies bit by
00:41:16.940
bit. And so I'm truly sorry for what you went through, but I'm glad that you've come back bigger
00:41:21.380
and stronger than ever before. Yeah, I'm thankful for it now, but it was a real defining moment in
00:41:26.040
my life. And I think, I don't know if it was my upbringing in the States, but my initial reaction
00:41:30.680
was one of anger when that happened. And so we just fought back and just decided to push it. But
00:41:37.340
the self-censorship is the most dangerous thing because you start doing it at a subconscious level
00:41:41.960
and it's that parasite which you were saying, and that might be about your wife in the jeans
00:41:46.940
or anything else around you. And pretty soon subconsciously you're self-censoring and just
00:41:52.000
numbing yourself. And then now you really are a zombie and you walk around and you can't say what
00:41:56.920
you think, which ultimately harms humanity. It harms our truth. It harms our progress. It harms
00:42:02.480
everything. And to watch it happen on a social media channel is fascinating because first they take away
00:42:08.560
your shares and your virality. Then when you're still a bad boy, then they demonetize you. Then when
00:42:15.140
you're still a bad boy, they give you a strike. And then when you really don't want to listen,
00:42:18.280
they take you off the platform. And these little things start to just really make you smaller and
00:42:25.200
smaller and smaller. And it works. Can I, can I give you a few examples that I've had? So
00:42:30.700
on LinkedIn, I've had countless posts removed for insane reasons. So for example, if I criticize
00:42:39.560
some policy of, you know, Joe Biden's administration, I will receive a notice: this
00:42:48.300
tweet, this post has been removed because it is considered, you know,
00:42:53.820
unprofessional, harassment and bullying. So in a free society, never mind that I'm a professor who
00:42:59.760
weighs in on these issues. If I'm just a trucker, anybody, I cannot directly criticize a
00:43:07.280
policy by the president of the United States, because that's considered harassment and bullying
00:43:12.600
to whom? To Joe Biden. Right. So I've had countless of those on LinkedIn. On Facebook,
00:43:19.480
I've had several strikes where they say, you know, you're about to lose your Facebook page
00:43:26.380
forever, because I will take a screenshot of a death threat that I received. For those of you who
00:43:35.400
don't know, who are listening to me for the first time, we are Lebanese Jews. I'm Jewish from Lebanon,
00:43:41.060
from the Middle East. So I received many death threats of the, this is how we're going to boil
00:43:46.600
you Jew. This is how we're going to put you in the oven Jew. We like you. So you'll be the last
00:43:51.320
to go in the oven and all kinds of stuff. So I will take a screenshot of it, especially when the person
00:43:57.080
is stupid enough to identify who they are or their email, and I will advertise it. I will get
00:44:03.640
banned for spreading hate. So then I try to write to Facebook. I say, I'm just describing
00:44:12.020
the death threat that I'm receiving. Now, if you want to be charitable, you might say, oh, but that's
00:44:17.700
just an algorithmic glitch, or that's just, you know, the person who is reviewing
00:44:23.180
things is not paying attention. But again, do we want to live in a world where there's this kind
00:44:27.880
of overlord oversight over our every syllable? I venture not.
00:44:33.260
No, we don't. And yeah, sorry for the fact that you get those death threats. They're not nice.
00:44:38.380
I ended up running for mayor of London afterwards, and I got them too, which isn't a nice thing. But
00:44:43.640
when they start threatening your family members, that's really not nice. And then when
00:44:48.160
the police tell you there's not much they can do, that's not nice either. But I guess it's just
00:44:52.940
par for the course, if you want to try to get those messages out there. Doctor, you have something
00:44:57.580
in the book where you say... To continue watching the rest of the episode for free, visit our website
00:45:03.400
londonreal.tv or click the link in the description below.