The Saad Truth with Dr. Saad - September 04, 2024


Suicidal Empathy and Deontological Ethics - The Will Cain Show (The Saad Truth with Dr. Saad_707)


Episode Stats

Length

34 minutes

Words per Minute

168.2

Word Count

5,810

Sentence Count

109

Misogynist Sentences

10

Hate Speech Sentences

19


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode, Professor Gad Saad joins the show to talk about the importance of free speech and why candidates like Kamala Harris and Tim Walz don't believe in it. He also explains why they don't have much of an ethical position to stand on.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 At some point I think you have to take them at their own words. Robert Reich says we should
00:00:03.980 arrest Elon Musk. Brazil outlaws Twitter. Kamala Harris and Tim Walz in their past have said they
00:00:09.400 do not believe in free speech and now in the pages of the New York Times they ask is the
00:00:14.840 constitution dangerous. It's the Will Cain Show streaming live at foxnews.com. On the Fox News
00:00:20.220 YouTube channel the Fox News Facebook page right now comment streaming in on Facebook on YouTube
00:00:24.480 jump in to the Will Cain Show. Hit subscribe at Apple Spotify or on YouTube. He is the author
00:00:30.800 of The Parasitic Mind. He is Professor Gad Saad. He's a friend here of the Will Cain Show and these are
00:00:35.820 some of our deepest conversations and I love having him here. What's up professor? Hey how you doing?
00:00:41.540 So good to be with you. It's good to be with you. You know you often are very active on X. You're
00:00:50.380 also what I would say is one of the people who are in the influencer circle of Elon Musk. What do
00:00:58.680 you think now that people are A in America calling for the arrest of Elon Musk? That's Robert Reich. I
00:01:04.540 believe he was labor secretary under Clinton and we're watching Brazil basically outlaw X.
00:01:13.420 Yeah it's unbelievable. Look and I've talked about the distinction between
00:01:17.660 deontological and consequentialist ethics. So let me break it down for you because it will be
00:01:23.100 relevant to your question. Deontological ethics are absolute statements. So for example if I say it
00:01:28.100 is never okay to lie that would be a deontological statement. If I say it's okay to lie to spare
00:01:33.620 someone's feelings then that would be a consequentialist statement. Now for many things we're all
00:01:37.920 consequentialist and that makes perfect sense but when it comes to foundational values that define the
00:01:43.640 West those things have to be deontological. So you can't say I believe in freedom of speech but not
00:01:50.380 for Elon Musk or Donald Trump. I believe in presumption of innocence but not for Brett Kavanaugh.
00:01:56.020 I believe in journalistic integrity but not when it came to Hunter Biden's laptop. Once you apply a
00:02:02.760 consequentialist ethic to a deontological position you get the kinds of problems that we're
00:02:09.080 seeing today. So the interesting thing in part about what you had to say is we are all consequentialist
00:02:15.940 to some point and if we're all being honest and we look in the mirror it is it is difficult to be
00:02:21.300 principled meaning we all can give voice to principles but living according to principle is
00:02:26.620 very difficult. That's the challenge of life to be a moral person. When you hear and I'm going to read
00:02:32.520 two of these. Abigail Shrier has this up in The Free Press, they've revisited it. Now these are
00:02:37.560 quotes from like this is going to be 2020 I believe for Kamala Harris where she openly advocated for
00:02:46.360 shutting down X and she did it on multiple occasions. I think it was two or three different occasions.
00:02:50.640 I think once from a debate stage once with CNN's Jake Tapper. She openly called for shutting down
00:02:59.500 X and Tim Walz's famous statement, now his is from 2022, was that free speech does not cover
00:03:05.600 misinformation and disinformation. When you hear something like that from the two people running
00:03:11.120 for president and vice president the question I think is what do they believe and what would they do
00:03:17.300 right. So are they consequentialists. I guess everybody is like they don't believe in a principle
00:03:23.360 against free speech. They just want to apply it when it's in their favor and not apply it when it's in
00:03:28.400 their disfavor. So I guess the question would be what would it. Yeah. Go ahead. No I got you.
00:03:34.300 Sorry to interrupt you. That's right. They're being consequentialists. So what would they be in an
00:03:38.660 administration when it comes to free speech. Well look I am Jewish. I grew up in very difficult
00:03:45.000 circumstances in the Middle East. We faced execution. My parents were kidnapped by Fatah. So you know we
00:03:51.760 we've seen it all in terms of actual victimhood and yet I support the right of Holocaust deniers to
00:03:58.700 spew the worst of offensive and insulting speeches right. What could be more offensive and what could
00:04:05.000 be more misinformation than to deny a historically documented event whereby six million people were
00:04:14.220 you know extinguished on an industrial scale level. But in a free society you have to tolerate
00:04:20.440 imbeciles, racists, falsehood spreaders and so on. So that's how you walk the walk and talk the talk
00:04:26.240 right. If I a Jewish person with my personal history I'm willing to tolerate Holocaust deniers then
00:04:33.320 Tim Walz and Kamala Harris don't have much of an ethical position to stand on. But again the reason why
00:04:39.560 they do that is because they must find some consequentialist calculus to explain why they need
00:04:45.400 to shut you down right. Don't say that COVID is a lab leak because that's misinformation. Don't say
00:04:53.260 whatever that the Hunter Biden laptop was real because then that would allow Donald Trump to
00:04:59.800 ascend for a second term. One person's truth is another person's misinformation. So for example
00:05:05.580 can men menstruate or only women menstruate which is the disinformation position right. So again there is
00:05:14.600 no such thing as forbidden knowledge in science precisely because as long as you adhere to the
00:05:19.320 scientific method all bets are off. You pursue truth unencumbered by consequentialist ethos. So I think
00:05:26.880 what these two are doing is exactly what Orwell warned us against. Maybe it's even worse than what Orwell
00:05:33.500 thought. Well if anybody questioned his intellect, just... I've been walking around the world saying
00:05:40.500 "menstrate." And he just taught us very clearly it's "menstruate," that there's a U in there.
00:05:45.380 It's not "menstrate." Just a hick from Texas. Also we have this information. Like the whole thing like what would
00:05:52.160 they do? Well they've done it. Like they did it in the Biden administration. We know what they did to suppress
00:05:57.040 free speech. It wasn't just calls to arrest Elon Musk or whatever it may be. And one of the big leaders I was just
00:06:03.740 trying to look this up so I'd have this name while I was talking to you. His name is Rob Flaherty.
00:06:09.260 Rob Flaherty was one of the people who was integral to the whole disinformation campaign in the Biden administration.
00:06:18.140 Punishing, identifying, you know, limiting free speech. Okay, well, you know Rob Flaherty. He's one of the officials, I'm reading from Abigail Shrier in The Free Press, who reportedly pressured Facebook into censorship: the then director of Biden's digital strategy, Rob Flaherty. And who is Rob Flaherty now? Well, he is the deputy campaign manager for Kamala Harris. It's not as though this was a moment in time with COVID, a moment in time with 2020 election skepticism. This is someone who has benefited, been promoted to a leadership position by the person now running for president, Kamala Harris.
00:06:56.260 Right. I mean, we know what Kamala Harris thinks about the espousing of positions that are unpopular according to her calculus. Right. You saw the clip that's been going around recently where she said, you know, you can't allow people to just spew whatever they want without any controls or oversight. I mean, uttering those words should be enough to disqualify you as president of the United States. But yet again, because we are now so inundated with this consequentialist ethos, people don't even bat an eye. I mean, what could be more dangerous than to say things like: there has to be oversight over what people say, there has to be equality in the outcomes? As you said in your wonderful monologue, that's exactly what Marxism is.
00:07:44.180 By the way, let me give you a great quote which I don't think I've ever mentioned to you in our previous conversations. E.O. Wilson, the famous evolutionary biologist from Harvard who recently passed away, studied social ants. Now, in social ant species there's only one reproductive queen, and then all of the other ants are exactly equal, whether they are worker ants or soldier ants, right? So ant society is communistic. So when he was asked, "Professor Wilson, what do you think about socialism, communism?" his answer was: "Great idea, wrong species." Human beings are not communistic in their innate human nature. Some of us are taller, shorter, harder working, less hard working, so to impose an end result of equality of outcomes on human beings is to literally be anti-human nature. So not only is it called Marxism, it defies evolutionary principles. It's very dangerous, and I hope that my neighbors to the south make the right decision on November 5th.
00:08:48.100 Talking to Professor Gad Saad here, by the way, the author of The Parasitic Mind. You've got a new book, I believe, that you're working on, right?
00:08:54.840 I am working on a book called Suicidal Empathy. And if I can just mention, because I want to promote my latest affiliation, I just accepted a visiting professorship and global ambassadorship in your country at Northwood University, the Free Enterprise University, so I'm very excited to be spending a year there.
00:09:13.040 Oh, that's... oh yeah, awesome, cool. And you also did a paper, you've already done one paper on suicidal empathy; it's up for your subscribers, I saw, on your X. Now, I'm fascinated by this idea of suicidal empathy, and I was looking at some of the things you've had to say about it. All I really know so far is those two words juxtaposed against each other, empathy and suicide. However, you know, something I've always... I don't know who can claim they've discovered any idea for themselves, but one of the pieces of wisdom I feel like I've acquired with some age, Professor, is that, I've learned through children and through myself, your greatest strength is your greatest weakness, in that the trait, whatever that trait may be, that makes you special is also the trait that is your downfall. And so it's like management of that trait is your life's mission.
00:10:04.380 So when I was thinking about suicidal empathy: Ray Dalio, big thinker, he tweeted out something similar. He said, like, people have individual attributes; whether or not they are a strength or a weakness depends upon their application, and obviously application is circumstantial. And I just started thinking about that when it comes to you and the idea of empathy. Empathy is treated as a virtue, but modern society has made it abundantly clear how empathy can be a vice.
00:10:37.140 Yes. So let me explain this in a couple of ways, but thank you for the lead-up. Just a small correction: on my subscriber-exclusive content, I shared a paper by someone else on empathy, not my own. It was part of the research I'm doing for my book on suicidal empathy. But thank you for that plug.
00:10:56.300 So in The Saad Truth About Happiness, my last book, my happiness book, I talk about the inverted U being the most universal law of nature. The inverted U basically means too little of something is not good, too much of something is not good. It's exactly what Aristotle called the golden mean, right? He argued, look, if you are a cowardly soldier, that's not good. If you are incredibly reckless in your risk taking, you're going to die and you're going to be a useless martyr; that's not good. Everything happens in the sweet spot. Everything in moderation, as the ancient Greeks said, right? So that exactly applies to your point about personality traits. Perfectionism, for example: if you're not in the least bit perfectionist, your work will suffer because you won't have attention to detail, you won't be conscientious. If you're too perfectionist, as I am, you'll end up checking your galley proofs 45,000 times, God forbid there might be one comma out of place, and that's counterproductive. What if there is a comma out of place? You could have spent your time doing something else. Again, it's in the sweet spot.
00:11:58.160 So now let's come to empathy. Empathy is noble as a virtue because we're a social species; therefore we have to have theory of mind, we have to put ourselves in the mind of another, we have to put ourselves in the shoes of another. And therefore our emotional system has evolved because it confers upon us certain evolutionary advantages. But it has to be invoked within certain functional ranges, right?
00:12:23.700 So take, for example, OCD, and I'll come back to empathy in a second. OCD is the misfiring of an otherwise adaptive process, and let me explain what I mean by that. It is perfectly adaptive for us to scan the world for environmental threats. I check the back door to make sure that it's locked, but I only check it once. I wash my hands when I come home, once, to make sure that if I shook hands with anybody who had a virus, I don't get contaminated. What happens with OCD? That adaptive mechanism misfires, right? As soon as the warning flag goes up and you tend to it, it goes back up again. So I spend eight hours washing my hands in scalding hot water and my skin is falling off; that's germ-contamination OCD. I keep checking the back door for four hours even though one time would have been enough.
00:13:13.480 So now let's apply that mechanism to empathy. It doesn't make sense that you be so suicidally empathetic that, as a means of demonstrating how virtuous you are, you literally kill your society, right? So if you target your empathy to the wrong person: Guatemalans who come in illegally are more worthy of our empathy than our own American vets; the homeless people who are defecating and fornicating in the children's parks are more valuable than the children who should be able to play in those parks void of that stimulus, right? And so what the book is about is: what starts off as a noble emotion becomes quite dysfunctional when it's dysregulated. So that's the purpose, that's the point of the book.
00:14:03.760 but is
00:14:05.800 that real
00:14:06.460 empathy
00:14:06.840 because like
00:14:07.220 when I see
00:14:07.920 that
00:14:08.360 I
00:14:08.720 totally agree
00:14:10.800 with you know
00:14:11.520 the moderate
00:14:12.140 mean or whatever
00:14:12.860 is too much
00:14:14.040 of any one
00:14:14.460 thing
00:14:14.820 is bad
00:14:16.620 moderation
00:14:17.480 is the key
00:14:18.180 but I also
00:14:19.480 question
00:14:19.900 why not
00:14:20.120 that's real
00:14:20.480 empathy
00:14:20.800 so
00:14:21.320 you talked
00:14:22.400 about it
00:14:22.720 in terms
00:14:23.020 of like
00:14:23.280 you're
00:14:23.440 projecting
00:14:23.740 virtue
00:14:24.120 but if
00:14:24.440 it's
00:14:24.580 about
00:14:24.880 you
00:14:25.500 and your
00:14:26.320 image
00:14:26.900 or even
00:14:27.420 your self
00:14:28.120 image
00:14:28.540 and that's
00:14:29.620 the reason
00:14:30.260 you're adopting
00:14:30.880 a position
00:14:31.480 and by the way
00:14:32.060 that's usually
00:14:32.760 the extent
00:14:33.320 of the
00:14:33.820 commitment
00:14:34.220 adopting a
00:14:35.860 position
00:14:36.240 it rarely
00:14:37.080 is accompanied
00:14:37.600 by action
00:14:38.400 although some
00:14:39.200 people are
00:14:39.780 empathetic
00:14:40.340 and employ
00:14:41.240 that in
00:14:41.900 their life
00:14:42.620 but if
00:14:43.640 it's about
00:14:44.140 you
00:14:44.600 adopting
00:14:45.420 a position
00:14:46.060 and you
00:14:47.060 are doing
00:14:48.080 it as part
00:14:48.520 of your
00:14:48.760 self
00:14:48.960 projection
00:14:49.600 is that
00:14:50.460 even really
00:14:51.000 empathy
00:14:52.380 Yeah, that's a great question. So in my first book ever, this one, an academic book, The Evolutionary Bases of Consumption, I talk about altruism and why altruism has evolved. And I argue that the Jewish philosopher Maimonides, nearly a thousand years ago, talked about eight levels of altruistic piety. The highest level, to your point, is when the altruist and the beneficiary of the altruistic act don't know one another, because then they can't reap the social rewards of having committed that altruistic act. So you're exactly right that even when we call something supposedly empathetic, oftentimes it's faux empathy that is only serving as virtue-signaling currency. But nonetheless, if you ask those people why they are holding those positions, they'll say: because all refugees are welcome, because no human is illegal. So we can debate as to whether they truly are feeling that empathy or whether it is faux empathy to gain social currency, but the reality is that it is a dysfunctional application of empathy, right?
00:16:02.300 There is no logical reason why you should allow entry into your host society of millions of people who don't share a single one of your foundational values, right? So, for example, in Germany and in Denmark and in Sweden and in France and in Britain, certainly now in Canada, it's coming for you in the US: you have people that are coming from countries where there is orgiastic Jew hatred, where, for example, in Pew survey research, 95 to 99% of the surveyed people from those societies have endemic Jew hatred as part of their self-identity. So then it's not going to be surprising that you have rampant antisemitism on campuses. Demography is destiny, and people shake their heads: why do we have increased Jew hatred? Well, import those values and there are downstream effects, right?
00:17:00.320 What... one more philosophical question before I kind of go back to the application to maybe current events or politics. So, you know, you talked about the hierarchy of altruism, the different levels of altruism. It kind of made me think: does it matter, though? Like, if you do something good and you claim reward for it, you claim some social credibility for doing something good, versus the guy who remains totally anonymous (and we do kind of internally know, we all give a little more respect to the person that we don't even know, the presumption of anonymity out there), but does it matter if they're both effectuating a good? Like, they're doing something. Like, and let's use Donald Trump, because I think Donald Trump's a great example. Like, if Donald Trump does something good, he's probably going to tell you about it, you know, versus the guy who remains anonymous. Does it matter if the outcome is the same?
00:17:48.380 Yeah, so again, that's a great question. I talk about this, again, in my first book ever, The Evolutionary Bases of Consumption. I talk about philanthropy as a costly signal, meaning that when we engage in philanthropic acts, we typically engage in these acts because we want to have them advertised. It's the Will Cain Oncology Center; it's rarely ever the XXX Anonymous Oncology Center, right? And incidentally, to your point, even when people supposedly engage in anonymous altruistic acts, believe me, their inner circles, the people who matter to them, will know that they did this act, right? So even though Will Cain may not know that I gave a hundred million dollars to cancer research, I'll be sure to tell all of my billionaire friends what a wonderful guy I am. And that's exactly why Maimonides, a thousand years ago, said that it is almost impossible to reach this highest level of tzedakah, that's the Hebrew word for this kind of pure piety, because we're human beings: we care about our social standing, and one of the ways that I ascend in social standing is to show what a good guy I am. That's just part of human nature. And either way, there's an oncology center.
00:19:07.380 So I'm not sure. But I do think there's a distinction between that action and those that simply adopt a position to project the virtue of empathy. One... okay, where you and I are having this conversation, and I know, and I think probably a lot of listeners and viewers would be able to pick up where you are (although you are unpredictable and you're not crackerjack-box politics), where a lot of your leanings might be, and people certainly know where mine are. So we're having this conversation about consequentialism, and we're doing it through a lot of the positions which now have become championed positions of the left, like free speech, and we're going to come back in a minute to the constitution.
00:19:45.460 but
00:19:47.520 Donald Trump
00:19:49.820 is famously
00:19:50.800 pragmatic
00:19:51.440 okay
00:19:52.200 is pragmatism
00:19:53.620 not a
00:19:55.420 more
00:19:56.660 acceptable
00:19:57.940 word for
00:19:58.780 consequentialism
00:19:59.700 so in other
00:20:00.400 words
00:20:00.640 you know
00:20:02.640 Donald Trump
00:20:03.600 will make
00:20:04.060 a deal
00:20:04.640 he'll
00:20:05.460 understand
00:20:06.120 I
00:20:06.540 now I'm
00:20:07.220 not running
00:20:07.760 on some
00:20:08.200 hardcore
00:20:08.580 pro-life
00:20:09.080 position
00:20:09.460 if I
00:20:09.800 can't
00:20:10.060 get
00:20:10.180 elected
00:20:10.600 Donald
00:20:11.260 Trump
00:20:11.540 will
00:20:11.820 make
00:20:12.080 pragmatic
00:20:12.660 slash
00:20:13.140 consequentialist
00:20:14.040 decisions
00:20:14.500 so in
00:20:16.340 that way
00:20:17.000 I mean
00:20:18.140 there you
00:20:18.720 can say
00:20:19.040 well and
00:20:20.180 yeah I think
00:20:20.420 this is where
00:20:20.740 you are going
00:20:21.060 to go but
00:20:21.520 on foundational
00:20:22.400 values like
00:20:23.200 free speech
00:20:23.600 that's not the
00:20:24.480 place to be
00:20:25.020 consequential
00:20:25.660 That's exactly it. So let's apply what you just mentioned to, say, academia, right? So some people would say we absolutely should have an ethos of forbidden knowledge: there are certain areas that you should never study, because the consequences of finding out those things would be too hurtful or too dangerous, right?
00:20:46.900 So don't do racial differences in IQ studies, even though you may properly apply, honestly apply, the scientific method, because if the results come out in a way that are difficult for the politically correct narrative, well then some group will be marginalized.
00:21:04.520 Don't study sex differences, or study sex differences as long as women are always shown to be superior on every task that's ever studied. God forbid, a million times, men are shown to be superior on some task, please make sure to hide that data in your file drawer, because otherwise you're part of the patriarchy.
00:21:25.620 No. Academia, the pursuit of truth, has to be unencumbered by consequentialist calculations, because if we apply that, then we should have never studied ballistics and physics, because that led to the dropping of the atomic bomb. So we absolutely need to cancel physics, because there are simply too many nefarious consequences of understanding physics.
00:21:49.820 By the way, in The Parasitic Mind, many of the parasitic idea pathogens that I discuss stemmed from that reflex. So for example, social scientists have spent the last 100 years developing the disciplines of anthropology and sociology and economics without any biological underpinnings, because some really imbecilic academics thought that biology is simply too dangerous to apply to the human condition.
00:22:18.620 Eugenicists used it to justify their cause. Hitler used it to justify, hey, there's a natural struggle between the races, we're the Aryans, sorry, Jews, you lost. The British social classes applied social Darwinism to say, hey, we're the upper class; if you die from tuberculosis in your squalor, hey, that's just Darwinian.
00:22:40.680 So let's now create a worldview where biology ceases to matter for human behavior, because at least we're being noble, or, as Plato said, the noble lie, right? No. When it comes to the truth, not a single millimeter is ever ceded.
00:22:55.540 Well, but that's not the case, right? Like, what you just described, I would think we were at the apex of the moral truth, we're at the apex of all those things you just described. We are not studying those; those are off the map. You do not learn more on those subjects today. We are probably at the apex of that ignorance.
00:23:13.880 Exactly, ignorance. By the way, that's how I originally, you know, I often say that writing The Parasitic Mind was really a 30-year journey, I mean, starting from when I was in the Lebanese civil war, but I've been a professor now for, this is my 31st year.
00:23:31.100 When I was first trying to Darwinize the business school, what do I mean by that? Applying evolutionary biology and evolutionary psychology to study economic decision making, consumer behavior, personnel psychology. All of my colleagues were like, what the hell is this? Biology doesn't apply to human behavior.
00:23:50.520 And I'm thinking, how could it not apply to human behavior? You think biology applies to every single species except one, called Homo sapiens? And so that's when I started realizing that, Houston, we have a problem, right? Even very intelligent professors with many titles before and after their names could be completely lobotomized.
00:24:11.820 And that's, it's interesting, when you describe that, that's actually a little bit of what's been discussed when it comes to artificial intelligence. Like you talked about with physics and ballistics and the leading to the atomic bomb, there are people saying, this AI thing is scary, we gotta stop, we gotta figure out a way for it not to continue as a field of study.
00:24:30.120 Right. I mean, there are two sides to the AI story. Some say that, you know, the robots are going to kill us in 15 minutes once they're strong enough; others say that's complete science fiction hype. So I'm not gonna weigh in on, you know, who's right, but it's the same reflex, right? It's basically saying, don't pursue some truth, because the consequences of that truth will simply be too harmful.
00:24:54.280 By the way, you may or may not remember this, and if your viewers haven't watched it, they should: the movie The Name of the Rose, that came out in the mid-80s. Do you know it, Will? Do you know the movie that I'm talking about?
00:25:07.000 No.
00:25:07.680 Name of the Rose?
00:25:08.380 Never heard of it.
00:25:09.220 Name of the Rose.
00:25:09.500 No.
00:25:10.820 It's with Christian Slater and Sean Connery. Sean Connery plays a monk, I think a Gregorian monk. This is in the Dark Ages. All of these monks are dying from a poison, and their tongues are blue, and he's wondering, how did they die from that?
00:25:27.720 And it turns out, just so I can fast-forward, that the head monk had placed some Aristotle books dealing with humor in a forbidden library, because he didn't want the monks to read any book on humor, because humor is the work of the devil, but that's what they were doing. And so he had laced the bottom of the pages with a poison, so that if a monk actually went in there and read the forbidden books, then he would be poisoned and get killed.
00:25:57.300 That's the reflex of forbidden knowledge: I decide what you can read, what you could think, what you can say. And that's why I quoted that book in The Parasitic Mind. And that's exactly what Kamala Harris is doing. She is the head monk.
00:26:13.520 I love that. I love that analogy: Kamala Harris is the head monk. When you say "I decide," it's always, who's "I"? Who is the all-seeing, all-knowing, all-powerful decider of knowledge? And, yeah, Kamala Harris is putting herself in the position of the head monk.
00:26:25.880 So that takes me to the final thing here. I'm not shocked, and by the way, maybe in the spirit of, you should never put any information or knowledge off the table, you should consider all perspectives and all types of human thought, maybe I shouldn't criticize the New York Times for publishing this, which is, it's an opinion piece.
00:26:46.840 I don't know who this person is, Jennifer Szalai. I don't know who that is. Maybe she's in academia. She's a non-fiction book critic, so maybe not quite as esteemed as a professor. But the headline is, the Constitution is sacred, is it also dangerous? And the subhead: that one of the biggest threats to America's politics might be the country's founding document.
00:27:06.960 For something like, I'd say, a thousand words, she goes on to talk about the Constitution. It's shocking for me, or, well, as somebody who did study the law, and I know you're Canadian and the Constitution is uniquely American, for me it is a sacred, sacred document. It is an accumulation of, I think, centuries of knowledge.
00:27:27.700 But the whole point of the Constitution is to thwart authoritarianism. Like, it is, it's a separation of powers, it's checks and balances, it's certain rights are not subjected to democracy. Yes, democracy, meaning a protection of a minority.
00:27:40.980 And she spends this whole thing Orwellianly twisting language to say, yes, it subverts democracy, but in the protection of authoritarianism. And to me, when we're talking about free speech or whatever, increasingly this is the end game. It's not the end game, maybe it's the final hurdle to the end game, but the suspension of the Constitution.
00:28:00.500 Indeed. Look, in The Parasitic Mind I have a quote, which, I'm sure you know what I'm talking about but I don't have it in front of me, the Reagan quote, where he basically says that every generation you have to be assiduous in defending freedom, because every generation there are new folks who are trying to kill freedom, right?
00:28:19.440 And so, therefore, you can't assume that it's going to be the default value forevermore, that you're going to have the freedoms and liberties that you take for granted.
00:28:28.880 To that point, one of the reasons why, some of the staunchest defenders of the Constitution, and, to your point, I'm Canadian, yet frankly I should be granted American citizenship yesterday, the reason why immigrants are some of the staunchest defenders of the Constitution and of the Western tradition is because we've sampled from the buffet of societies out there.
00:28:52.980 And we know that the American experience is a blip in history. It's an anomaly. It's a miraculous anomaly. But that's not the default value. The default value is for head monk Kamala to tell the rest of us how to dress, when to talk, when to eat, what to put on our bodies. That's the way societies have organized themselves since time immemorial.
00:29:18.060 So it breaks my heart to see someone like this woman, I don't know who she is, who has benefited from all those freedoms, where people like me came and knocked on the door and said, please let us in, they're about to kill us, and she doesn't appreciate what she has. What a shame.
00:29:35.180 Well, the other thing, the one other thing I'll explore with you here that I find fascinating, I'm sort of just freelancing this thought with you, which is always dangerous, but the Constitution is absolutely a subversion of democracy. When I say subversion, that makes it seem like it's secretive. It's not; it's overt.
00:29:52.520 It is a document that says these certain rights will not be subjected to a majority vote, and that's why it's a protection of the minority. So 51% can't decide to set aside freedom of speech. It's not going to happen, because of the United States Constitution.
00:30:10.240 In a pure democracy, it is pure power. You talked about the singular head monk, but it's also like tribal power, and that's how a lot of societies work, especially in third-world countries. It's like, whoever is the majority, and power doesn't always mean by numbers, because it often means by force, they dictate the terms of society over the minority.
00:30:29.320 And it makes me wonder, like, that is sort of where you're pushing this: either, A, you have the ultimate confidence that you will always be on the side of the 51%, or you believe that somehow you'll be the one in power. And that pursuit of power ultimately is brute. It's not at a voting booth. Like, it gets violent. It always does get violent.
00:30:53.660 And I just wonder, with somebody like Jennifer Szalai, do you think, in a brute fight for power, a violent fight for power, your side will be the victor? Because I'm not so confident that the side that asks for no guns, gender fluidity, legalization of drugs is the one that wants to reduce this to a fight of brute, violent power.
00:31:17.980 well and listen
00:31:19.680 this kind of
00:31:20.520 parasitic thinking
00:31:21.260 that you're
00:31:21.780 describing in
00:31:22.560 this Jennifer
00:31:23.480 lady is exactly
00:31:24.960 why you have
00:31:25.860 queers for
00:31:26.500 Palestine
00:31:27.000 right it's I
00:31:28.140 mean in what
00:31:29.080 world does it
00:31:29.980 make sense
00:31:30.640 where if my
00:31:31.900 I my most
00:31:33.940 fundamental identity
00:31:35.000 that I present
00:31:35.660 to the world
00:31:36.300 which is I'm
00:31:37.360 queer in what
00:31:38.820 world would you
00:31:39.880 then place all
00:31:41.080 of your chips
00:31:41.860 on the place
00:31:42.980 that would put
00:31:44.780 you through a
00:31:45.620 100% effective
00:31:47.600 gravitation based
00:31:49.200 conversion method
00:31:50.460 by logging you
00:31:52.620 off the
00:31:53.120 you know the
00:31:53.960 building as
00:31:55.020 they do in
00:31:55.540 Gaza for
00:31:56.160 people who are
00:31:56.860 queer versus
00:31:57.820 Tel Aviv where
00:31:59.100 you actually have
00:32:00.360 one of the most
00:32:01.260 queer friendly
00:32:01.880 places I mean
00:32:02.400 short of New
00:32:03.160 York San
00:32:03.980 Francisco and
00:32:04.780 actually Montreal
00:32:05.620 my home city
00:32:06.360 Tel Aviv is
00:32:07.480 probably right up
00:32:08.300 there in the
00:32:08.700 top five most
00:32:09.700 queer friendly
00:32:10.460 places and yet
00:32:11.480 people say no
00:32:12.120 no I'm putting
00:32:12.780 all my chips
00:32:13.520 with the society
00:32:15.160 that would you
00:32:15.960 know throw me
00:32:16.560 off rooftops
00:32:17.580 that's the
00:32:18.640 kind of
00:32:19.300 unbelievably
00:32:20.580 myopic
00:32:21.300 parasitic
00:32:21.920 thinking quote
00:32:23.500 thinking that I
00:32:24.940 rail against all
00:32:25.740 day but you
00:32:26.500 know I don't
00:32:27.120 want to end
00:32:27.520 this on a
00:32:28.060 pessimistic note
00:32:29.140 but it seems
00:32:30.780 to me that no
00:32:31.560 some of these
00:32:32.300 folks are so
00:32:33.080 impenetrable to
00:32:34.060 reason to you I
00:32:34.740 know that in your
00:32:35.560 monologue you
00:32:36.500 talked about using
00:32:37.840 feelings rather
00:32:38.880 than thought in
00:32:40.200 in choosing your
00:32:41.300 president well I
00:32:42.720 talk about this in
00:32:43.440 the parasitic mind
00:32:44.160 right most people
00:32:45.400 want to deploy
00:32:46.240 fast and frugal
00:32:47.480 heuristics to
00:32:48.340 make a decision
00:32:48.960 right well my
00:32:49.900 emotional system
00:32:50.840 is much quicker
00:32:51.820 it's autonomic
00:32:52.860 right I love
00:32:54.100 Kamala Harris I
00:32:55.540 hate Donald Trump
00:32:56.620 thinking is so
00:32:57.880 hard right it's
00:32:59.020 so effortful it's
00:33:00.420 so cumbersome best
00:33:01.900 to simply use my
00:33:03.340 emotional system
00:33:04.260 and I keep
00:33:05.100 warning people
00:33:06.000 including my
00:33:06.820 colleagues you're
00:33:08.260 trained as
00:33:09.080 psychologists use
00:33:10.400 your brain which
00:33:11.880 policies are more
00:33:13.500 in line with your
00:33:14.300 values they go
00:33:15.360 no but she's so
00:33:16.460 positive she's
00:33:17.860 joyful she's fun
00:33:19.480 I'm going with
00:33:20.620 her we need a
00:33:21.540 change they're
00:33:22.440 impenetrable to
00:33:23.520 reason yeah my
00:33:26.760 only hope in that
00:33:27.740 monologue talking
00:33:28.420 about the people
00:33:28.860 that ride along
00:33:29.400 the surface you
00:33:30.220 know who will
00:33:30.760 ultimately make
00:33:31.420 that emotional
00:33:31.940 decision is and
00:33:34.400 I don't at this
00:33:35.500 point it's hard to
00:33:36.120 find the undecided
00:33:37.220 emotional reaction
00:33:38.760 to Donald Trump
00:33:39.420 I think most
00:33:40.160 people's emotional
00:33:41.360 reaction has been
00:33:42.440 defined to Donald
00:33:43.400 Trump so it's
00:33:44.960 shocking to me
00:33:45.740 that wherever
00:33:46.080 they are
00:33:46.560 Pennsylvania
00:33:47.620 you know
00:33:48.660 Michigan
00:33:49.020 Wisconsin
00:33:49.660 I guess we
00:33:51.700 have to implore
00:33:52.160 them to think
00:33:52.760 I don't know I
00:33:53.740 don't know how the
00:33:54.280 emotions can be swayed
00:33:55.200 at this point you
00:33:55.620 have to figure out
00:33:56.280 how to think before
00:33:57.320 63 days expires and
00:33:59.340 we have election
00:33:59.840 day I look forward
00:34:02.580 to suicidal
00:34:03.440 empathy I
00:34:04.800 encourage everyone
00:34:06.000 check out
00:34:06.420 parasitic mind and
00:34:07.500 all his stuff he's
00:34:08.200 on X
00:34:08.580 Gadzad he's great
00:34:09.880 Professor Gadzad
00:34:10.680 love having you on
00:34:11.260 the Will Cain
00:34:11.680 show thanks
00:34:12.140 professor
00:34:12.580 thank you so
00:34:13.620 much cheers
00:34:14.100 all right take
00:34:16.280 care
00:34:16.460 Really, really smart guy. Really, really, forget smart, overvalued adjective. Wise, thoughtful mind.