The Saad Truth with Dr. Saad - November 12, 2024


Suicidal Empathy & Post-Election Liberal Meltdown! Livestream with Viva Frei (The Saad Truth with Dr. Saad_751)


Episode Stats

Length: 1 hour and 26 minutes
Words per minute: 173.89
Word count: 15,015
Sentence count: 930

Harmful content:
Misogyny: 12 sentences flagged
Toxicity: 99 sentences flagged
Hate speech: 41 sentences flagged
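The stats above are simple derived quantities; the words-per-minute figure, for instance, is just the word count divided by the duration in minutes. A minimal sketch of that calculation (the page does not give the exact duration in seconds, so the 5181 s value below is a hypothetical one that reproduces the reported rate):

```python
def words_per_minute(word_count: int, duration_seconds: float) -> float:
    """Average speaking rate over the whole episode."""
    return word_count / (duration_seconds / 60.0)

# The page reports 15,015 words over roughly 1 hour 26 minutes.
# 5181 s (~86.35 min) is an assumed exact duration consistent with
# the reported rate of ~173.89 words per minute.
rate = words_per_minute(15_015, 5181)
print(round(rate, 2))  # → 173.89
```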


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode, I'm joined by Dr. Gad Saad to discuss his satirical tweet, "My Own Donald Trump: An Orgiastic Dear Diary Entry of Full Fear," and the post-election reaction in academia. Dr. Saad is a behavioral scientist who pioneered the field of evolutionary consumer behavior and has worked in the behavioral sciences for 30+ years. He holds a Bachelor's degree in mathematics and computer science, an MBA with a thesis in operations research, an MS in management science, and a PhD in the psychology of decision-making, completed under one of the leading cognitive psychologists in the world.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Toxicity classifications generated with s-nlp/roberta_toxicity_classifier.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
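The per-sentence scores that appear after some transcript lines below (e.g. `1.00`, `0.99`) are the confidences produced by running each sentence through those classifiers. A minimal, self-contained sketch of that flagging flow; the stub classifier here is a hypothetical stand-in for the actual Hugging Face models, which would normally be loaded via `transformers.pipeline("text-classification", model=...)`:

```python
def stub_toxicity_classifier(sentence: str) -> float:
    """Hypothetical stand-in for a real toxicity model:
    returns a pseudo-confidence in [0, 1]."""
    flagged_terms = {"imbecile", "imbeciles", "jackasses"}
    words = {w.strip(".,!?").lower() for w in sentence.split()}
    return 0.99 if words & flagged_terms else 0.01

def flag_sentences(sentences, classifier, threshold=0.5):
    """Return (sentence, score) pairs whose score crosses the threshold,
    mirroring the 'N sentences flagged' counts in the episode stats."""
    scored = [(s, classifier(s)) for s in sentences]
    return [(s, score) for s, score in scored if score >= threshold]

sentences = [
    "How are you doing, Viva?",
    "Then you are an imbecile who doesn't believe in freedom of speech.",
]
flags = flag_sentences(sentences, stub_toxicity_classifier)
print(len(flags))  # → 1
```

A real run would substitute the named models for the stub and attach each score to its sentence, as in the transcript below.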
00:00:00.000 The newest title of Dr. Gad Saad, who's sitting there.
00:00:04.980 If you saw what I see right now, what you're going to see,
00:00:07.460 I'm looking at a statue of a Greek hero chiseled out of rock.
00:00:12.440 Gad, I'm bringing you in.
00:00:14.320 Oh, my.
00:00:15.640 Oh, my.
00:00:16.260 I like that.
00:00:17.040 Chiseled Greek god.
00:00:18.180 I love it.
00:00:19.920 I look bigger than you on screen, but you're a good six inches taller than me
00:00:23.220 and more impressive than me in real life.
00:00:27.020 Gad, how's the battle?
00:00:28.140 In the upper six inches or where it really counts, in the pants six inches?
00:00:33.200 Which one were you referring to?
00:00:34.400 Probably both.
00:00:35.260 All that I know is that mine worked well enough for three kids,
00:00:37.340 and that's all I needed in life.
00:00:39.480 How are you doing, Viva?
00:00:40.480 So good to see you.
00:00:41.380 The same.
00:00:41.960 Gad, you're stuck behind the maple gulag up in Canada 1.00
00:00:45.120 where the shit is hitting the fan these days. 1.00
00:00:47.960 Yes. 1.00
00:00:48.420 Well, I don't know if you saw.
00:00:50.280 I actually prepared it for you.
00:00:52.300 I just put out a tweet of my fear.
00:00:55.740 You know, the full fear that all of my colleagues are exhibiting.
00:00:59.240 So could I share with you?
00:01:00.520 Can I read out my tweet?
00:01:01.780 No, please.
00:01:02.400 Do you see?
00:01:03.580 Oh, go to the private chat, and then you can send me the link.
00:01:06.440 But I'll just go to you right now and get it.
00:01:09.080 Yeah.
00:01:09.860 You're asking me to multitask.
00:01:11.520 It's so much for you.
00:01:12.440 I'll get it.
00:01:13.100 Is this the...
00:01:14.540 Let me see here.
00:01:15.220 It's pretty, it's pretty, it's very early in my feed,
00:01:18.700 like probably about half an hour ago.
00:01:21.020 Oh, 12:26.
00:01:22.720 So it's...
00:01:23.280 12:26.
00:01:23.880 It's, it's, it's called, it's my own Donald Trump orgiastic
00:01:28.920 Dear Diary entry of full fear.
00:01:31.640 You ready?
00:01:32.400 Yeah, I got it.
00:01:33.180 I'll bring it up when you read it, and I'll bring it up in two seconds.
00:01:35.780 Now, I was going to splash like, you know, fake tears on me
00:01:40.320 because I'm so scared.
00:01:41.300 So I'm going to read it for your fans.
00:01:43.400 Here we go.
00:01:43.940 Okay, ready?
00:01:46.040 Today, I stand before you as a fearful Jew of color. 0.85
00:01:50.620 Now that Trump has won, I realize that my days as a free man are numbered.
00:01:55.900 Also, my wife, who self-identifies as a man,
00:02:00.140 is fearful that he will be sent to a trans gulag.
00:02:05.420 Because my biological female wife is a man, 1.00
00:02:10.060 this implies that we are in a same-sex gay marriage. 1.00
00:02:13.340 I know that Trump is going to commit a genocide on gay people. 0.99
00:02:17.600 So I worry that if we were to ever go to the U.S., 0.99
00:02:21.360 we will be rounded up into extermination camps.
00:02:25.020 Finally, as a behavioral scientist and professor of 30 plus years,
00:02:30.300 I know that Trump is anti-science.
00:02:33.040 Hence, I'm likely going to be fired from an academic job
00:02:36.160 because he plans on eradicating science.
00:02:38.180 I'm looking at the possibility of starting a falafel food truck with my friend Ahmed,
00:02:46.380 but I worry that Trump is going to round up all Muslims.
00:02:50.320 Finally, my husband, meaning my biological female wife who identifies the man, 0.95
00:02:55.960 might wish to have unlimited abortions. 0.99
00:02:58.640 And I fear that he won't be able to do that anymore.
00:03:02.180 Please help me.
00:03:03.840 I'm so scared.
00:03:05.300 And then I put in brackets,
00:03:07.120 it's really surprising that Trump won on Tuesday.
00:03:11.280 Do you buy my story of full fear?
00:03:13.820 Do you feel my fear?
00:03:15.540 Well, the funny thing is,
00:03:16.940 if I didn't know you,
00:03:18.900 I would not necessarily be a thousand percent certain that that is satire and parody.
00:03:23.300 I mean, let's start from ground zero here,
00:03:29.020 like the step one.
00:03:30.240 People call mass formation psychosis a conspiracy theory
00:03:35.240 or something that doesn't exist.
00:03:37.100 What are your credentials for those who may not know you?
00:03:40.580 Well, I have a bachelor's of science and mathematics, computer science.
00:03:45.260 I have an MBA with a thesis in operations research.
00:03:48.440 I have an MS in science.
00:03:50.060 I mean, officially it's in management,
00:03:52.560 but it's in behavioral sciences.
00:03:54.320 And my PhD is in psychology of decision-making
00:03:56.920 with one of the leading cognitive psychologists in the world.
00:04:00.400 He just recently retired.
00:04:01.800 I've been a behavioral scientist for 30 plus years.
00:04:04.860 I pioneered the field of evolutionary consumer behavior.
00:04:08.740 So I know one or two things about human behavior.
00:04:11.960 So let's start with the number one thing.
00:04:14.280 Mass formation psychosis,
00:04:15.660 which we're told doesn't exist as a conspiracy theory.
00:04:18.640 Oddly enough,
00:04:19.380 the authorities that fact checkers use to debunk it
00:04:22.760 are behavioral science,
00:04:24.540 or they're so-called behavioral group psychologists,
00:04:27.200 which to me,
00:04:28.500 group behavior is mass formation psychosis by another name.
00:04:32.600 Your professional opinion, Doc,
00:04:35.660 does it exist or does it not exist?
00:04:37.500 And what is it in reality?
00:04:39.320 It does.
00:04:39.920 Look, it's hardly the first time
00:04:42.540 that the manifestation of this kind of unhinged collective irrationality
00:04:47.780 has taken place, right?
00:04:49.360 What makes it unique?
00:04:50.880 So, for example, oftentimes,
00:04:52.380 so to your question,
00:04:53.320 and I'll just add to it,
00:04:54.300 many times people ask me,
00:04:55.660 so what you refer to in your book
00:04:57.800 as parasitic idea pathogens,
00:05:00.480 is this a novel thing?
00:05:02.320 So to your question,
00:05:03.220 is mass psychosis something new?
00:05:05.940 It's not new,
00:05:07.280 but the specific parasitic ideas are new.
00:05:10.280 So to draw an analogy,
00:05:12.580 it's not as though viruses didn't exist prior to COVID,
00:05:17.420 but COVID was a new virus, right?
00:05:19.780 So viruses have always existed,
00:05:21.500 but there's now a new manifestation.
00:05:23.320 In 1918,
00:05:24.380 there was the Spanish flu,
00:05:25.920 another virus, okay?
00:05:27.240 So the problem arises today
00:05:29.980 that over the past 50 to 100 years,
00:05:33.100 there is a set of dreadful ideas,
00:05:36.520 all of which were spawned on university campuses,
00:05:39.780 and it took a while to proliferate
00:05:42.040 outside of the ivory tower,
00:05:44.020 and what we're seeing now,
00:05:45.920 in part,
00:05:46.520 which you call mass psychosis,
00:05:48.380 I call, you know,
00:05:49.620 parasitic idea pathogens,
00:05:51.920 is the result of that indoctrination
00:05:55.000 that has been slowly taking place
00:05:57.420 for 50 to 100 years.
00:05:59.080 The reason why I say 50 to 100,
00:06:00.620 because depending on the parasitic idea,
00:06:03.860 you can put a stamp of,
00:06:06.680 it's about 100 years ago.
00:06:07.860 So say, for example,
00:06:08.620 cultural relativism,
00:06:10.260 the idea that all cultures are equal
00:06:12.360 and who are we to judge the values
00:06:13.980 of another culture and so on,
00:06:15.580 that came from Franz Boas,
00:06:17.660 a Columbia University professor,
00:06:19.660 about 100 years ago.
00:06:20.860 On the other hand,
00:06:21.780 postmodernism is a movement
00:06:23.780 that started about 50 years ago or so.
00:06:26.580 So between 50 and 100 years
00:06:28.280 is when all the lunacy really started.
00:06:30.640 And now,
00:06:31.760 like,
00:06:32.160 I can only think of,
00:06:33.280 like, two modern areas.
00:06:34.160 You think of Nazi Germany
00:06:35.640 is the classic paradigm
00:06:37.020 of when society descends into madness.
00:06:38.740 You can think of Mao's revolution
00:06:40.520 or Stalinism in general.
00:06:42.620 And some people will say,
00:06:43.520 well, that's not,
00:06:44.100 that's not a form of mass group hysteria
00:06:46.580 or that's more living under tyranny
00:06:49.760 and everyone's too scared to do whatever.
00:06:51.160 But there were true believers
00:06:52.320 in all of those,
00:06:53.780 in as much as we're seeing
00:06:54.520 true believers today.
00:06:55.480 I mean, I guess,
00:06:58.280 so nothing new.
00:06:59.700 We're tapping into something
00:07:00.480 of human condition
00:07:01.640 of, I would call,
00:07:02.380 the weak spirits,
00:07:03.260 the weak minds.
00:07:04.340 Exactly.
00:07:05.020 So to take your example
00:07:06.780 of, say, the Soviet Union,
00:07:08.460 and I referenced this 0.97
00:07:09.600 in the parasitic mind, 0.78
00:07:11.340 Lysenkoism, okay, 1.00
00:07:13.800 was a scientific theory
00:07:16.500 named after Lysenko,
00:07:19.240 who was the head of science
00:07:20.720 under Stalin.
00:07:21.980 They believed that
00:07:23.900 the laws of heredity,
00:07:25.640 the laws of genetics,
00:07:27.160 were not consistent
00:07:28.780 with Marxist philosophy.
00:07:31.660 And therefore,
00:07:32.480 they purported
00:07:34.120 that there is another
00:07:35.260 more veridical genetic theory,
00:07:37.820 which then became known
00:07:39.440 as Lysenkoism.
00:07:41.220 Without getting into
00:07:42.400 the technical weeds,
00:07:43.420 it is based on something
00:07:44.600 in evolutionary theory
00:07:45.760 called Lamarckism,
00:07:47.520 which is a theory
00:07:48.340 of acquired traits,
00:07:49.380 which is not how
00:07:50.620 evolutionary mechanisms operate,
00:07:52.860 at least not when it comes
00:07:53.900 to, say, crossing crops,
00:07:56.540 when you're developing
00:07:57.920 an agricultural program.
00:07:59.060 But they thought
00:08:00.220 that that theory
00:08:01.280 was more in line
00:08:02.380 with their Marxist philosophies,
00:08:04.020 so they built
00:08:04.880 their food programs
00:08:06.060 and their agricultural programs
00:08:07.960 in line with this
00:08:09.320 false scientific theory,
00:08:11.580 which then led to
00:08:12.820 the death by starvation
00:08:15.060 of 20 to 30 million people.
00:08:16.980 The reason why
00:08:17.640 I tell that story
00:08:18.580 is because it demonstrates
00:08:20.160 that it's not only,
00:08:22.040 you know,
00:08:22.700 the uneducated
00:08:24.240 that are parasitized 0.84
00:08:25.580 by these weak minds, 0.79
00:08:26.660 as you said. 0.67
00:08:27.560 To the contrary,
00:08:28.700 it's usually
00:08:29.440 the intellectuals
00:08:31.340 who come up
00:08:32.240 with those dreadful ideas,
00:08:34.260 and then because
00:08:35.220 of their imprimatur
00:08:36.620 as experts,
00:08:37.920 they can then pass
00:08:39.000 on those theories 1.00
00:08:40.000 to all of the imbeciles 1.00
00:08:41.800 out there. 1.00
00:08:42.960 It's, like,
00:08:44.040 Lysenkoism 1.00
00:08:44.640 is kind of amazing,
00:08:45.740 especially when
00:08:46.200 you're talking about farming.
00:08:47.240 It's almost as absurd,
00:08:49.700 but not quite as absurd,
00:08:51.200 or I guess,
00:08:51.940 you've seen Idiocracy,
00:08:53.280 correct?
00:08:54.320 I know of it.
00:08:55.500 I didn't see it
00:08:55.880 from start to finish.
00:08:56.900 You absolutely must,
00:08:58.040 because there's a part
00:08:59.320 of it where they're
00:08:59.700 giving Brawndo,
00:09:00.540 you know,
00:09:00.720 plants need electrolytes.
00:09:02.200 It's got electrolytes,
00:09:02.980 so they're feeding plants
00:09:04.040 basically Gatorade,
00:09:05.400 the movie version of Gatorade,
00:09:06.660 and all the plants are dying,
00:09:07.720 and the guy's like,
00:09:08.120 why are you giving
00:09:08.740 the plants Gatorade?
00:09:10.340 And they say,
00:09:10.640 well, it's got electrolytes.
00:09:12.500 It's like,
00:09:12.800 do you know what
00:09:13.240 electrolytes do?
00:09:14.220 And they say,
00:09:15.140 they just need water.
00:09:16.000 And he's like,
00:09:16.520 the water growing
00:09:17.200 out of the toilet?
00:09:18.180 The water you have
00:09:18.880 in the toilet?
00:09:19.340 And he's like,
00:09:19.720 yeah,
00:09:19.920 he's like,
00:09:20.320 well,
00:09:20.660 Mr. Smarty Pants,
00:09:21.740 why don't I see plants
00:09:22.660 growing out of the toilet then?
00:09:23.780 And it's that level
00:09:24.880 of insanity
00:09:25.500 that you have
00:09:26.500 when the science
00:09:27.020 doesn't work,
00:09:27.820 you make up
00:09:28.580 some sort of excuse,
00:09:29.540 political excuse for it,
00:09:30.520 and lo and behold,
00:09:31.780 they had the dust bowl 0.99
00:09:33.140 in Idiocracy, 0.95
00:09:33.880 and they had a famine
00:09:34.540 that killed 20 million people
00:09:35.840 as a result
00:09:36.840 of Lysenkoism.
00:09:39.180 However,
00:09:39.920 you're in the thick
00:09:40.920 of Canada,
00:09:41.780 where I see people
00:09:43.020 virtue signaling
00:09:43.780 the election of Trump,
00:09:45.180 not understanding
00:09:46.200 it was a decisive
00:09:47.340 landslide
00:09:48.180 majority vote,
00:09:50.000 a popular vote victory, 0.99
00:09:51.540 and yet these jackasses 1.00
00:09:52.740 think they know better 0.99
00:09:53.640 than the people
00:09:54.140 for themselves.
00:09:55.080 What's the mood
00:09:55.640 up in Canada?
00:09:56.180 Well,
00:09:57.340 it's exactly
00:09:58.100 what you would expect
00:09:59.520 from a super-progressive country,
00:10:02.580 which is that
00:10:04.020 this demonstrates,
00:10:06.080 this only solidifies
00:10:07.420 our view,
00:10:08.480 I'm speaking now
00:10:09.240 as the progressive Canadian,
00:10:10.820 not as Gad Saad,
00:10:12.060 that,
00:10:12.520 you know,
00:10:13.060 our southern neighbors
00:10:14.480 are truly ruled 0.99
00:10:16.500 by insane, 1.00
00:10:18.080 degenerate, 1.00
00:10:19.040 toothless, 1.00
00:10:20.200 maniacs 0.99
00:10:20.940 who sleep 1.00
00:10:21.480 with their sisters,
00:10:22.280 right?
00:10:22.480 I mean,
00:10:22.740 what else could,
00:10:23.940 what else could explain
00:10:25.580 that Trump would come to,
00:10:27.300 but by the way,
00:10:27.860 I,
00:10:28.860 yes,
00:10:29.100 of course,
00:10:29.480 I've got a talent
00:10:30.300 to use,
00:10:30.960 you know,
00:10:31.180 hyperbolic humor
00:10:32.080 and satire,
00:10:33.000 but what makes
00:10:34.420 that satire
00:10:35.380 so powerful
00:10:36.060 is that it really
00:10:36.880 is a very accurate
00:10:38.320 description.
00:10:38.940 As you said,
00:10:39.580 when I first read
00:10:40.700 my tweet,
00:10:41.140 and you said,
00:10:41.460 I wouldn't be able
00:10:42.220 to know if I didn't
00:10:42.960 know you,
00:10:43.520 if it's real or fake,
00:10:44.700 if you're being satirical
00:10:45.600 or not,
00:10:46.700 the kind of hysteria
00:10:48.460 you see,
00:10:49.320 Viva,
00:10:50.340 you know,
00:10:51.120 it offends me
00:10:52.240 to my core,
00:10:53.060 so I,
00:10:53.760 and again,
00:10:54.780 this is not some
00:10:56.360 misinformed,
00:10:57.860 quote,
00:10:58.520 low information voter
00:10:59.820 that's espousing
00:11:00.840 those positions.
00:11:01.800 Those are my colleagues,
00:11:03.760 many of whom
00:11:04.700 have all of the titles
00:11:06.900 and accolades
00:11:07.660 that you could hope
00:11:08.300 to have.
00:11:08.960 By the way,
00:11:10.180 well,
00:11:10.680 speaking of which,
00:11:11.740 Gad,
00:11:11.860 I mean,
00:11:12.000 I went down your feed
00:11:13.060 here also.
00:11:13.660 I mean,
00:11:14.120 we come across
00:11:15.380 similar stuff,
00:11:16.260 but this is,
00:11:16.940 this is a real thing,
00:11:18.100 by the way,
00:11:18.340 that you retweeted.
00:11:19.220 You said,
00:11:20.000 let's see if I can get
00:11:20.580 this here.
00:11:21.180 In my entire academic career,
00:11:22.420 I have never incorporated
00:11:23.400 political issues
00:11:24.140 into my pedagogic
00:11:25.120 responsibilities.
00:11:26.200 Check this out.
00:11:26.880 What happens if someone
00:11:27.600 in this person's lab
00:11:28.520 was a fervent supporter
00:11:29.680 of Donald Trump?
00:11:30.960 Academia needs to be purged
00:11:31.980 of this ideological rapture.
00:11:33.940 It is a cancer
00:11:34.740 that I've been warning you
00:11:36.060 about for decades.
00:11:37.080 Read this full,
00:11:37.820 read this,
00:11:38.540 or please read
00:11:39.040 The Parasitic Mind
00:11:39.840 and make sure you get
00:11:40.660 suicidal empathy
00:11:41.260 when it comes out
00:11:41.820 for pre-ordering.
00:11:42.540 I will,
00:11:43.040 but I gotta bring up
00:11:44.080 this tweet that you,
00:11:45.200 that you brought up.
00:11:45.980 There you go.
00:11:46.300 I sent this to my lab
00:11:47.480 this morning.
00:11:47.940 This is from a neuroscientist.
00:11:50.520 This is from academia.
00:11:52.340 This woman,
00:11:52.880 presumably,
00:11:53.800 or man,
00:11:54.380 I don't know who it is,
00:11:55.220 presumably reads,
00:11:56.220 grades other people's papers.
00:11:57.480 And if they know
00:11:58.340 their politics,
00:11:59.000 I would not expect
00:11:59.880 a fair review.
00:12:01.300 This is from what
00:12:01.900 the person wrote
00:12:02.560 to their lab.
00:12:03.100 Hi all,
00:12:03.740 well,
00:12:04.080 that didn't go well.
00:12:05.200 That didn't go well.
00:12:06.080 Sorry,
00:12:06.600 general population vote.
00:12:08.840 Doesn't matter.
00:12:09.820 That didn't go well.
00:12:10.420 Between the snow
00:12:11.300 and an election result,
00:12:13.480 that is to say
00:12:14.500 the least incredibly disheartening,
00:12:16.240 please feel free
00:12:17.220 to take a day or two
00:12:18.800 if you need time.
00:12:20.700 I want to say
00:12:21.260 that even though 0.91
00:12:21.700 America is terrible 0.96
00:12:22.760 and, 0.98
00:12:23.520 at least at this moment,
00:12:24.720 doesn't value us
00:12:25.820 or our identities,
00:12:27.440 you are all great people
00:12:28.560 and I appreciate you all,
00:12:30.600 except those who voted for Trump,
00:12:31.860 for your uniqueness
00:12:32.780 that makes our lab
00:12:33.960 a remarkable group
00:12:34.820 and we are a better place,
00:12:36.640 we are a better people
00:12:37.400 and scientists for it.
00:12:38.800 I am happy
00:12:39.680 that all of you
00:12:40.540 are part of the lab
00:12:41.740 and I will do everything
00:12:42.720 I can to support our group
00:12:43.820 and you each individually
00:12:44.940 for whatever comes
00:12:45.920 down the line.
00:12:46.760 I will be in today
00:12:48.060 and tomorrow.
00:12:49.340 I am WFH,
00:12:51.120 what does that mean,
00:12:51.440 working from home
00:12:52.100 on Friday
00:12:52.780 and available
00:12:53.460 for regular meetings.
00:12:54.380 We can complain bitterly
00:12:55.340 and cry together
00:12:56.000 for those who are scheduled today.
00:12:57.500 I don't know
00:12:57.940 if this person's in America
00:12:58.860 or Canada.
00:12:59.920 They talk as though
00:13:01.040 statistically 50%
00:13:02.940 of their lab
00:13:03.460 wouldn't be Trump supporters,
00:13:04.840 which means that
00:13:05.560 there is some ideological purge
00:13:07.260 that this person
00:13:07.940 is banking on
00:13:08.600 in academia.
00:13:10.360 Well, there isn't,
00:13:11.160 by the way.
00:13:11.900 I don't know either
00:13:12.780 if they're in Canada
00:13:13.700 or the U.S.,
00:13:14.360 but it isn't 50-50,
00:13:16.520 right?
00:13:16.800 Because as I provide
00:13:19.500 endless evidence 0.98
00:13:21.060 in the parasitic mind 0.94
00:13:22.120 of studies
00:13:23.200 that have looked
00:13:23.900 at the political affiliation
00:13:25.980 of professors
00:13:26.940 and, you know,
00:13:27.800 academia,
00:13:29.180 I mean,
00:13:30.100 across all disciplines,
00:13:31.680 it can vary
00:13:32.580 from 5 to 1
00:13:33.860 to 11 to 1,
00:13:35.680 which is certainly
00:13:36.660 unbelievably lopsided.
00:13:37.900 But in activist disciplines,
00:13:40.400 in anthropology
00:13:41.420 and in sociology
00:13:42.320 and in ethnic studies
00:13:43.900 and in communications
00:13:45.320 and so on,
00:13:46.200 it could be
00:13:47.120 130 to 0.
00:13:49.760 In other words,
00:13:50.540 you're more likely
00:13:51.720 to run
00:13:52.400 into a unicorn
00:13:54.100 or, you know,
00:13:55.000 a horse
00:13:56.020 with wings.
00:13:57.880 I mean,
00:13:58.240 literally,
00:13:59.220 than you are
00:14:00.060 to run
00:14:00.600 across a Republican
00:14:03.020 professor
00:14:03.640 in communications
00:14:05.000 or almost
00:14:06.100 in psychology.
00:14:06.920 So is that
00:14:08.660 a good thing?
00:14:09.400 I mean,
00:14:09.680 does it make,
00:14:10.280 I mean,
00:14:11.080 I explain to people,
00:14:12.140 look,
00:14:12.400 there are certain
00:14:13.220 issues in academia
00:14:14.680 that are true
00:14:16.560 irrespective
00:14:17.640 of your political orientation.
00:14:19.060 So, for example,
00:14:20.360 you know,
00:14:20.680 tectonic plate dynamics
00:14:22.960 are true or not
00:14:24.480 whether you're
00:14:25.180 Republican or Democrat.
00:14:27.020 The theory of evolution
00:14:28.160 has been verified
00:14:29.480 in endless ways
00:14:30.520 and it doesn't matter
00:14:31.200 whether you're
00:14:31.660 Republican or Democrat.
00:14:32.760 On the other hand,
00:14:33.800 on the other hand,
00:14:34.380 if I'm teaching
00:14:35.080 about the ethics
00:14:36.380 and morality
00:14:37.040 of the death penalty,
00:14:38.760 there are very
00:14:39.680 compelling arguments
00:14:40.760 that could be made
00:14:41.720 by both sides
00:14:42.880 of that issue,
00:14:44.160 right?
00:14:44.640 And so I would
00:14:45.320 certainly stand
00:14:46.300 to benefit
00:14:46.880 if I'm able
00:14:47.680 to hear those
00:14:48.560 that may not share
00:14:49.740 my views
00:14:50.260 on whatever
00:14:51.700 fiscal policy
00:14:52.800 that we're discussing
00:14:53.640 or foreign policy
00:14:55.880 or death penalty
00:14:57.040 and so on and so forth.
00:14:58.280 So in some disciplines,
00:14:59.980 you genuinely
00:15:01.160 are murdering 1.00
00:15:03.000 and raping 1.00
00:15:03.700 the pursuit
00:15:04.400 of truth
00:15:05.220 if you presume
00:15:06.460 that there is only
00:15:07.380 one right position
00:15:09.080 to take.
00:15:09.920 And so we are
00:15:10.500 cheating our students
00:15:11.760 and their intellectual
00:15:12.640 development
00:15:13.320 as this neuroscientist
00:15:15.240 is doing
00:15:15.680 by presuming
00:15:16.900 that anybody
00:15:17.700 who's in her lab
00:15:18.920 surely must be
00:15:20.320 of the same
00:15:20.980 political opinion.
00:15:21.680 It's grotesque.
00:15:23.540 I had,
00:15:24.080 there was another
00:15:24.380 beautiful one,
00:15:25.040 Gad,
00:15:25.080 I'm not sure
00:15:25.460 if you saw this.
00:15:26.620 When you say
00:15:27.460 like what they think
00:15:28.280 of Americans,
00:15:29.080 let me just make sure
00:15:29.560 my,
00:15:30.100 this is,
00:15:31.080 well,
00:15:31.400 she's a Democrat strategist 0.99
00:15:32.580 so that's fair enough 1.00
00:15:33.740 that she'll be an idiot. 1.00
00:15:34.800 White men 1.00
00:15:35.420 without college degrees 1.00
00:15:36.840 are going to ruin
00:15:37.780 this country.
00:15:38.760 Yes.
00:15:40.040 I mean,
00:15:40.480 think of,
00:15:41.300 think of,
00:15:42.140 I mean,
00:15:43.580 Elon Musk,
00:15:44.700 total uneducated
00:15:46.840 guy who's never
00:15:48.400 done anything
00:15:49.040 in his life
00:15:49.760 who's the biggest
00:15:50.660 supporter
00:15:51.200 of Trump.
00:15:54.440 You know,
00:15:54.760 Gad Saad
00:15:55.360 has,
00:15:56.180 you know,
00:15:56.400 a very,
00:15:57.020 very long list
00:15:57.920 of degree,
00:15:58.640 you know,
00:15:59.260 the diplomas
00:16:00.060 and degrees
00:16:00.680 and all sorts
00:16:01.760 of titles.
00:16:02.520 Again,
00:16:03.120 just an uneducated 0.99
00:16:04.360 white guy,
00:16:04.900 although I'm Lebanese
00:16:05.880 so I'm a Jew
00:16:06.680 of color.
00:16:08.180 J-O-C, 1.00
00:16:09.260 I've never,
00:16:10.140 they're like that,
00:16:10.700 huh?
00:16:11.240 You've got Bill Ackman,
00:16:12.560 another guy who's
00:16:13.360 never done anything,
00:16:14.320 Douglas Murray,
00:16:14.940 you've got,
00:16:15.480 here's another
00:16:16.000 uneducated 0.98
00:16:17.000 white guy, 0.99
00:16:18.320 the Somali
00:16:19.340 woman, 1.00
00:16:20.300 Ayaan Hirsi Ali
00:16:21.540 is also
00:16:22.900 an uneducated 0.91
00:16:23.860 white guy. 0.80
00:16:25.200 So,
00:16:25.820 it's,
00:16:26.100 you know,
00:16:26.440 it's such
00:16:27.780 an affront,
00:16:29.000 you know,
00:16:29.440 as you,
00:16:30.300 you know me
00:16:31.180 enough to know this
00:16:32.120 but maybe some of
00:16:32.960 your viewers
00:16:33.520 and listeners
00:16:33.900 don't,
00:16:34.740 in Chapter 1
00:16:35.660 of the parasitic mind, I basically state that the two fundamental ideals that drive and guide my
00:16:42.580 life trajectory are truth and freedom. So, you know, I'm perfectly happy if I do something wrong
00:16:50.840 to stand up and say, hey, I'm really sorry, Viva. I know I showed up 20 minutes late and please
00:16:56.260 forgive me, right? It takes humility, right? But that's what you need if you're going to function
00:17:01.800 properly in society. You admit when you're wrong, you stand proudly when you think you're right,
00:17:07.080 but you regulate your behavior so that you always exhibit epistemological humility.
00:17:12.160 Those folks simply could never do so, right? It doesn't matter how much evidence I show you
00:17:17.760 that Donald Trump's tent is the most inclusive tent that one could have ever imagined. It's still only
00:17:25.240 uneducated, hick, white people who are voting for him. 0.99
00:17:29.860 Look, you're going to love this. Now, in fairness to this, I don't know if it was from 2016 or 2024, 0.97
00:17:36.820 but it doesn't matter. With the imagery that you see here, look at this. I'm going to turn the volume
00:17:41.880 down so I don't get copy-claimed on this. This is European politicians, from what I understand,
00:17:47.960 sitting there with their arms crossed. Like, this has to be, there has to be some human evolution
00:17:52.240 thing about this particular stance that makes it repulsive. But you've got foreigners standing 1.00
00:18:00.280 like scorned housewives or angry mothers looking at their kids, and that is the attitude that they
00:18:06.640 have to people who voted en masse against them. And what blows my mind, now I've been breaking down
00:18:12.800 some of the stats. Set aside, it's a popular vote victory. The number may come down a little bit
00:18:18.200 because they're still counting California. 45% of Latinos voted for Trump. Record numbers with
00:18:25.800 black men. Record numbers with black women, but still, you know, 20%. And what they're basically 0.99
00:18:31.960 saying, I try not to draw these hyperbolic comparisons and call them, you know, like,
00:18:36.260 this is what slave owners used to do back in the day. But the treatment is that of a slave owner,
00:18:40.360 where it's like the minorities don't know what's good for them. They need the woman standing there 0.96
00:18:45.320 with her arms crossed, telling them how to think, how to vote and what to do. And when they deviate
00:18:49.640 and when they veer too far from the plantation, then they say, you better come back because we're
00:18:53.940 the ones telling you what to do. You don't get to think for yourself. But this, is there an
00:18:58.360 evolutionary reason for why that gives you? I mean, not, I don't know if I can make an intelligent
00:19:04.700 comment about specifically the scornful nonverbal stance. But I think in terms of an existential
00:19:11.420 stance, it basically is saying, you either believe as I do, or there is something morally
00:19:19.020 diseased about you. And therefore, that's why I must look at you with such scorn, right? Because
00:19:24.720 I am part as Thomas Sowell. Oh, Thomas Sowell, who's another uneducated white man, right? For those of
00:19:32.200 you who don't understand the satire, he's a black man, arguably one of our, the best intellectuals that
00:19:38.120 the United States has ever had. He's an economist. But apparently, he's a white, uneducated man, 0.94
00:19:43.800 he certainly would not be supporting Kamala and her degenerates. But that's, that's the reality, 0.98
00:19:50.620 certainly in academia. I mean, that kind of stance is exactly the world that I have inhabited for well
00:19:57.040 over three decades. I'll give you a great story. I mean, people learn a lot from storytelling. I mean,
00:20:03.080 I could give you all of the fancy academic stuff. But what sticks is when you back it up with actual
00:20:08.940 vivid narratives. Okay. So in 2023, I was invited to be one of the plenary speakers at a one day
00:20:17.680 conference at USC, you know, a prestigious university in Southern California. It was in celebration of,
00:20:25.880 I think, the 10 year anniversary of a particular center at USC. And it was a day on, you know,
00:20:31.320 are enlightenment values still, you know, de rigueur? Are they still valid? And so on. So I gave a talk
00:20:39.080 at that conference, and you can watch my talk on my channel, on deontological versus consequentialist
00:20:47.060 ethics. And if you want, just as a small tangential parenthesis, let me just explain what that is.
00:20:52.520 So deontological ethics are absolute statements of ethical positions. It is never okay to lie,
00:20:58.820 would be a deontological statement. Consequentialism would be judging the moral and ethical standards
00:21:06.880 of a particular action based on its consequences. So if I say, well, it's okay to lie, to spare my wife's
00:21:14.460 feelings, if she asks me, do I look good in that dress or not? Well, then that would be consequentialist.
00:21:19.500 And as I tried to explain in a very professorial manner at the talk, it's a very serious talk,
00:21:24.820 you know, no, no gad humor, very sober. I said, look, for many, many things, it makes perfect sense
00:21:31.160 for us to put on a consequentialist hat. But for certain foundational principles, by definition,
00:21:37.000 they are deontological. So if you say things like, I believe in freedom of speech, but, and the but is
00:21:43.800 just a bunch of consequentialist calculations, then you are an imbecile who doesn't believe in freedom 1.00
00:21:50.000 of speech. And I gave many examples to highlight this point, only one of which was the Trump example 0.99
00:21:57.440 when he was booted out of what was then Twitter. And I said, it can't make sense that you believe
00:22:04.300 in freedom of speech, but not for Donald Trump. Uniquely, there is an asterisk in this deontological
00:22:10.940 principle that says Donald Trump is not afforded that deontological right. That was my only contribution
00:22:17.080 to Trump in that bigger lecture that I can remember. If you saw the Q&A period after,
00:22:24.880 which by the way, even though they promised that they would send me, they decided not to send it
00:22:29.740 because they realized that once I would advertise it on my large platform, it would make them look
00:22:36.100 really bad. They said, oh, well, we can't send it because we didn't get signed clearance from all
00:22:41.420 the audience members that the clip could be released. It was a public event. It didn't follow
00:22:46.480 any secrecy laws. But people started getting up and they were unhinged in their hysteria. By the way,
00:22:54.540 my wife and kids were with me in that room. And my children, who are still very young, came up to me
00:23:00.140 and said, Daddy, why are people screaming like that at you? It's insane. One of them, by the way,
00:23:05.840 so to our point about mass psychosis, said to me, it makes perfect sense that the government
00:23:12.320 would regulate some of your speech if it is dangerous and corrosive. Speaking specifically
00:23:18.360 to me, I said, can you give me an example of something that I said in today's event that you
00:23:24.760 would consider to be, that would require the government to step in and stop? He said, well,
00:23:31.480 for example, by the way, when he first began his Q&A, he had to say what his identity is.
00:23:37.500 I'm a Latino. I like when they put the extra T. I'm a Latino. I'm a gay Latino man who studies sex, 0.85
00:23:45.760 whatever. He said to me, when you said that men do not, cannot bear children and that men 0.97
00:23:52.060 don't menstruate, that should be regulated. So then I paused and in my inimitable Gad style, 0.90
00:23:59.660 I said, let me just repeat this to see if I understood. You think that when it comes to the
00:24:04.460 issue of whether men menstruate and can bear children, I'm on the wrong side of that issue?
00:24:11.700 And then you can sort of hear uncomfortable murmur. That happened not at a psychiatric ward,
00:24:17.860 it happened at USC. That's University of Southern California. Yes, sir.
00:24:23.600 It's officially crazy. The thing is, though, I think it's officially crazy. The idea was that
00:24:35.000 denying a biological fact qualifies as the requisite degree of real-life violence,
00:24:40.160 such that it would warrant censorship online. Exactly. And it's all of those parasitic ideas. So
00:24:49.240 in The Parasitic Mind, I was trying to come up with a universal explanation for why each of those
00:24:56.720 parasitic ideas are so catchy. Like, what is it that could make someone actually believe this nonsense? 0.73
00:25:03.480 So in the same way, and let me draw an analogy. Different cancers have different trajectories,
00:25:09.560 but the singular thing that they have in common is unchecked cell division. So at least we can agree
00:25:14.920 that that is a commonality across the different cancers. Okay. So what is common across all of these
00:25:21.320 parasitic ideas, postmodernism, cultural relativism, social constructivism, radical feminism,
00:25:27.180 it's that they all start off with a noble objective. And then in the service of that noble objective,
00:25:35.140 if you have to rape and murder truth in the service of that objective, so be it. So it's a
00:25:41.300 consequentialist argument, right? So I'm a really super empathetic and kind person. I really love
00:25:48.520 trans people, and I really don't want them to ever have their feelings hurt. So in the service of that
00:25:54.780 goal, well, the reality is most of us don't care that you're trans and want you to live a life that's
00:26:01.440 fully dignified and void of bigotry. But in the service of that goal, I'm not going to nod my head and
00:26:07.720 say, oh, yeah, yeah, men too can menstruate. Oh, yeah, yeah, men too can bear children, because I want
00:26:13.220 to protect your unique personhood, right? That's what I tried to explain to the degenerates in the
00:26:19.760 Canadian Senate in 2017, when I went up as a as an expert witness to say that there are dangers in
00:26:28.100 denying reality. It starts off with a noble empathetic reflex, but then it metamorphoses
00:26:35.640 into pure bullshit. So that's what's common across all that lunacy. 0.99
00:26:40.080 In the context of your studies, have you done any breakdown as to like what the
00:26:44.300 proportion of true believers are, what the proportion of fake believers are, but who love the power that
00:26:50.000 it gets them, and what the proportion of actually mentally ill individuals who are legally not able to
00:26:56.120 consent among this calculation? Yeah, that's a fantastic question. So I don't have the empirical
00:27:01.220 evidence to answer that question. But I've often questioned it myself in terms of in the deep
00:27:08.960 recesses of their minds, when they go to bed at night and lay their heads on the pillow, do they
00:27:14.060 genuinely believe that? And I suspect that many of them do, because it is a way to protect their
00:27:22.980 egos. It's an ego-protective mechanism: that I am a truly orgiastically empathetic and kind person, 0.98
00:27:31.980 which by the way, not to engage in promotional plugging, but that's exactly the point of my next
00:27:37.520 book, right? Suicidal Empathy. Because what The Parasitic Mind did is it said, here is what happens
00:27:45.220 when your cognitive system is parasitized, right? But we also have an emotional system, right? We are both
00:27:52.360 a thinking and feeling animal. So The Parasitic Mind is about how parasitic ideas distort your ability
00:28:01.420 to think clearly and to navigate reality clearly. And then in the follow-up with Suicidal Empathy,
00:28:08.320 well, here is what happens once your emotional system is hijacked. And so now you have, if you like,
00:28:14.320 the full, the full story.
00:28:17.060 I'm going to play you, have you seen, you saw Jimmy Kimmel's tearful, tearful statement.
00:28:22.280 Oh, I actually retweeted it. He was so empathetic.
00:28:25.180 Here, let me bring this up. Because this is where, like, I think there's, there's people who exploit this
00:28:29.680 for ego and clout. There's others who exploit it for actual malice. Like, within the online, you know,
00:28:36.560 you must admit that men can shower with, with, with girls, et cetera. There are those who are
00:28:41.920 genuinely mentally unwell, like DSM-5 clinically diagnosable. There are other sociopaths who want
00:28:48.220 to do it because they are deep misogynist and want to abuse women. And then there's others who just 0.98
00:28:52.180 want to do it because it's a lever for control and influence. Watch Jimmy Kimmel because a lot of
00:28:56.480 people were observing. He's not crying out of any form of sincere empathy. He's crying because on the
00:29:01.520 one hand, maybe he doesn't feel that he has any influence anymore because nobody listened to his wise
00:29:05.560 advice. Where's it? Where's the audio here? Hold on a second. Getsky spouses. Did I put the,
00:29:09.860 oh yeah, there we go. Put it down there. Listen to Jimmy Kimmel's crocodile tears.
00:29:13.140 Let's be honest. It was a terrible night last night. It was a terrible night for women, 1.00
00:29:17.360 for children, for the hundreds of thousands of, of hardworking immigrants who make this country go. 1.00
00:29:23.620 Oh, oh, oh, oh, oh, for science, for journalism, for justice, for free speech. It was a terrible
00:29:35.260 night for poor people, for the middle class, for seniors, for our allies. Who overwhelmingly voted
00:29:39.560 for Trump. For our allies in Ukraine. For NATO. Go fight, Jimmy. For the truth. And democracy and decency.
00:29:50.260 And it was a terrible night for everyone who voted against him. And guess what? It was a bad night for
00:29:54.500 everyone who voted for him too. You just don't realize. You had a little too much to think there,
00:29:58.760 middle class who voted overwhelmingly for Trump. What do you, like, do you think that they're
00:30:03.820 finally losing whatever control they had over directing narratives and controlling public
00:30:09.760 influence? They are. I mean, certainly they are. But by the way, to your earlier question about
00:30:14.000 which camp, let's say, would Jimmy fall in? I don't think he genuinely believes that stuff. I mean,
00:30:19.660 I think it's a combination of other factors. In the deep recesses of his mind, he does know that it's
00:30:26.160 full of shit. There's no, there's no conceivable way. And think of it this way. Or it's a way for him 1.00
00:30:31.920 to atone for some of the public stuff that he had done in the past. His, the black face, the coming
00:30:38.420 from behind the woman. Remember like he's doing as if he's, he's mounting someone. I can't remember
00:30:43.020 what it was. And so one of the ways that I can atone for all of that vulgar stuff that I did in my
00:30:49.860 youthful transgression is to now demonstrate that I truly am, as Thomas Sowell said,
00:30:56.160 part of the anointed ones. Go ahead. No, that's an amazing observation because
00:31:01.640 it's the same pattern that I've noticed with Howard Stern, with all of those who were the
00:31:06.680 biggest degenerates who are now trying to buy their, their way into the good graces of the
00:31:10.480 political elite. It's Angela Merkel, whose grandparents, I think, were Nazis. 0.89
00:31:16.920 What better way to demonstrate that I repudiate all of my grandparents' behaviors by now letting
00:31:26.140 in with suicidal empathy, millions of people from Muslim countries. Surely I couldn't be 1.00
00:31:32.540 a racist and bigot if I open up Germany to all of those folks. By the way, your grandparents 0.96
00:31:40.240 eradicated Jews from Germany by you letting in millions of people who come from cultures that are 0.97
00:31:49.220 defined by their Jew hatred. Tell me how Jewish life is going to, uh, play out in Germany, uh, dear 0.97
00:31:57.660 Dr. Merkel. Well, that was the ultimate irony for those who don't recall the Syrian 1.00
00:32:02.840 refugee crisis in 2015. And Merkel basically opened up the borders of all of Europe because there's 0.57
00:32:08.100 free travel within the countries in Europe, to Syrian refugees who typically are, for whatever
00:32:13.460 reason people might think (and there's good reason), hostile to Jews. And my remark at the time
00:32:19.120 was a lot of people were saying it's the making up for her, uh, Nazi history that she's overcompensating.
00:32:24.360 And I made the joke, like, how the hell do you know? Like if the one thing you
00:32:27.700 want to do is get rid of the Jewry of Europe, what better means is there than looking 1.00
00:32:32.100 benevolent while you do it? And then lo and behold, it was such a great exercise in assimilation and, 0.88
00:32:37.140 uh, immigration policies that you have to offer money to these Syrians to go back home and the 0.99
00:32:42.080 refugees to go back home. Um, again, it will on that subject. And we're going to come back and 1.00
00:32:48.640 forth. First of all, you don't, don't feel bad to plug the book. The, the, the new book is called
00:32:53.220 suicidal empathy or that's going to be within the title. Do you have a title?
00:32:56.320 Yeah. So that's, that's the tentative title. So it's a, it's a term, uh, that I coined to explain
00:33:02.840 this orgiastic emotional malady. And, uh, actually I'll even tell you the background of how the book came to
00:33:10.940 be. So I had written, I had been using, and I've coined that term and I've been using it on all my
00:33:16.400 social media. And at one point I wrote a long, uh, post where I demonstrated the various public
00:33:23.700 policy decisions that are exemplars of suicidal empathy. And then I get an email from one of the
00:33:32.360 really big publishers, the executive editor. And he basically just said, he, he, he puts my tweet
00:33:38.820 and he goes, looks like we found your next book or something like that. And so that's how it started.
00:33:45.600 Uh, and so we, we first, uh, communicated last April and I've been feverishly working on it since
00:33:53.380 because, uh, you know, in a sense, I'm having a hard time putting an upper bound on how
00:34:01.900 long the book's going to be because every day I get sent 9,000 new cases of suicidal empathy.
00:34:09.420 So at some point I'm going to have to apply a stopping rule and say, okay, no more. The book
00:34:13.440 can't be more than 43,000 pages long. Um, no, I don't remember when the first time you used it was
00:34:20.320 that I heard it, but I, I, I loved it and have been using it ever since. And it's an amazing idea.
00:34:25.440 The idea is there in my mind, if you have a better example, you'll let me know that New York lady, 1.00
00:34:30.920 the activist whose boyfriend gets stabbed to death on the streets at four in the morning in New York 0.95
00:34:36.800 by a black guy who's having a mental health crisis. And she literally shows more concern for the dude
00:34:42.480 who just stabbed her boyfriend to death than for her boyfriend who is lying there bleeding to death 0.95
00:34:47.300 on the ground because she wants to be understanding and sympathetic to the violent, mentally ill murderer 0.97
00:34:52.340 who just killed her boyfriend. And like, this is insanity. This is activism gone to the point of 0.99
00:34:57.160 where you must kill me in order for me to show you how virtuous I am. Exactly. Uh, so that is a great 0.82
00:35:03.040 example. I've got a million of those in the book. I'll just give you one or two others that are well in
00:35:08.720 line with the one that you just gave. Uh, in 2013, a Norwegian man was sodomized, raped by a Somali 0.72
00:35:20.240 migrant. The guy was caught. He served a very minimal prison sentence because in Norway, it's all 1.00
00:35:29.460 about, you know, doing ceramics classes. It's not nice to punish people. And then when he finished his
00:35:36.740 sentence, he was going to be deported back to Somalia. The guy who was raped by him went public
00:35:45.520 saying how awful he felt because now the Somali sodomizer was going to have a much less promising 1.00
00:35:55.340 future in Somalia than he did, than he would have in Norway. Because if I am sodomized by a guy, the thing 0.93
00:36:04.280 that I'm most concerned about is to ensure that he has a enriching future in the country where he sodomized 0.94
00:36:12.720 me. Now I can, can I offer the psychological, the, the, the framework for why the suicidal empathy
00:36:20.600 happens? Oh, please. Let me just ask you one question about that story. Are we sure that the
00:36:25.740 man was not wrongly convicted and that they were involved in an amorous relationship? Because
00:36:29.200 that, that, and I'm not even trying to be funny. It just, it sounds, there's no world in which one
00:36:34.780 would fear that their rapist, uh, being deported to their country of origin who raped them, uh, would be 0.93
00:36:40.600 treated badly after, after rape and let alone that type of rape. So we know that it is a rape. And we 0.99
00:36:46.660 also know that the guy who was raped, uh, I mean, presents himself to the world as a staunch feminist
00:36:54.880 and anti-racism ally. Uh, and so, so no, we, we know exactly that he holds those beliefs. Let me, let me
00:37:02.940 offer, I mean, I, I want people to go out and buy the book. So don't think that by me offering you the
00:37:07.940 theoretical framework, you've gotten the full cow. You're doing the audio book this time,
00:37:13.020 correct? Gad? Oh, it's so funny. You said this. I was just, I was just, uh, going through, uh,
00:37:19.380 some details of the contract. It's taken way longer for me to sign the contract. They, they approached
00:37:24.840 me, as I said, I think it was like last April. And one of the clauses that I put, I mean, you're the
00:37:30.360 lawyer, but you appreciate what happens when you're looking through a contract and you're put, I said,
00:37:34.520 uh, you know, hey guys, the main criticism I received for my last two books, which is a pretty
00:37:44.200 good thing if that's the only recurring criticism you're getting, is that people were super
00:37:48.840 pissed off that The Parasitic Mind and The Saad Truth About Happiness were not narrated in my own
00:37:54.040 voice. Can we at least put a clause that that is an option? And so I just got back the
00:38:01.200 manuscript and they added that in. It should be obligatory because you have a, it's not that
00:38:06.160 you have a good voice, you have a good voice as well, but a distinctive good voice. It's sort of
00:38:09.360 like, you know, listening to Alex Jones's book. I don't think it was someone who had a sufficiently
00:38:13.480 similar voice that it wasn't shocking or glaring, but you, you listened to your book was, nobody has
00:38:18.140 a voice like you. So, sorry, good, good clause, good addition. Um, and now what you were just about
00:38:23.060 to say, I was going to tell you about a summary of the theoretical framework
00:38:28.000 of what, what drives suicidal empathy. So let me step back a second and offer an analogy.
00:38:34.960 Take for example, by the way, when you started, you said he's not a clinician in a technical sense.
00:38:39.920 That's true, but the moniker I'm now known by is the global therapist to the world. So while I don't
00:38:47.080 hold a clinical license, effectively speaking, I am the head parasitologist of the world because I'm
00:38:53.680 trying to resolve a psychiatric malady at the group level. What, what, what more can you want as a
00:39:01.120 clinician? But in any case, so joking aside, uh, OCD, obsessive compulsive disorder is a condition,
00:39:09.980 but by the way, I've written several, uh, academic papers on various psychiatric afflictions from an
00:39:15.580 evolutionary perspective. So I've done, uh, Munchhausen syndrome by proxy. That's where I then develop
00:39:22.240 the theory in parasitic mind for the malady of collective Munchhausen and collective Munchhausen
00:39:27.980 by proxy and transgenderism, uh, Munchhausen by proxy and so on. And actually, let me stop you on
00:39:34.120 that. That wasn't very important for people to appreciate that. The idea of this whole trans thing 1.00
00:39:37.900 that parents impose on their children is Munchhausen by proxy. Munchhausen syndrome by proxy is when
00:39:43.200 a mother who wants the attention, the social adulation, would make her kids sick. And it was like, 1.00
00:39:47.020 Oh, poor baby, you know, we'll get, you'll get all that social credit score and whatever, 0.87
00:39:50.440 because your kid is sick. And so when it's real, people feel it. And when it's, when they don't,
00:39:55.080 when it's not real, they get their kids sick so they can then feel it. Apply that, mutatis 0.73
00:39:59.180 mutandis, to parents who say, Oh, my four-year-old is trans and I'm empowering his existence. And
00:40:03.500 everyone's like, Oh, good for you. Take to social media and put out videos. The, the, the transgender 1.00
00:40:07.400 Munchhausen syndrome by proxy is an amazing, uh, insight connection to make. Sorry. I want to
00:40:13.060 highlight that. Yeah. Thank you. I appreciate that. Uh, if anybody who's interested in the
00:40:17.140 original scientific paper that I wrote, it's in, I think, 2010, in Medical Hypotheses,
00:40:23.340 it's a medical journal. So I've also written paper. I'm only saying this, not, not to give
00:40:27.980 you my CV, but because it's relevant to my OCD explanation. I've also written, uh, papers,
00:40:33.340 psychiatric papers on an evolutionary explanation of suicide, which you might imagine is a difficult
00:40:39.620 concept to explain evolutionarily speaking, you're ending your life. And I've also written an
00:40:44.920 evolutionary-based paper on sex differences in the symptomatology of OCD. In other words,
00:40:52.660 men and women are likely to exhibit sex differences in terms of the O's and the C's that they succumb
00:41:00.740 to. And I argue that there's an evolutionary reason for that. Okay. Having said all that,
00:41:04.940 so that gives you a sense of where my scientific, if you like credentials come from to then be able
00:41:12.600 to diagnose this. Okay. So OCD actually has an evolutionary explanation and here, here, here's
00:41:20.640 how it goes. And then I'll link it to suicidal empathy. The idea that we should scan the world
00:41:26.180 for environmental threats makes perfect evolutionary sense. So for example, if you and I are going out
00:41:32.280 for a pizza and I noticed that your nose is running and then you sneeze into your hand and you shake my
00:41:37.980 hand, then it makes perfect evolutionary sense that I would have the reflex to say, Hey, I'm going to go
00:41:42.980 wash my hands before we have the pizza, because it makes sense for me to have that repulsion at the
00:41:49.720 possibility of having germ contamination. If I go to the back door and check that it is locked before we
00:41:56.140 all turn in to bed, that makes perfect evolutionary sense. The problem with OCD arises when that
00:42:03.680 adaptive mechanism misfires by being hyperactive, right? So the, the, the flag usually goes up.
00:42:11.620 I tend to it. The flag goes down and then I go on with my day. But if I spend eight hours in an
00:42:18.060 infinite loop, washing my hands in scalding hot water so that I don't make it to work and my skin is
00:42:24.520 falling off, then that becomes a dysfunction. It's a dysregulation of an otherwise adaptive process.
00:42:31.080 So having written a lot about this dysregulation, evolutionary dysregulation in my scientific work,
00:42:39.240 that's how I had the insight. Aha, that's what suicidal empathy is. Empathy, when it is directed at the
00:42:54.060 right people and in the right amounts makes perfect evolutionary sense because we are a
00:42:54.060 social species. We have evolved certain positive emotions that allow us to, if you like, lubricate
00:43:01.760 our social bonds. It makes sense for me to have theory of mind, to put myself in your shoes when
00:43:08.100 you are feeling pain. That's a good empathetic reflex for me to have. The problem arises when empathy is
00:43:15.680 dysregulated. And in this case, I mean, there are several ways that it misfires. One of which
00:43:22.000 is it misfires to the wrong target. So being empathetic to my Salvadorian neighbors down south 0.91
00:43:32.960 because they have a right to live the American dream, so much so that it supersedes the empathy 0.99
00:43:39.980 that I should have for American veterans who lost their limbs defending my right to be an asshole 1.00
00:43:47.360 doesn't make sense. That's dysregulated empathy. So what I do in the book is I take many, many public 0.99
00:43:53.620 policy decisions that have resulted in the full decay of the West, and I argue that at the root of
00:44:00.040 each of these public policy mistakes is suicidal empathy. God damn, that's going to be an international 0.98
00:44:06.380 bestseller. Again, as you say it, and I don't know if this is going to be already in your thought process, 0.99
00:44:11.260 but not speaking from any personal experience with OCD, that you have, I've always found like
00:44:16.860 there's ways in which it manifests. And one is normal fear, like you say, like I was biking
00:44:22.400 yesterday and I forgot it was daylight savings. And so biking at six is a lot darker than biking at
00:44:26.880 seven. And I'm on a path, I'm not worried about people now. I'm worried about gators and panthers
00:44:30.620 at sunset. And then there's, so you have the rational fear, then you have the irrational fear applied to
00:44:37.000 the rational fear, which is just an over-exaggeration, over-emphasis of it. You have the misfiring of it
00:44:43.220 being on things that are not existential threats or the over-application of it from like, wash your
00:44:48.040 hands. But then if you also approach someone sneezing across the street as an existential threat,
00:44:52.940 you're over-adapting, so to speak. Or you're wearing a mask alone in your car in 2024.
00:45:00.740 It is, it's so beautiful an analogy. Hold on, I was taking notes in the email. No, that was it.
00:45:06.600 The over-emphasis on actual threats or applying it to non-threats and the suicidal empathy. Have you
00:45:12.740 thought about this yet at some point? And I'm not trying to be funny because I think it's almost
00:45:16.500 scary. The suicidal empathy goes from suicidal empathy to homicidal empathy, that they will
00:45:22.040 actually start killing you for your own good? Well, okay. So, and we can answer that either
00:45:26.960 literally or figuratively. Let's go figuratively, then literally. Right. Figuratively, it already
00:45:34.000 happens all the time, right? I'm going to kill your career. I'm going to kill your reputation. 0.99
00:45:41.280 I'm going to kill your prospects of ever being invited to the cool kids party. So, the homicide 0.99
00:45:47.280 has already happened. It hasn't happened though, literally. Now, in other countries, I could
00:45:54.600 literally kill you because I have the power. Oh, are you willing to accept Islam? We're giving you 1.00
00:46:01.880 the choice. So, you're freely able to decide no. Now, of course, because there are consequences
00:46:08.280 in life, if you make the wrong choice by not accepting Islam, we're going to have to mercifully 1.00
00:46:14.780 detach your head from the rest of your body. But that's done after we've given you the courtesy
00:46:20.340 of choosing for yourself. So, I don't think we're at the point today in the West where you can
00:46:26.300 orgiastically, literally engage in the homicide that you're talking about. But boy, can we certainly
00:46:31.860 do it figuratively in very painful ways. And I don't know if you know of any historical
00:46:36.800 analog. What is the social, the threshold, like the percentage of social acceptance threshold?
00:46:42.100 When did it become, you know, when 5%, let's just go to the easy example that no one takes
00:46:46.640 offense with, which is Nazi Germany. 5% of Germany is Nazis. And they say, we've got these crazy ideas.
00:46:51.640 And it was like, okay, well, too bad. We're not buying into it. It's offensive. There's a threshold
00:46:56.660 at which not only do people start to give it some sort of credibility, but also at which they start
00:47:02.460 to become fearful of it, where it starts taking hold. Like, what percentage do you actually...
00:47:07.200 That's actually, it's even, I mean, I'm not trying to sound patronizing, but it's an even more
00:47:12.940 intelligent question than you realize in stating it, even though... Exactly. That's how smart you are.
00:47:21.640 That's what Laval University-trained lawyers... Well, no, in my philosophy degree,
00:47:28.780 my thesis was called deontological consequentialism. What?
00:47:32.640 Again, I'm trying to find... I knew that there was somewhat of a semblance of a brain in you.
00:47:37.640 I did four years of philosophy, honors degree at McGill. Then I went to Laval for my law degree.
00:47:42.860 And I did one year exchange in Paris, which is why I did an extra year of McGill, because I wanted to
00:47:47.160 get my honors degree, which required a cumulative GPA of above 3.0. But my year in Paris was pass-fail,
00:47:53.360 so it didn't count. So I did another year of philosophy. And I'm still... I'm trying to track
00:47:57.160 down my thesis, because when you talk about consequentialism versus deontology, Kantian
00:48:02.780 categorical imperatives. And I sit there saying, whenever I hear someone say, it's okay to kill
00:48:08.140 a baby if it means saving 10. And I'm saying, my reaction was always, it's never okay. What you're
00:48:13.380 basically just saying is, I want to minimize the evil, because I think that the evil to be minimized
00:48:17.440 is the killing. So I'm going to kill one to prevent somebody else from killing 10. And my theory has 0.99
00:48:21.440 always been, you haven't minimized anything. You've actually just maximized the aggregate evil,
00:48:25.460 because the other evil still exists. But no, the question is like, yeah, socially, what is this?
00:48:31.460 Yeah. Let me answer that in a technical way. I mean, vulgarize to the masses. By the way,
00:48:37.040 a lot of people don't understand the word to vulgarize, which in French, as you know, is...
00:48:44.900 Vulgariser.
00:48:46.100 Vulgariser. And the reason why it came up to my... Well, first, I'm a lover of words. And I remember,
00:48:53.680 I think it was my 2011 book, The Consuming Instinct, this, the red book here, where is it? No.
00:48:59.620 Hold on, let me zoom out, and then I'll see it. Oh, yeah, right over your shoulder here.
00:49:02.860 Yeah, on the... This, not next to the... No, the other way, this one.
00:49:06.980 Yeah, that's a water. No, that's your microphone. I'm joking. I think it might be covered up by your
00:49:11.520 mic. Okay, it doesn't matter. It's the red one. It's called The Consuming Instinct,
00:49:15.740 What Juicy Burgers, Ferraris, Pornography, and Gift Giving Reveal About Human Nature. It was a book,
00:49:22.400 it was a trade book meant to demonstrate how you could apply evolutionary psychology and evolutionary
00:49:27.620 biology to study our consuming instinct. What does a trade book mean?
00:49:30.960 Oh, that's a great question. So there are different classes of books. So let's take the
00:49:38.440 one that most of your viewers and listeners would know, a textbook. A textbook is a book that's
00:49:44.640 written as a pedagogic exercise to be sold to university students. So let's say you were taking
00:49:53.340 a course in Philosophy 101. I'll write a textbook that hopefully professors that are teaching this
00:50:00.640 course will adopt throughout Canada and North America. So that's called a textbook. An academic
00:50:05.880 book, which is very different from a textbook, is a very technical book that's written for other
00:50:14.220 academics and graduate students and so on. It's a scientific, so it's not meant for Philosophy 101.
00:50:20.760 One, it's meant for practicing philosophers, okay? Now, academic books, if they sell a thousand copies,
00:50:29.980 that would be considered a hugely successful book because they have very, very limited market. It'll be
00:50:36.000 university libraries that will purchase them, you know, specialists in the field and so on. So my first
00:50:43.620 couple of books were academic books because I was trying to lay my flag on the, on, you know, having
00:50:50.120 pioneered the field of evolutionary consumer psychology and so on. Okay. So this, which book is
00:50:55.200 it? I think it's, if it's the, okay, there you go. This book right here, that's called The Evolutionary
00:51:03.540 Bases of Consumption. That was my first book and it's very, very technical. I mean, it could still be read
00:51:10.700 by non-professionals, but it's a very scientific academic book. Okay. So that's the evolutionary
00:51:16.160 basis of consumption. Then here I have an, oh, no, this way. Here I've got an edited book, which I'll
00:51:25.320 also explain what that is. That's called evolutionary psychology in the business sciences. It's an academic
00:51:29.860 book, but it's edited in that different chapters are written by different specialists and I serve as the
00:51:39.120 editor of the entire compendium. And I usually will write an intro and an opening chapter and so on.
00:51:44.740 So these two books are academic books. Okay. So they will have a much smaller distribution, even though,
00:51:53.280 I mean, these were actually very successful by any standard. Okay. Then the next one, this one,
00:52:00.560 it's, so to answer your question, this, this one was my first trade book, the consuming instinct.
00:52:06.700 Now what I was trying to do there. So to answer your question, what is a trade book? A trade book
00:52:12.320 is meant for the masses. It's meant to be read by as many people as possible. It's what you will see
00:52:19.800 at Barnes and Noble and at Indigo and so on. It's not for the specialist. It's not for students. It's for
00:52:26.480 the general reader. And so as you might imagine, many professors can be very successful as academic
00:52:34.640 writers, but dreadfully bad as trade authors. Why? Because they don't know how to speak in a voice
00:52:43.980 that would be appealing to the general masses. Okay. So now coming back to, thank you for asking
00:52:50.480 that. I think this is the first time that I've ever on a show explained the difference between
00:52:55.020 different types of books. So trade.
00:52:56.820 Well, and I love the fact that you describe it as appealing to the masses, as opposed to what I
00:53:00.360 suspect most would describe it as, as dumbing it down for the masses. Exactly. Yes. Don't,
00:53:05.820 don't talk down to people. You know, I recently had a chat with Rob Schneider, the actor, who's just
00:53:11.020 delightful. He's, he's amazing. He's amazing. I mean, just like you want to hug this guy. Okay.
00:53:15.860 And to, I think it was maybe towards the end of the show, he said some really sweet things to me.
00:53:21.080 He goes, you know, there's a lot of us in Hollywood, whatever, that are huge fans of yours.
00:53:25.660 And what one always walks away with when they hear you speak or read your stuff is, you're never talking
00:53:31.520 down to us. And that really touched me because I think that that's one of the, I know it's gauche
00:53:38.340 to speak about oneself, but one of the things that I most appreciate about my engagement with the
00:53:43.060 public is that I really take pleasure in having the corrections officer and the trucker write me a
00:53:50.180 fan email. Because if my voice is resonating with him, then I'm doing something
00:53:56.420 right. It, it, it's a no brainer that the Stanford professor might write to me and say, Hey, I love
00:54:00.880 your last paper on whatever. That's my job. That's what I do. But the fact that the trucker says, Oh my
00:54:06.480 God, I was listening to you and Viva. And I decided that I'm going to go back to school because you,
00:54:10.860 you motivated me to study psychology. Well, now I think I've done a really good thing.
00:54:15.580 And you, when you say it out loud like that, it's the exact problem with not with academia in and of
00:54:20.840 itself, but with experts in general is people generally want to make their own decisions.
00:54:24.980 So give me the info and then let me decide. Whereas most look down and say, no, I'm going to
00:54:29.160 tell you what to do. And don't even ask me when it came to like COVID, for example, was the prime
00:54:33.840 example. We're not explaining it to you. You're not able to understand and shut up and do what we say
00:54:38.520 versus here's here are the pros and cons, the risks, and you make your own decision. It's, it's a form of
00:54:43.160 control and not a form of education. Exactly. Right. Uh, so coming back to the term vulgarization.
00:54:49.860 So when I was, uh, uh, communicating with my editor for the consuming instinct, which was going to be
00:54:57.880 my first trade book, I use the term, I think it was maybe during when we were coming up with sort of
00:55:04.940 the taglines for my media, whatever I said, well, you know, the book is an attempt to vulgarize
00:55:10.920 evolutionary psychology. And she said, Oh, don't, don't, don't use the word vulgarize with the
00:55:15.020 American audience. Why not? It's a beautiful word. It's a nice one. She goes, no, say popularize
00:55:20.440 and so on. And that always, uh, was annoying to me because it's, it's exactly what the word is meant
00:55:27.280 to say, which is to make something accessible to the masses. Anyways, I don't know why I was,
00:55:33.300 why I said that. What was the, uh, trade book you were talking about? Your first
00:55:37.000 trade book, and I think you were getting into the latest trade book. So the first trade book
00:55:45.920 was The Consuming Instinct. The next trade book was The Parasitic Mind. The next trade book was
00:55:54.460 The Saad Truth About Happiness. Yeah. That's what that one is right over your shoulder there. If we
00:55:58.340 can see that right there. And then the next one is Suicidal Empathy. Uh, now to our earlier point
00:56:06.260 about, you know, don't tell people what to do when you, you said that a minute or two ago,
00:56:11.100 actually, that was one of my biggest challenges when I was working on the happiness book. And let
00:56:18.660 me explain, I mean, that, that story in of itself is, is really fascinating. So in, when you study
00:56:24.320 decision-making, there are different ways that you could study psychology of decision-making. You could
00:56:30.000 study it from a normative perspective, and bear with me, I know it's technical. Is it okay if we get a tiny
00:56:35.880 bit technical? No, please go. So normative decision-making is, for example, the classical
00:56:45.240 school of economics, say at University of Chicago, where many of the Nobel Prizes come from, they
00:56:50.840 adhere to a view of decision-making that's called homo economicus. Homo economicus is this ultra rational
00:56:59.200 being that adheres to axioms of rational choice. Example, if I prefer car A to car B, and I prefer
00:57:07.580 car B to car C, it must be that I prefer car A to car C. It's called the transitivity
00:57:15.800 axiom. It's transitive. If A is bigger than B and B is bigger than C, then A must be greater than C.
00:57:21.000 So they take those axioms of rational choice, and they say that anyone who violates those is
00:57:27.540 behaving irrationally. Now that's called normative decision-making because you're saying that you
00:57:33.240 have to adhere to a norm, a norm of rationality. Holy shit, I'm doing a full academic lecture here.
00:57:40.040 I should be charging all the assholes watching. I'm writing down questions.
00:57:43.580 Exactly. Yeah. And to everybody who's watching, yes, this will be on the final exam.
00:57:49.940 All right, let's go on. So that's normative decision-making.
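[Editor's note: the transitivity axiom described above can be checked mechanically. A minimal Python sketch; the function name and example preferences are illustrative, not from the episode.]

```python
from itertools import permutations

def is_transitive(preferences):
    """Return True if the pairwise preference relation satisfies the
    transitivity axiom: whenever A > B and B > C, also A > C."""
    prefers = set(preferences)
    items = {x for pair in prefers for x in pair}
    # Scan every ordered triple for a violation of the axiom.
    for a, b, c in permutations(items, 3):
        if (a, b) in prefers and (b, c) in prefers and (a, c) not in prefers:
            return False
    return True

# Homo economicus: A > B, B > C, and A > C -- no violation.
print(is_transitive([("A", "B"), ("B", "C"), ("A", "C")]))  # True
# A preference cycle (A > B, B > C, yet C > A) violates the axiom.
print(is_transitive([("A", "B"), ("B", "C"), ("C", "A")]))  # False
```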
00:57:55.060 Prescriptive decision-making is studying decision-making to solve optimization problems.
00:58:03.900 And hence, you're prescribing an optimal path of decision-making. This is actually like, I literally
00:58:10.020 have thought about creating these lectures and charging people money. And one of the-
00:58:14.740 Yeah, they're masterclasses. There's no reason why you shouldn't be doing it.
00:58:17.460 I know, I know. I'm such an idiot.
00:58:18.960 Well, there's only so much time in the day, but-
00:58:21.440 Exactly. But, by the way, one of the bifurcations in the road that I'm currently facing is,
00:58:29.320 I am an academic through and through. It's in my DNA. But if I now have to, you know,
00:58:35.580 poll people about their gender pronouns in the morning, which of course I don't do,
00:58:39.160 there's an element of academia that's becoming very, very costly to me, so that the thought
00:58:44.920 has crossed my mind, is it time for me to step aside? Now, if that were to happen, then it would
00:58:50.240 allow me to set up those masterclasses. And you've been one of the ones who's been most
00:58:54.140 strongly and feverishly telling me, sign up for this, set up, paywall. You're probably the one
00:59:00.820 who's most been doing that. And to my great shame, because I'm a moron, I still haven't done a lot of
00:59:06.620 that stuff. Well, I mean, not to say it hasn't gotten bad enough for you in- I don't want to
00:59:10.860 besmirch where you work, because I don't want to get you into trouble through association, but
00:59:14.240 not to say it hasn't gotten bad enough, because I think it's gotten way worse than anybody even
00:59:18.060 knows as to how bad it is where you can't safely go on the campus where you teach, and you have to
00:59:22.280 teach remotely, and you have to worry about security. I think it's gotten bad enough where
00:59:25.620 you will see the writing on the wall, but it takes some time to catch up.
00:59:31.080 And that's one of the reasons why it was a godsend, frankly, that I got this position at
00:59:35.300 Northwood University. They're very much into freedom of inquiry, freedom of speech, free
00:59:39.780 enterprise, economic freedom. And so their president reached out to me and said, look, we're big fans of
00:59:44.400 yours. We're ready to make you a permanent offer. I was a bit tentative about that only because at the
00:59:49.460 time I was in California with my family for five weeks. So I wasn't ready to change my life without at
00:59:55.520 least trying it. So we agreed to do a one-year leave. My university was kind enough
01:00:01.140 to grant me the one-year leave. And so, at least for the next year, yeah. I'll say
01:00:07.020 also what I think: they want you out. And if they can get you out without a fight, hey Gad, why not take
01:00:11.880 a vacation? Yeah, you can come back to your job. As I said that, kindly, I also had a little smirk
01:00:18.520 because I realized, boy, why is Gad Saad being so diplomatic? But yes, there you go. That's my
01:00:25.180 if anybody doesn't appreciate the dynamic. Gad takes a lot of flak for his writings and
01:00:30.460 statements on Islam. He teaches at Concordia University, which has historically been one
01:00:34.400 of the most radical universities. Back when I was there in 2000, they
01:00:39.900 were protesting. Yeah. I can only imagine. Hold on. This dog's got to get out of here. You
01:00:44.700 know, go, go, go. Oh, yeah. Speaking of, uh, evolutionary biology, how many paralyzed dogs would have
01:00:53.940 survived in nature? I look at that dog every day. It's like, my goodness. Oh, if
01:00:58.620 only you struggle. There is though, an evolutionary explanation for why we are so bonded to our
01:01:04.780 pets. And I actually discussed it briefly in the consuming instinct in the, in the red
01:01:08.880 book that I showed up. I have a theory that they're wonderful. They keep us, they keep
01:01:12.180 us, you know, company. And if things get really bad, you can kill them and eat them. So it's
01:01:16.020 bada bing, bada boom. I'm joking, everybody. Yeah. As you mentioned something: A is greater than B,
01:01:19.940 B is greater than C,
01:01:25.240 therefore A is greater than C. That works on an objective level where
01:01:29.760 that's Logic 101. I've said everybody should always take a Philosophy 101
01:01:33.640 logic course because they teach you that. Then I realized, as you say it, it doesn't apply to things that
01:01:37.220 are subjective. Like, I like Billy Madison better than Happy Gilmore. Uh, Dumb and Dumber
01:01:41.520 is better than Billy Madison, but I don't necessarily like Dumb and Dumber more than Happy Gilmore.
01:01:45.060 Okay. Well, you just described, uh, a Nobel Prize-winning study. So, so let me, so, okay. So hold
01:01:53.420 on. So, uh, I just described normative decision-making and I was on my way to describing prescriptive,
01:01:59.300 but let's pause prescriptive for a second. So I can answer what you just said. So the two,
01:02:03.980 so my, my doctoral training, my, I mean, my official PhD is in psychology of decision-making
01:02:10.120 specifically. My doctoral dissertation was on looking at the following problem.
01:02:15.340 When is it that we have acquired enough information about competing alternatives for us to stop
01:02:21.540 acquiring additional information and make a choice? So for example, if I'm choosing between two cars,
01:02:26.720 I could look up 50 attributes on the two cars, but most of us don't
01:02:31.680 do that. Instead, we look at enough information that at some point my brain says, I've now seen enough
01:02:38.940 to buy car A. So what I was trying to do in my, uh, doctoral dissertation is study the stopping
01:02:45.980 decisions of information search, which could then be applied to anything. So when people write to me
01:02:50.540 and say, Oh, but you know, you study marketing. I mean, there's almost never a mention of the word
01:02:54.660 marketing in my doctoral dissertation. It just so happened that I can then apply it to consumer
01:03:00.500 behavior, hence marketing. But the fundamental problem that I was
01:03:05.520 studying applies to mate choice. It applies to employee selection. It applies to, should I stay
01:03:10.840 in a marriage or leave the marriage? So it applies to any process that involves the iterative acquisition
01:03:16.460 of information. And when do I stop? Are you with me? Holy shit. This is a good conversation.
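[Editor's note: the stopping problem Gad describes, deciding when enough attribute information has been acquired to choose, can be sketched as a simple threshold rule. This is a toy illustration, not his actual dissertation model; the threshold and scoring scheme are assumptions.]

```python
def choose_with_stopping(attrs_a, attrs_b, threshold=3):
    """Toy stopping rule for information search: inspect attributes one at a
    time, track which option wins each comparison, and stop as soon as one
    option's lead reaches `threshold` -- usually long before every available
    attribute has been examined."""
    lead = 0
    seen = 0
    for a, b in zip(attrs_a, attrs_b):
        seen += 1
        lead += (a > b) - (a < b)  # +1 if A wins the attribute, -1 if B wins
        if abs(lead) >= threshold:
            break
    return ("A" if lead >= 0 else "B"), seen

# Car A wins the first three attributes outright, so the search stops
# after 3 of the 6 available attribute comparisons.
choice, looked_at = choose_with_stopping([9, 8, 7, 5, 5, 5], [1, 2, 3, 9, 9, 9])
print(choice, looked_at)  # A 3
```

The same skeleton applies to any iterative information-acquisition problem, which is the point made above about mate choice, employee selection, and staying in or leaving a marriage.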
01:03:22.540 And you know what?
01:03:23.580 Speaking of killing the dog, now the dog wants to come back in the room. Hold on.
01:03:26.800 From an evolutionary perspective, I'm probably the stupid one for just not leaving the door
01:03:35.020 open at a given point in time. I'm leaving the door open now, people. I know my wife and
01:03:38.060 kid are not here, so I'll have some quiet. Okay. All right. Let's go on. So, uh, so my doctoral
01:03:44.380 work was in the area of behavioral decision-making, and the gurus of that field are two, uh, Jews.
01:03:54.620 Oh, they're Jews? Jews. Uh, Amos Tversky and Daniel Kahneman, two Israeli psychologists who then,
01:04:05.680 you know, moved to North America. They were lifelong friends and colleagues. And what they did, to your
01:04:11.900 point about, I prefer Gilmore, but whatever violation you said, the first paper that Tversky
01:04:18.440 published in 1969 was with my doctoral supervisor, Jay Russo, uh, who did his PhD in psychology and
01:04:27.980 cognitive psychology at University of Michigan. Uh, now Tversky and Kahneman demonstrated that many
01:04:34.740 of the axioms of rational choice that classical economists tell us we should adhere to, we don't.
01:04:41.560 And so they won the Nobel prize in economics for demonstrating that economists are full of shit.
01:04:49.480 Okay. And that idea being, they've already made up their mind as to what they want to get. And so
01:04:53.340 they will fit into the preferences, the way to get there. No, no. It's that the, the, the economist
01:05:00.140 has a very restrained axiomatic view of how human cognition should operate. It should be hypercomputational,
01:05:10.260 hyper-rational, but human minds don't work that way. They don't adhere to this orgiastic form of
01:05:18.820 hyper-rationality. So what Kahneman and Tversky did is go through all of those axioms of rationality
01:05:26.740 and demonstrate that we violate them all the time. And so they designed many astoundingly clever
01:05:32.840 experiments to show to your Gilmore story that we don't do all the things that these economists tell
01:05:40.100 us in La La Land that we should be doing. Okay. Now I say they won the Nobel prize. That's technically
01:05:45.360 incorrect because Kahneman won it for work that he did with Tversky. Tversky didn't win it because he
01:05:53.020 had unfortunately passed away and you can't win it posthumously. So officially it's only Kahneman who
01:05:59.520 won it, but it's really Kahneman and Tversky. Okay. Now, so that, that's the normative decision-making.
01:06:06.460 Okay. Prescriptive decision-making, as I mentioned earlier, is prescribing how you ought to optimize
01:06:15.040 something. And I know that sounds fancy, so let me give you a concrete example. Take, for example,
01:06:20.460 a classic problem known as the traveling salesman problem. You have a salesman who has to visit 10
01:06:26.800 cities. He's going to start in city A. He has to return to city A and he has to visit each city once
01:06:35.460 in what order should the path be so as to minimize the cost of travel? That turns out to be an easy
01:06:45.380 problem. If you've only got three cities, you could manually try all three and then calculate
01:06:50.260 which is the optimal one. When I give you 13 cities, you're going to be spending the next
01:06:56.440 4 billion years calculating all the permutations and combinations. So the field,
01:07:02.520 the mathematics field that solves those problems algorithmically is called operations research.
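[Editor's note: the traveling salesman example can be made concrete with a brute-force sketch. The distance matrix below is made up for illustration; the factorial line shows why 13 cities already overwhelm this naive approach.]

```python
from itertools import permutations
import math

def tour_cost(order, dist):
    """Cost of a tour that starts at city 0, visits `order`, and returns to 0."""
    path = (0, *order, 0)
    return sum(dist[a][b] for a, b in zip(path, path[1:]))

def best_tour(dist):
    """Brute-force TSP: try every ordering of the remaining cities, keep the cheapest."""
    n = len(dist)
    return min(permutations(range(1, n)), key=lambda order: tour_cost(order, dist))

# Four cities with a made-up symmetric distance matrix.
dist = [
    [0, 10, 15, 20],
    [10, 0, 35, 25],
    [15, 35, 0, 30],
    [20, 25, 30, 0],
]
tour = best_tour(dist)
print(tour, tour_cost(tour, dist))  # (1, 3, 2) 80

# Why this explodes: a tour of n cities means (n-1)! orderings to check.
print(math.factorial(12))  # 13 cities -> 479001600 orderings
```

Operations research attacks this growth with algorithms (branch-and-bound, heuristics) rather than exhaustive enumeration.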
01:07:11.400 And so in my, in one of my, I have two masters. In my first masters, I did a mini thesis in operations
01:07:18.440 research, which is this optimization field. Now, why is it called prescriptive decision-making?
01:07:22.940 Because you're trying to prescribe some optimal behavior. If you wish to reduce the travel costs,
01:07:31.820 this is the order in which you should visit those cities. Are you with me?
01:07:35.900 Yep.
01:07:36.340 So first approach to decision-making normative, second approach, prescriptive, third approach,
01:07:43.920 descriptive. Descriptive simply says, describe how people actually make decisions. I'm not trying to
01:07:51.100 find some normative way of behaving. I'm not trying to prescribe some optimal way of behaving. When
01:07:58.360 you decide you want to buy a car, let's study the cognitive processes by which you arrive at choosing
01:08:04.120 the car or marrying that girl or voting for Trump or Kamala and so on. And so for much of my career
01:08:10.860 as a psychologist, as a behavioral scientist, I existed in descriptive world. I'm trying to develop
01:08:19.020 theories that describe actual decision-making behavior. Okay. Why am I talking about all this?
01:08:26.120 Because I'm going to link it back now to the challenge that I had when I was writing the happiness book.
01:08:31.720 Now, when you're writing a happiness book, by definition, this is going to involve some
01:08:38.140 prescription. Here is what you ought to do to lead a happy life. And I had always been very
01:08:45.040 epistemologically suspicious of the people who stand as gurus and lecture to the rest of us what
01:08:53.320 to do. And being the very, very careful professor and scientist that I am, I wanted to make sure to
01:09:01.920 tell my readers that I am not offering you the incontrovertible recipe for happiness. Rather,
01:09:10.520 spoken like a real academic, life is a game of managing statistical probabilities, I'm going to give
01:09:21.220 you some prescriptions, which if you apply them, increase the probability of you summiting Mount
01:09:30.380 Happiness. So look at the difference between how carefully I just positioned that book to how,
01:09:38.480 to your point, everybody who was much smarter than you told you, shut up, you don't walk your dog
01:09:44.680 out after eight o'clock, because that's settled science. Shut up, pleb. So those were exactly
01:09:51.180 who the fascists were. And it turns out that they weren't part of the Republicans. Many of them are
01:09:56.600 super progressive professors, leftist professors. The correlation between progressivism and
01:10:03.820 fascism or tyranny or at least, I don't know what the proper word is for it, but like
01:10:07.240 Karenism, like progressivism and Karenism, people who just tell you what to do and shame you when
01:10:14.400 you disagree with them. I find it's almost a direct overlay. It's a one circle Venn diagram,
01:10:20.520 progressives and Karenists who... By the way, I have a student who regrettably has disappeared. Often
01:10:28.820 what happens when you have MSc students who are doing theses with you or PhD students,
01:10:33.980 it's the first time in their lives where they're now tasked with creating knowledge rather than
01:10:41.320 just absorbing knowledge from a classroom. And so oftentimes, regrettably, even though students
01:10:48.060 might be very bright, they get sucked into a black hole that they never resurface from. And so I've had
01:10:53.740 very, very bright students who wanted to work with me under my tutelage that never ended up finishing
01:10:59.440 their degrees. And I'm not the one who will call you every day, say, what are you doing today? I mean,
01:11:04.260 you're an adult. You also have to have the personal agency to get your work done. In any case,
01:11:11.360 that one particular student, I had proposed the following thesis topic to him, and he was very keen to
01:11:18.660 do it. It very much related to your working hypothesis, which is, for example, can we identify
01:11:27.800 morphological signatures of people's political orientations? So this is not quite Karenism,
01:11:35.860 although it is, in a sense, because there is a morphological exemplar of the woman that looks like
01:11:43.220 a Karen. You follow what I'm saying? Oh, yeah. I'm just, I'm just trademarking Karenism right now
01:11:47.720 here. Boom. Karenism is trademarked, peeps. I'm joking. Actually, it is the first time that I've
01:11:53.600 heard it used with the ism part. So... Because it sounds like, it sounds like communism, but it's
01:11:58.400 Karenism. It's Karenism, exactly. So anyways, I hope that that particular student resurfaces because
01:12:03.960 we had already started to collect the data and it's, it's very, very powerful data because it's taking a lot
01:12:10.360 of, a lot of the elements that I discuss in the parasitic mind and it's testing some really cool
01:12:17.540 empirical hypotheses. So hopefully, if, if that student is watching, call me. I don't know where
01:12:25.280 you've disappeared to, but it's time to come home. But why was I mentioning that? It was about the
01:12:32.380 decision-making process. Yeah, that's right. That's right. So yeah, so I think I finished that story. So all I was
01:12:36.800 saying is that the challenge for me when writing Happiness Book was that I was now entering into
01:12:43.820 a world that I heretofore had not entered, which was prescriptive world, right? All of my other work
01:12:50.660 is, let me explain to you the evolutionary reasons for why we do X, Y, Z. I'm describing behavior
01:12:58.660 under some parsimonious theoretical framework. Happiness was a completely different endeavor.
01:13:03.940 Yes, I'm going to use ancient wisdoms. I'm going to use contemporary science. I'm going to use my
01:13:10.660 personal life trajectory to offer you some prescriptions of how to live a happy life.
01:13:17.260 But at first I was very hesitant to do so because I thought, you know, I don't want anybody to think
01:13:22.900 that it is a guaranteed recipe. Well, it's, it's, um, I don't know. I can
01:13:29.260 analogize it to golf where a professional might never get a hole in one, but the better you
01:13:33.880 swing, the more likely it is you get your ball within a certain vicinity. You don't cheat on your
01:13:38.940 wife. You might still get divorced, but cheat on your wife and you are increasing the odds of getting
01:13:42.760 divorced and murdered and not necessarily in that order. Uh, so no, the, so, I mean, I don't want to
01:13:48.520 get to the punchline of what Suicidal Empathy is going to be. I kind of want to know how it ends.
01:13:52.280 Where do you see, I mean, America's a good litmus test for the world right now. I think it might've
01:13:56.800 gone singularly crazy, but then I sort of look to Europe and Europe has gone crazy as well in terms
01:14:02.380 of the same type of Karenism in terms of speech, in terms of government overreach, there's a bit
01:14:07.840 of a pushback, but whether or not the government's gotten already too far along to be pushed back,
01:14:12.800 where do you see things going? And then I actually want to talk a bit about Canada while we're at it. Sure.
01:14:16.940 Uh, I mean, right now I'm not feeling great. I'm certainly feeling better than I did Tuesday
01:14:23.240 morning, right? Because at least by Trump winning, you are going to have some auto-corrective
01:14:31.020 mechanisms that will swing things in the right direction. But I keep warning people I've already
01:14:35.780 done since Tuesday, several shows, and I keep repeating the following point, which I'm happy
01:14:41.980 to repeat here. Don't now be complacent, sit back and say the problem is solved. Trump is here because
01:14:48.260 Trump will come in and Trump will leave. And someone else might come along who is also a
01:14:53.100 Trump guy, JD Vance. But if you don't eradicate the parasitic ideas and the suicidal empathy that
01:15:00.920 has taken 50 to a hundred years to flourish and proliferate, then you, it's just in French,
01:15:07.100 you say, right? It's, it'll come back even maybe stronger than ever before. Right? Remember when you
01:15:13.880 take an antibiotic, if you don't kill all of the bacteria, the ones that remain, it's a form of
01:15:20.840 evolutionary selection will be even that that's how you develop the evolution of the superbug.
01:15:25.600 So if you don't eradicate this nonsense, it'll come back even more orgiastically nasty in some
01:15:32.640 future iteration. So yes, celebrate that Trump won, but then don't sit back on your couch and eat
01:15:38.320 potato chips all day. There's still a lot of work to be done. That, that analogy actually of like
01:15:42.820 why the doctors say, take the antibiotics, even if you feel better, because if you don't,
01:15:47.620 it comes back stronger and more resistant to antibiotics. That's an amazing idea,
01:15:52.480 you know, the superbug to the parasitic mind. I'm giving you a lot
01:15:57.340 of stuff from Suicidal Empathy. It's genius, Gad, but you know that
01:16:03.220 already. And, and, and I'm, my personal dilemma is yeah, Trump won and everybody can celebrate,
01:16:09.060 but I don't think that I'm done yet mocking into oblivion and publicly shaming those who
01:16:14.840 were the most arrogant, pompous pricks on earth. The Michael Cohens of the world, the Cardi B's,
01:16:20.740 the ones who thought they could browbeat people into submission, because they need to be mocked and
01:16:24.840 humiliated and basically, not disavowed, but, um, shunned into irrelevance. Yeah. I had on a guy
01:16:32.860 named Richard Barris yesterday, who's the best pollster out there. And, you know, I said like with Trump's
01:16:37.820 victory, is it unique to Trump or is it sort of party wide? It was a stupid question as I started
01:16:42.540 asking. And he's like, no, it's, it's Trump. It's the character. It's the person that people love
01:16:46.980 and Trumpism, which could be, you know, when Trump comes and goes in four years, it could be someone
01:16:52.160 else who has to fill in that Trumpism, whether or not it's JD Vance, we'll find out as things evolve,
01:16:57.000 but it's true. Like if, if Trump comes and goes and it's a flash in the pan, um, then it's a,
01:17:03.260 it's a short, it's a big victory, but short lived. So 100%. And so if I can draw an analogy,
01:17:09.340 I was recently approached by someone, I won't, I won't give the details. I don't give them away
01:17:13.900 who said, and this has happened to me many times in the past, but someone says, you know, I'm such a
01:17:20.400 fan and all kinds of compliments. And then there's a, but, so I'm waiting for the, but, but do you not
01:17:27.300 think that when you do the pink wig, it affects your, uh, you know, legitimacy as a professor or
01:17:37.420 when you hide on it? And I look at them, I say, you exactly don't get it. I mean, it, it, it couldn't
01:17:46.200 be any clearer that you need to consume more of my content. It is precisely because of the unique
01:17:54.340 set of skills that I have that I'm able to be this effective. Uh, I just spent quite a bit of time
01:18:01.340 being about as professorial as you can get, right? So no one is going to outrank me on professorial
01:18:07.520 status. That doesn't remove the fact that I can act like a buffoon in the service of trying to
01:18:16.440 persuade you precisely because I have enough authenticity and self-confidence to know
01:18:24.060 that it doesn't diminish me by putting on the, the, the wig. It's precisely because I am 18 feet
01:18:31.900 tall, metaphorically speaking, that I can wear the wig. Now to draw an analogy with Trump, it's the exact
01:18:37.460 same thing. If he were more stately, a la, uh, Romney, and if he were more kind, a la pick your other
01:18:46.700 presidential guy, his voice would have never resonated. You need him to be exactly who he is.
01:18:53.620 He has to show up in the garbage truck for it to work and for him to win on Tuesday.
01:18:59.180 Gad, it's like, we don't even know this, but our thoughts must've been the same
01:19:03.100 at a given point in time, like just recently, cause I have the discussion with my father who
01:19:06.980 said that he follows me on Twitter. He's like, David, do you have to swear so much? I was like that.
01:19:10.900 First of all, there's an element of, not getting attention, but getting people to focus
01:19:16.580 on something, and you could be as polite as you want and have a great message. And, A, no one's going to
01:19:21.780 hear it, and good for you, having a good message that no one hears. And, B, you'll still get demonized.
01:19:26.280 You could be, you could never wear the pink wig. You could never do the satire. You'll still get,
01:19:30.260 you'll still get derision because of what you say. So on the, strategically, there's no reason not to.
01:19:34.900 And on the other hand, it is the level of honesty where, you know, like I was thinking Trump's in power.
01:19:39.700 If I ever got the call, Viva, would you be the press secretary? And for a second, I'm thinking,
01:19:43.420 ah, my tweets are too much of a liability. And then I, and I really, I'm thinking like, no,
01:19:47.900 my tweets are an indication that you'll get respect. But if you pull a Jim Acosta,
01:19:51.840 I will berate you and tell you to stop being an idiot and sit down and give the mic up to the
01:19:55.840 next real journalist. You've justified all of my bad conduct. I hope you feel proud of
01:20:02.080 yourself. Listen, authenticity is the whole ball game, right? I mean, I actually in the,
01:20:08.080 in the happiness book, I have a whole section on authenticity. Now they are,
01:20:12.240 there are two layers of authenticity. There is, there's personal authenticity, you know,
01:20:16.800 is Trump authentic? Is Gad Saad? Is Viva authentic? But there's also existential authenticity, meaning,
01:20:23.080 so in this, I was talking about this in the context of living your life so that hopefully at the end of
01:20:27.760 your life, you have as few regrets as possible. And there I was arguing, if you live a life of
01:20:32.960 existential authenticity, then you are protecting yourself against that. And by the way, and I'm
01:20:38.960 not saying this just because I'm on your show, you perfectly exemplify that which I tell people to do,
01:20:44.860 which is: you did become a lawyer, and I think you were the editor of the law review. So
01:20:51.860 you had all of the credentials and all the things, and you went to a top law firm. And then
01:20:57.620 you looked at one point in the proverbial mirror and you said, that's not what I want to do. And
01:21:02.520 very few people are going to say, you know what I'm going to do? I'm going to do a philosophy degree
01:21:07.980 at McGill. I'm going to do law school. I'm going to become the law review guy. I'm going to go to a
01:21:12.780 top law firm. And then I'm going to drop all that so I can do balloons with ice. What is that thing you
01:21:17.900 did? The Mpemba effect. Amazing. Right. And that's what I'm going to do. I'm sure your parents
01:21:24.080 came very close to disowning the good Jewish boy, but guess what? Who won at the end? By you living
01:21:31.640 an authentic life, an existential authentic life, I'm sure you've been able to flourish in ways that
01:21:37.780 you could have never imagined had you continued. So don't become a pediatrician because your dad
01:21:43.200 and your mom are pediatricians and they expect that of you because you will wake up at 60 and say,
01:21:48.300 I always wanted to be an architect and now my life is gone and it's too late. So authentic. And by the
01:21:54.880 way, the ancient Greeks and the Delphic maxim of know thyself, it's just two words, know thyself
01:22:04.000 has stood the test of time because it is universal and since time immemorial.
01:22:09.500 Amazing. Again, if I may ask, what is your situation with Northwood, Northwood
01:22:14.840 University? Yes. So, thank you so much for asking. So Northwood reached out to me, uh,
01:22:20.060 by the way, not their entire, but quite a few
01:22:26.840 of their senior staff are Canadians, including the president. So Northwood University is a
01:22:32.700 university in Midland, Michigan. Midland, I'd never heard of it. It is gorgeous.
01:22:38.980 Now there is a reason why it's so gorgeous. It's because this is where the headquarters,
01:22:44.300 the global headquarters of Dow Chemical is in Midland. And so they've poured in billions of
01:22:50.800 dollars to this town precisely because if you're going to bring a top chemist from Zurich to, you
01:22:57.400 know, rural Michigan, you, you need to have culture. You need to have a, uh, you know, a sushi place
01:23:04.480 and you need to have cafes. So when I first went there, my inaugural visit, uh, earlier this semester,
01:23:10.300 I, and my wife came with me, I was concerned, like, is it, is it going to be like, uh, you know,
01:23:14.760 crystal meth labs everywhere. And it is gorgeous, beautiful downtown, uh, cafes and restaurants and,
01:23:23.600 you know, cool. It's just, it's beautiful. The campus is beautiful. Northwood University,
01:23:29.040 interestingly, follows the European model of les grandes écoles. Les grandes écoles,
01:23:35.740 these are the schools that are focused on only one thing. So for example, Sciences Po for the future
01:23:41.840 political leaders, uh, all the business schools are usually separate schools. They're just business
01:23:48.320 schools. So Northwood University is really a big, gigantic business school that's completely rooted
01:23:55.700 in all of the freedom ethos that you could think of. So the president reached out to me in
01:24:04.220 the summer, when I had put up an out-of-office for five weeks: I'm not answering anybody. And he knew
01:24:11.420 of some of my difficulties at Concordia. They were all big fans of my work. And luckily I violated
01:24:17.940 my edict to not check my emails. And so I had this long email from this guy that I'd never heard of,
01:24:23.980 who had been the president of St. Francis Xavier, uh, in Canada. And he reached out, he said,
01:24:31.220 Hey, can we talk? We'd love to have you join our, our family. And since then it has been one of the
01:24:37.240 most, it's not like I'm being paid to say this, right? I mean, I wouldn't have said it. I'm very
01:24:42.280 authentic. He's a gem. I mean, if all academic presidents were like Kent McDonald, there would
01:24:49.520 be no problem in academia. Okay. Uh, his whole team is unbelievable. They've given me complete
01:24:55.680 freedom to do things as I please. I'm teaching two courses in the spring. Uh, but otherwise,
01:25:03.140 my title is visiting professor and global ambassador. The global ambassador part
01:25:07.580 is to promote the school, right? Is to use my platform. Look, there are 4,000 universities in
01:25:14.560 the United States. So a classic marketing problem is how do you differentiate yourself? How do you
01:25:20.540 break out of the clutter? And so hopefully I could contribute to, you know, breaking them out of the
01:25:27.040 clutter. And I think so far it's been very, very fruitful. A lot of people have gotten to know
01:25:32.020 Northwood, uh, via my intervention. So I've got nothing but unbelievable things to say about that
01:25:38.460 place. It's amazing. And the best marketing on earth is to have someone whom people trust and love,
01:25:43.720 who is what they call, not woke, sorry, anti-woke and based, or whatever the words are.
01:25:49.860 You know, a university that would bring on a Gad Saad and not suppress a Gad Saad
01:25:56.040 is a university where I, as a parent, would want to send my kids without question. And I've got
01:26:00.920 nothing to do with the university. So it's, it's fantastic.
01:26:05.080 Gad Saad, do you have another 15 minutes?
01:26:06.760 Let's do it.
01:26:07.540 Okay.
01:26:08.140 It's impossible to be satiated of you, man.
01:26:10.380 No, no, let's do it. And I've got questions I don't want to neglect. So everyone on YouTube,
01:26:13.360 come over to Rumble, and I'm going to get to some questions and Locals stuff in a second.
01:26:17.280 I'm just going to end it on YouTube. It changes nothing from our end, but it's the Q&A after-party.