The Saad Truth with Dr. Saad - September 11, 2024


My Thoughts on the Trump-Harris Debate (The Saad Truth with Dr. Saad_711)


Episode Stats

Length

39 minutes

Words per Minute

145.5

Word Count

5,719

Sentence Count

459

Misogynist Sentences

14

Hate Speech Sentences

10


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Sen. Kamala Harris (D-CA) is facing a tough re-election challenge from Sen. Elizabeth Warren (D, D-Massachusetts) in Tuesday night's Democratic primary debate. Is it time to turn your brain off and vote on emotions?

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.100 All right, so today I wanted to, I think in two hours, we have the much anticipated debate.
00:00:06.380 I always, I'm not sure if I should ever watch these things because I get pissed off and
00:00:11.280 it's usually boring and so on.
00:00:14.000 But this seems like a potentially explosive situation.
00:00:19.160 Let's see what happens.
00:00:20.420 So what I wanted to do is first read for you an article I wrote last week, my inaugural
00:00:27.560 article for Newsweek.
00:00:33.520 It's an opinion piece.
00:00:35.320 So I wanted to read it for you.
00:00:36.820 I mean, of course, you can just go and, you know, read it on your own.
00:00:41.000 It's not behind a paywall, but it's very relevant to tonight's debate.
00:00:48.200 I'm looking, the mic is still on.
00:00:50.180 Can I get a thumbs up that everything's going well, that you could fully hear me?
00:00:53.740 Give me a thumbs up, somebody.
00:00:55.700 I'm waiting for you guys.
00:00:57.980 Thumbs up, someone.
00:00:59.680 I guess there's a big time delay from when I ask it.
00:01:02.700 I haven't gotten a thumbs up.
00:01:05.020 Anybody?
00:01:06.260 Give me something.
00:01:09.540 No?
00:01:11.440 Okay, I got a thumbs up.
00:01:12.700 All right, thank you so much.
00:01:14.060 Okay, so this is an article.
00:01:16.420 Thank you very much, guys, for those of you who gave a thumbs up.
00:01:19.140 The title of the article, this was last week.
00:01:22.780 It was published on September 5th.
00:01:25.180 And many thanks to Batya Ungar-Sargon, who is the deputy editor of the opinion section.
00:01:35.220 And she was on my show last week.
00:01:36.520 You really should listen to our chat.
00:01:38.360 Fantastic chat.
00:01:39.080 I've also had Laila Mickelwait, who's been fighting Pornhub.
00:01:46.380 She came on the show.
00:01:47.780 And then I've also had, just posted it today, Marissa Streit, who is the CEO of PragerU.
00:01:55.980 So, three women, three honey badgers.
00:01:59.120 They take no prisoners.
00:02:00.780 They're tough as nails.
00:02:02.340 I wish one of them were.
00:02:03.880 If we need a first female president, I wish someone with their grit and intelligence could be president.
00:02:12.860 So you can go check out my chats.
00:02:14.160 And by the way, just a little bit of administrative stuff.
00:02:19.860 Please go check out my chat with Stephen Bartlett.
00:02:23.520 He's a gentleman. I was in California this summer, and I did several shows, including his show.
00:02:32.140 The show's called Diary of a CEO.
00:02:34.660 They're from Britain.
00:02:37.760 We had an incredible chat, very long, over, I think, four hours and 15 minutes.
00:02:43.520 The final edited piece came out yesterday, and it has gone so viral.
00:02:51.600 And, you know, I've been on many, many high-profile shows with, you know, many incredible hosts.
00:02:58.680 I don't think I've ever had a more powerful, informative, intimate conversation.
00:03:05.300 Of course, in part because the host has the right set of tools, the right skills, conversational skills.
00:03:12.020 He's well-prepared so that it just flowed beautifully.
00:03:14.900 So go check it out on his channel, Diary of a CEO, hosted by Stephen Bartlett.
00:03:21.660 Trust me, you won't regret it.
00:03:23.500 We got into all sorts of incredible things.
00:03:26.160 Okay, so here is the, I'll first read you the article, and then I'll offer some additional analysis.
00:03:32.440 So Kamala Harris is hoping you turn your brain off and vote on emotion.
00:03:37.200 By the way, for those of you who may not know, I have taken a one-year leave from my home university, from Concordia University.
00:03:48.180 And I'm delighted to report that I am a visiting professor and global ambassador at Northwood University.
00:03:55.420 If you don't know much about Northwood, please look into it.
00:03:59.840 Their slogan is the Free Enterprise University.
00:04:05.340 You know, all of the values that I defend and support are exactly what they embody.
00:04:10.520 So I'm very, very much looking forward to doing all sorts of, you know, really exciting things at Northwood University.
00:04:18.280 Okay, so let me read for you the article, and then I'll offer some analyses.
00:04:21.180 The actor Ben Stiller was recently asked why he supports Kamala Harris.
00:04:27.640 There are a number of answers he might have provided, maybe her progressive stance on abortion, or on housing, or the environment.
00:04:35.260 But Mr. Stiller mentioned none of those.
00:04:38.280 His answer, quote, all the energy and excitement that is around this movement right now.
00:04:45.220 Stiller then added that it was time for a change, perhaps forgetting that Harris is the sitting vice president.
00:04:51.180 But don't judge Stiller too harshly.
00:04:53.860 This is the standard response of Harris supporters.
00:04:57.500 Harris has so singularly relied on generating positive emotions that when she tentatively offers a policy position,
00:05:04.980 it is with the half-hearted certainty that it will never become law, or it is just flagrantly copied from Trump.
00:05:13.040 Many of Trump's voters are going to vote based on emotions too, but the Harris campaign has cynically embraced a very specific view of human behavior,
00:05:22.820 that our natural inclination to make decisions is largely based on emotion rather than substance and data.
00:05:29.800 And she's hoping it will carry her all the way to the White House.
00:05:33.900 Is she right?
00:05:34.500 In some sense, she is.
00:05:37.100 There's actually a deep tension between our emotional or affective systems and our intellectual or cognitive systems when we make decisions.
00:05:46.100 And it turns out, we humans are cognitive misers when all is said and done.
00:05:52.460 We're intellectually lazy.
00:05:53.980 Thinking is too effortful for most, so we rely on effortless, automatic mental shortcuts, in other words, emotions to arrive at a decision.
00:06:03.740 That there are two pathways to persuasion is a fact well known to advertisers.
00:06:10.640 They think of them as the central and peripheral route of persuasion.
00:06:14.380 The central route utilizes cognitive justifications to sell a utilitarian product.
00:06:20.520 Think five things to consider when deciding if a reverse mortgage is right for you.
00:06:26.120 On the other hand, the peripheral route uses cosmetic cues rooted in affective processing
00:06:32.420 to sell you a hedonic product.
00:06:34.820 Think of an ad showing a sexy young woman running on the beach to sell you cologne.
00:06:40.320 When promoting a reverse mortgage, it is crucial to engage your cognitive system.
00:06:45.120 When selling you a cologne, the advertiser wants to trigger your emotions.
00:06:50.840 It's this peripheral emotive route of persuasion that the Harris campaign has embraced
00:06:55.880 with the positive vibes campaign rooted in joy, excitement, and fun.
00:07:00.400 Her managers are willfully hijacking your decision-making process by ensuring that you focus only on your affective peripheral system.
00:07:10.300 It's deeply cynical for the obvious reason that when selecting the leader of the free world,
00:07:16.820 you should be engaging your cognitive system.
00:07:19.440 A rational voter should evaluate the respective positions of the two candidates
00:07:23.680 on fiscal policy, immigration policy, border security, foreign policy, criminal justice policy,
00:07:30.880 commitment to the First and Second Amendments,
00:07:33.540 the tension between the rights of biological women versus trans women, i.e. biological males,
00:07:39.440 and their stance on meritocracy versus diversity, inclusion, and equity.
00:07:43.520 And yet, the great majority of voters are utterly oblivious about these issues
00:07:49.560 and prefer to love or hate a given candidate based on irrelevant, affective processing.
00:07:56.580 Emotions are evolutionarily important, but only when properly deployed on the right target
00:08:01.980 at the appropriate situation.
00:08:04.180 It's crucial not to let your emotions hijack your thinking, whether you are a voter or a policymaker.
00:08:10.760 As American voters, you have the sacred task of choosing the person who will impact
00:08:16.440 not just your country, but the entire globe for the next four years.
00:08:22.000 I am Canadian.
00:08:23.680 I will not be voting in the upcoming 2024 presidential elections.
00:08:27.820 But I do care about the foundational values that define the West.
00:08:31.460 I have seen what happens to a country when its voters are mesmerized by inconsequential emotional appeals.
00:08:38.080 We ended up with Justin Trudeau serving for three consecutive terms.
00:08:43.800 Whether you're a Democrat or a Republican,
00:08:46.920 you are in danger of making a crucial decision based on emotion.
00:08:51.420 But only one of the campaigns is betting on that,
00:08:54.720 betting on you succumbing to the weakest, laziest version of yourself.
00:08:59.400 There is a big chance it's because that candidate has the weaker policies.
00:09:03.880 Do yourself a favor, search for and process the relevant information.
00:09:09.340 Vote with your intellect.
00:09:11.040 Vote on the issues.
00:09:12.440 The rest of us are relying on you to make the right choice or at least an informed one.
00:09:17.640 So this was my inaugural article in Newsweek that came out last week, September 5th.
00:09:23.220 You can just do a search, Gad Saad Newsweek, and you'll find it.
00:09:31.040 Next, what I wanted to do is just, admittedly, this is not a scientific poll.
00:09:37.500 Obviously, it's not a representative sample, so on and so forth.
00:09:41.620 But, you know, these little snippets, these X polls give us a sense of, you know, how people think.
00:09:49.440 And then I'll discuss.
00:09:50.740 I don't know if you saw.
00:09:51.800 I hope you did.
00:09:52.500 And if you didn't, you can still go access it now.
00:09:54.820 Next to the announcement for today's X Spaces, I uploaded two figures, two screenshots that I'd like you to look at; think of this as an X lecture you're getting here.
00:10:12.320 So I wanted to first read for you, I think, six or seven polls that I took earlier today.
00:10:18.300 This was about two hours ago.
00:10:20.140 And I've had several thousand people vote.
00:10:22.780 So here we go.
00:10:23.480 So who is superior on immigration?
00:10:26.720 And, of course, this is a binary poll.
00:10:29.100 It's Harris versus Trump.
00:10:31.160 So who is superior on immigration?
00:10:33.780 3,291 people voted.
00:10:36.860 93.8% chose Donald Trump.
00:10:39.980 Now, in case you're thinking, yeah, but sure, I have now almost a million followers on X,
00:10:48.040 and they all share, you know, your view.
00:10:51.680 They're all Trump supporters.
00:10:53.020 Well, I don't know how many are Trump supporters, but I can assure you there are many people who follow me who are not Trump supporters, as evidenced by the vitriol that I receive whenever I say anything complimentary of Trump or anything critical of Harris.
00:11:08.780 None other than Mark Cuban, who does follow me.
00:11:13.120 We've communicated on many occasions privately and in a few cases publicly.
00:11:18.680 You know, he's a huge Kamala supporter.
00:11:21.420 So I don't think that the numbers that I'm going to quote for you now are simply because of the non-representative nature of the sample.
00:11:31.440 So, again, to summarize, who is superior on immigration?
00:11:34.840 93.8% for Donald Trump.
00:11:37.800 Who is superior on the handling of crime?
00:11:40.960 94.1% in favor of Trump.
00:11:44.820 Who is superior on economic issues?
00:11:48.740 92.3% in favor of Trump.
00:11:51.840 And so far, just to give you the sample sizes, 3,291 people, 3,366, and 4,027.
00:12:00.160 Who is a stauncher supporter of the First Amendment, right?
00:12:04.460 Freedom of speech.
00:12:06.160 96.4% Donald Trump.
00:12:09.260 This is 2,682 people.
00:12:12.500 Now, it's not as though a Harris supporter, unless they're utterly lobotomized or dishonest, is going to say, no, no, no.
00:12:23.140 You know, actually, Kamala Harris is much better on the First Amendment.
00:12:25.880 These are, you know, objective facts; I can offer a bewildering amount of evidence in support of these poll results.
00:12:37.680 Okay, so who is a stauncher supporter of the First Amendment?
00:12:40.320 96.4% Donald Trump.
00:12:43.580 Who is a stauncher supporter of the Second Amendment?
00:12:46.820 96.6% Donald Trump.
00:12:50.980 And it's 2,851 people who voted.
00:12:54.060 Who is a stauncher supporter of the meritocratic ethos, which is the exact opposite of the diversity, inclusion, equity, affirmative action, equality of outcomes, and so on, that she espouses.
00:13:07.240 3,003 people voted.
00:13:08.920 This one was, quote, closer, only, quote, 87% voted for Donald Trump.
00:13:17.280 And then, who better embodies the foundational values of the United States?
00:13:22.920 4,675 people voted.
00:13:26.520 95.3% said Donald Trump.
00:13:31.440 So, the numbers, almost every single one that I just quoted, it's 95% and higher for Trump, and only one of them was 87% for Trump and 13% for Kamala, which, of course, makes you think, well, how could it be that the country is so divided 50-50?
00:13:52.260 It's precisely divided 50-50, because people are not using their cognitive system to make decisions.
00:14:01.240 They're using their emotive system.
00:14:03.360 I have tried to engage my own academic colleagues who are supposedly smart, right?
00:14:08.820 They have titles like professor and doctor before their name.
00:14:12.940 They're some of the biggest imbeciles imaginable.
00:14:15.320 Because when I say, can you give an actual cognitive justification?
00:14:20.260 I hate Trump.
00:14:21.240 He's disgusting.
00:14:23.220 He's not befitting to be a president.
00:14:26.560 He's a crook.
00:14:27.580 He's a scammer.
00:14:29.420 He's evil.
00:14:30.660 He's immoral.
00:14:31.940 He's cheated on his pregnant wife, right?
00:14:35.020 Now, these might be valid concerns.
00:14:37.480 You may think he's immoral.
00:14:38.740 You may think it's not good to cheat on your wife.
00:14:41.060 Those are all true.
00:14:41.880 But they certainly are not talking about the First Amendment or the Second Amendment or departures from reality.
00:14:48.080 When you think women with nine-inch penises are women.
00:14:52.800 When you want to promote all ways of knowing, including indigenous science, instead of something called the scientific method.
00:15:00.600 Immigration, economic policies, on and on and on.
00:15:06.120 There's absolutely no way that one, it could be 50-50.
00:15:10.340 Okay?
00:15:10.940 So now what I'm going to do next is I'm going to ask you to please turn to the two slides that I shared with you.
00:15:19.300 Because as some of you know, my doctoral dissertation was in, my PhD was in psychology of decision-making, right?
00:15:29.800 At Cornell, at the Johnson School of Management.
00:15:38.840 My doctoral supervisor is a cognitive and mathematical psychologist.
00:15:42.840 And so what I was studying in my PhD is: when is it that you have collected enough information to choose between two candidates?
00:15:56.980 And I actually spent quite a bit of time on this on June 21st.
00:16:00.840 The reason why I know it was June 21st is that it was the 30th anniversary of defending my doctoral dissertation, which was on June 21st.
00:16:10.920 It was June 21st, 1994.
00:16:13.200 So I did an X Spaces where I went over in quite a bit of detail what my doctoral dissertation was about, how it was a mix of cognitive psychology and experimental psychology, studying the psychology of decision-making.
00:16:27.980 And so the two slides that I'm going to show you now are very much, you know, within my doctoral area of psychology of decision-making.
00:16:37.820 So if you look at the first screenshot, and again, you can just go now and look at it.
00:16:43.900 I posted it even as a reply to my pinned tweet.
00:16:49.520 So if you go to my pinned tweet where I announced that I'm holding an X Spaces and just go to the reply thread, you'll see my two screenshots there.
00:16:59.340 So it's helpful for you to be able to understand.
00:17:01.440 So if you look at the screenshot that is titled Descriptive Decision-Making, How Individuals Actually Make Choices.
00:17:10.240 Can I just get a thumbs up that somebody's seeing it?
00:17:13.000 I'll wait a second.
00:17:13.900 I just want to make sure that you guys know what I'm talking about.
00:17:17.100 Somebody just give me a thumbs up.
00:17:19.260 I'm waiting.
00:17:20.180 There's a bit of a lag.
00:17:21.900 I'll wait a second or two.
00:17:23.800 Can I get a thumbs up?
00:17:25.120 Anybody?
00:17:28.720 Somebody?
00:17:29.600 Thumbs up?
00:17:30.260 Nobody?
00:17:33.660 Can people hear me?
00:17:35.080 The mic is on.
00:17:35.680 Okay, I got a thumbs up from Kira.
00:17:37.740 Thank you very much.
00:17:38.540 I appreciate that.
00:17:39.800 Okay.
00:17:40.320 Oh, I got a thumbs up from Gal.
00:17:42.200 That's great.
00:17:42.840 Thank you so much.
00:17:43.700 All right, guys.
00:17:44.200 Thank you.
00:17:44.600 Got it.
00:17:45.940 Okay, so if you look at that, this is called an informational display board.
00:17:50.280 Okay, and it's an informational display board because it's exactly that.
00:17:54.120 It's a matrix that displays a multi-attribute choice.
00:17:59.040 So let's say in this case you've got four cars, cars A, B, C, D.
00:18:03.640 Each of the four cars is defined by a bunch of attributes.
00:18:08.480 So price is an attribute, the quality of the engine is an attribute, the safety record of the car is an attribute, and the miles per gallon.
00:18:23.680 The gas efficiency is another attribute.
00:18:27.720 The weights that you see next to the attributes are the particular person's importance weight.
00:18:33.200 So that person thinks that price is the most important attribute for them at 0.5, 0.5 out of one, meaning that price is as important as the three other attributes combined, as quality, safety, and miles per gallon together.
00:18:48.460 Okay, so that's why price is 0.5, and then quality is 0.25, that's 0.75, safety is 0.2, that's 0.95, and then miles per gallon is 0.05, and so it adds up to one.
00:19:01.080 Each of the four cars is going to receive a score on a common currency.
00:19:06.220 So on a scale of one to seven, seven is the best, one is the worst, how well does this car score on that attribute?
00:19:13.200 So for example, car A scores really well on price, pretty well on quality, not well on safety, and pretty well on gas efficiency.
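To make that structure concrete, here is a minimal Python sketch of such an informational display board: the four alternatives, the attributes that define them, the 1-to-7 scores, and the importance weights. The weights (0.5, 0.25, 0.2, 0.05) follow the example just described; beyond car A doing well on price and poorly on safety, the individual scores are illustrative placeholders, not values from the actual slide.

```python
# Minimal sketch of an informational display board: alternatives (cars) by
# attributes, each scored on a common 1-7 currency, plus importance weights.
# Weights follow the example (0.5 + 0.25 + 0.2 + 0.05 = 1.0); most scores are
# illustrative placeholders, since the full slide is not reproduced here.

weights = {"price": 0.50, "quality": 0.25, "safety": 0.20, "mpg": 0.05}

cars = {
    "A": {"price": 7, "quality": 5, "safety": 2, "mpg": 5},  # great price, weak safety
    "B": {"price": 1, "quality": 6, "safety": 5, "mpg": 4},
    "C": {"price": 6, "quality": 5, "safety": 4, "mpg": 3},
    "D": {"price": 4, "quality": 4, "safety": 6, "mpg": 6},
}

# Print the board the way it would appear on the slide: one row per car.
print("car  " + "  ".join(f"{attr} (w={w})" for attr, w in weights.items()))
for name, scores in cars.items():
    print(name, "   " + "  ".join(str(scores[attr]) for attr in weights))
```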
00:19:22.100 Now, here's the interesting part.
00:19:24.540 By the way, I'm thinking of putting together an entire psychology of decision-making course, which is something that I've taught at the Masters of Science level and the doctoral level for many years.
00:19:36.960 I'm thinking of putting that behind a paywall for people because I think it's really incredibly exciting stuff that I think people should know about.
00:19:47.220 What are the actual psychological processes by which people make decisions?
00:19:51.020 And as I said, that was the central theme of my doctoral dissertation.
00:19:57.160 So if you look here, there are many different decision rules that one can use in arriving at a choice.
00:20:05.540 So contrary to what classical economists would tell you, which is you just pick the car that normatively maximizes your utility, that's actually not true.
00:20:16.760 So if we were doing now an entire course, I would show you probably 15 different decision rules.
00:20:23.680 Depending on which rule I use, I'll come up with a different choice.
00:20:27.500 Sometimes I'll come up with car A as the winner, sometimes car B, sometimes car C, sometimes car D.
00:20:32.720 Okay.
00:20:33.740 Meaning that depending on which decision rule I use, I will converge to a different final decision.
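For contrast with the descriptive rules that follow, the "normative" rule the classical economists would prescribe, a weighted additive score in which each attribute score is multiplied by its importance weight and the totals are compared, fits in a few lines. This is a sketch using the same illustrative numbers as above, not a claim about how any real buyer or voter actually proceeds.

```python
# Weighted additive ("maximize your utility") rule: sum of weight * score over
# every attribute, then pick the alternative with the highest total. The
# scores are the same illustrative placeholders used above.

weights = {"price": 0.50, "quality": 0.25, "safety": 0.20, "mpg": 0.05}
cars = {
    "A": {"price": 7, "quality": 5, "safety": 2, "mpg": 5},
    "B": {"price": 1, "quality": 6, "safety": 5, "mpg": 4},
    "C": {"price": 6, "quality": 5, "safety": 4, "mpg": 3},
    "D": {"price": 4, "quality": 4, "safety": 6, "mpg": 6},
}

def weighted_additive(cars, weights):
    """Return (winner, totals) under the classical utility-maximizing rule."""
    totals = {
        name: round(sum(weights[attr] * score for attr, score in scores.items()), 2)
        for name, scores in cars.items()
    }
    return max(totals, key=totals.get), totals

winner, totals = weighted_additive(cars, weights)
print(totals)   # {'A': 5.4, 'B': 3.2, 'C': 5.2, 'D': 4.5} with these numbers
print(winner)   # 'A' -- but the descriptive rules below need not agree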
00:20:41.180 Okay.
00:20:42.020 So now let's apply it.
00:20:43.600 So you see, I'm only showing you here two rules.
00:20:46.320 Lexicographic, which is a rule that was first described by Amos Tversky.
00:20:53.320 For those of you who might know, Amos Tversky did unbelievable work with Daniel Kahneman.
00:21:00.980 Daniel Kahneman ended up winning the Nobel Prize in 2002 for his work with Amos Tversky.
00:21:08.100 But unfortunately, Amos Tversky didn't win the Nobel Prize because he had passed away in 1996.
00:21:14.280 And by the way, both Amos Tversky and Daniel Kahneman were psychologists that knew my doctoral supervisor well.
00:21:20.940 He comes from that tradition.
00:21:22.540 As a matter of fact, my doctoral supervisor's first published paper was with Amos Tversky.
00:21:29.120 I think it was in the late 60s, if I'm not mistaken.
00:21:33.020 I think they both got their PhDs.
00:21:36.840 I mean, I know for sure that my doctoral supervisor got his PhD at the University of Michigan.
00:21:41.220 And I think Tversky was also there, but maybe a bit more senior.
00:21:46.120 I can't remember the exact details.
00:21:48.140 But yeah, it was around the late 60s.
00:21:49.560 Anyways, so the lexicographic rule basically says, choose the alternative that has the highest score on the most important attribute.
00:21:59.900 Now, by the way, this is a decision rule that is often used in consumer decision making.
00:22:06.500 As a matter of fact, depending on the product category, as much as, you know, more than two-thirds of consumers will simply use the lexicographic rule.
00:22:15.500 So let's put the lexicographic rule to use when choosing between toothpastes.
00:22:21.200 Well, if I were to say, all I care about is which is the lowest-priced toothpaste, meaning that my most important attribute is price, then I will simply pick accordingly.
00:22:35.920 Let's say there are 10 toothpastes.
00:22:38.020 Let's say the third toothpaste is the cheapest one.
00:22:40.440 That's the one I pick.
00:22:41.260 So notice that under this rule, I did not look at all of the attributes defining toothpaste.
00:22:49.620 There might be seven attributes defining toothpaste.
00:22:52.340 How much tartar it removes, how well it makes your mouth taste,
00:23:02.340 or how much it combats bad, you know, I'm losing my word, what's the word, like bad mouth smell.
00:23:12.040 What is the word?
00:23:12.720 I can't remember.
00:23:15.420 So there are many attributes that I can look at, but if I'm using the lexicographic rule, I only look at the most important.
00:23:21.300 So in this example, the one that I'm showing you on the screen, if you look at the most important attribute, it's price.
00:23:30.400 How do we know it's price?
00:23:31.800 Because that's the one that has the highest weight, 0.5.
00:23:34.520 And therefore, because car A scores the highest on price, I will pick it.
00:23:42.180 So notice, I didn't look at the information on quality.
00:23:44.940 I didn't look at the information on safety.
00:23:46.580 I didn't look at the information on miles per gallon and gas efficiency.
00:23:50.220 I only looked at the most important attribute.
00:23:52.220 By the way, you might remember when I appeared on Sam Harris's show many years ago, around the time when Trump was running against Hillary Clinton, I explained to Sam, I said, look, there are very clear, rational reasons why people might choose Trump over Hillary Clinton.
00:24:12.200 For example, if they're using the lexicographic rule, and if immigration is the most important attribute for them, then rightly or wrongly, although, of course, I think it was rightly so, if they care only about immigration, meaning they are a one-issue voter, and they think that Trump is better than Hillary Clinton, boom, that's it.
00:24:33.920 Hillary Clinton might have been better on every single other attribute.
00:24:37.640 If they're using the lexicographic rule, it doesn't matter.
00:24:42.120 They only will look at that attribute and choose accordingly.
00:24:46.780 You follow?
00:24:47.660 That's why you study psychology of decision-making.
00:24:51.040 And I mean, that's why I'm housed in a business school, in a marketing department, because, of course, one of the places where you most often make decisions is in consumer and economic decision-making.
00:25:01.560 But, of course, you also make decisions when you're engaging in mate choice, voter choice, and so on.
00:25:06.740 All right, so if I use the lexicographic rule on this informational display board, I would choose car A, and therefore, that's why I put it in red.
00:25:16.660 You see, lexicographic rule is in red, car A is in red.
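Here is a short sketch of the lexicographic rule as just described: identify the highest-weighted attribute and pick whichever alternative scores best on it, never looking at any other column. Using the same illustrative board as above, car A wins on price, exactly as in the slide; the one-issue voter from the Sam Harris example is the same rule with immigration as the top-weighted attribute.

```python
# Lexicographic rule: look ONLY at the most important attribute (the one with
# the largest weight) and choose the alternative that scores highest on it.
# All other attribute information is never examined. Scores are illustrative
# except where quoted in the episode (car A = 7 on price, car B = 1, car C = 6).

weights = {"price": 0.50, "quality": 0.25, "safety": 0.20, "mpg": 0.05}
cars = {
    "A": {"price": 7, "quality": 5, "safety": 2, "mpg": 5},
    "B": {"price": 1, "quality": 6, "safety": 5, "mpg": 4},
    "C": {"price": 6, "quality": 5, "safety": 4, "mpg": 3},
    "D": {"price": 4, "quality": 4, "safety": 6, "mpg": 6},
}

def lexicographic(cars, weights):
    """Choose on the single highest-weighted attribute; ignore the rest."""
    top_attr = max(weights, key=weights.get)                  # 'price' here
    return max(cars, key=lambda name: cars[name][top_attr])

print(lexicographic(cars, weights))  # 'A' -- only the price column was used
```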
00:25:19.500 Now, watch if I use the satisficing rule, okay?
00:25:23.640 I can't believe, by the way, people, you get all this information for free.
00:25:28.120 If you were taking it in university, it would have probably cost you about $8,000 now in tuition.
00:25:34.540 So do the right thing and go and subscribe to my exclusive content, okay?
00:25:39.340 Don't be a social parasite.
00:25:41.640 Reciprocal altruism, reciprocity is an important Darwinian mechanism.
00:25:46.140 Tit for tat.
00:25:47.200 I scratch your back, you scratch mine.
00:25:49.000 I give you infinite wisdom, you subscribe to my content.
00:25:53.400 It's well worth it, by the way.
00:25:55.480 In any case, satisficing rule, no, it's not a typo.
00:25:59.360 I didn't mean to write satisfying rule.
00:26:02.420 Satisficing rule means good enough, okay?
00:26:06.440 So watch.
00:26:07.480 You see the cutoffs that I've got there at the bottom of that informational display board?
00:26:12.720 It's 5, 4, 3, 2.
00:26:15.360 What does that mean?
00:26:15.980 I'm saying that the first alternative that I evaluate that passes all the cutoffs, I will
00:26:26.800 choose it.
00:26:27.860 So here, the order in which I evaluate the alternatives is important
00:26:38.040 because, look, remember what I said: satisfice.
00:26:41.720 If it's good enough, I pick it.
00:26:43.880 So I'm going to, now let's suppose I'm going to evaluate them in the alphabetical order,
00:26:49.980 A, B, C, and then D.
00:26:52.000 So I'm going to start with car A.
00:26:54.200 Is 7 greater than 5?
00:26:57.040 Now, in case you're wondering, 7 is the score that car A has on price.
00:27:03.580 And remember, my cutoff is 5, meaning I will not choose a car unless it has at least a 5 on
00:27:11.660 price.
00:27:12.020 So 7 is greater than 5, 5 is greater than 4.
00:27:16.480 Now, look at the third attribute.
00:27:18.360 Safety, car A scores a 2, but my cutoff is a 3.
00:27:23.800 That means it fails on the third attribute.
00:27:28.760 So car A is eliminated.
00:27:30.220 Now, I go to car B.
00:27:32.180 I do price, it scores a 1, but the cutoff is a 5.
00:27:37.780 Therefore, car B is eliminated after only looking at price.
00:27:43.120 Car C, 6 is greater than 5, 5 is greater than 4, 4 is greater than 3, and 3 is greater than 2.
00:27:52.920 Meaning that car C has passed every single one of my minimal cutoffs.
00:27:59.760 And therefore, I stop and choose car C.
00:28:04.500 The reason why it's called satisficing, because it could have been that car D is even better.
00:28:10.860 Let's suppose that car D scored 7, 7, 7, 7.
00:28:14.660 Like it scores the best on everything.
00:28:16.300 I would have picked it maybe, but I'll never get to car D.
00:28:20.440 Because what I'm saying is, first car that passes all my cutoffs, I choose it.
00:28:26.360 So let's use that in dating.
00:28:28.480 Let's suppose I'm a woman.
00:28:30.740 Here's what the satisficing rule would look like.
00:28:34.080 I want a guy to be at least of this level of education, and at least this level of good looks,
00:28:42.240 and at least of this height, and at least of this level of funny.
00:28:48.060 And if he meets those four minimal requirements, then I'll go out with him on a date.
00:28:53.200 Well, I just described for you in words the satisficing rule.
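And here is a minimal sketch of that satisficing rule with the cutoffs from the slide (5, 4, 3, 2): evaluate the alternatives in a fixed order and accept the first one that clears every minimal cutoff, even if a later, never-examined option would have been better. Cars A through C follow the walkthrough above; car D is the hypothetical 7-7-7-7 car that never gets looked at. The dating example maps onto the same function, with education, looks, height, and humour as the attributes and a minimal bar for each.

```python
# Satisficing ("good enough") rule: walk the alternatives in order and accept
# the FIRST one whose score meets every cutoff. Cutoffs (5, 4, 3, 2) and the
# scores for cars A-C follow the walkthrough; car D is the hypothetical
# 7-7-7-7 car that is never even evaluated. Unstated scores are placeholders.

cutoffs = {"price": 5, "quality": 4, "safety": 3, "mpg": 2}
cars = {
    "A": {"price": 7, "quality": 5, "safety": 2, "mpg": 5},  # fails safety (2 < 3)
    "B": {"price": 1, "quality": 6, "safety": 5, "mpg": 4},  # fails price right away
    "C": {"price": 6, "quality": 5, "safety": 4, "mpg": 3},  # clears every cutoff
    "D": {"price": 7, "quality": 7, "safety": 7, "mpg": 7},  # never examined
}

def satisfice(cars, cutoffs, order):
    """Return the first alternative (in the given order) meeting all cutoffs."""
    for name in order:
        if all(cars[name][attr] >= cutoff for attr, cutoff in cutoffs.items()):
            return name
    return None  # nothing was good enough

print(satisfice(cars, cutoffs, order=["A", "B", "C", "D"]))  # 'C'
```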
00:28:57.280 So you could take these decision rules that I just mentioned.
00:29:01.140 I only mentioned two of them, and you could put Kamala Harris and Donald Trump with all of your
00:29:09.260 important attributes and go through that exercise to then say, okay, so based on this whole cognitive
00:29:17.180 process, which of the alternatives should have been the one that I pick?
00:29:23.680 But of course, that's not what people do.
00:29:26.120 People are lobotomized fools.
00:29:28.760 But she's so joyful.
00:29:30.080 But she's got such good vibes.
00:29:32.960 But she's a woman.
00:29:34.440 She has a vagina.
00:29:35.800 What could be more exciting than that?
00:29:37.940 Well, but she's, she is a woman of color.
00:29:42.860 Guess what?
00:29:44.100 When I go to California, and I'm in my natural habitat, I'm about three, three shades darker
00:29:49.860 than her.
00:29:50.760 Okay, and I self identify as a woman.
00:29:53.320 And so I'm a, I should be President of the United States.
00:29:56.540 And I self identify as being American born.
00:30:00.040 So therefore, I should be President of the United States.
00:30:02.780 Believe me, it'll be a much better outcome than if this complete idiot were to become
00:30:07.140 President.
00:30:07.500 All right, so that's one way to study decision making.
00:30:13.440 Now, of course, look, oh boy, it's so hard to think.
00:30:17.100 Better just use our emotions.
00:30:19.280 It's exciting.
00:30:20.760 It's, it's, it's sexy.
00:30:22.340 It's different.
00:30:23.100 She's fun.
00:30:24.480 Positive vibes.
00:30:26.240 All right, let's go to the next one.
00:30:29.320 The next one says, okay, when we're down to our two final alternatives, this is the one
00:30:35.240 that is titled the criterion-dependent choice model.
00:30:38.460 So when I have used the process, like the one on the previous page, narrowed it down from
00:30:45.280 say, X number of alternatives down to the final two.
00:30:49.200 So this is why this is called a binary sequential choice model.
00:30:53.000 So I've got two alternatives, A and B.
00:30:55.420 And if you can see here, you see how I've got two thresholds, 2.5 and 2.5.
00:31:01.900 What that's basically saying is I'm going to iteratively acquire one attribute information
00:31:09.840 at a time on both alternatives until I have sufficiently differentiated between the two
00:31:18.920 alternatives to stop and make a choice.
00:31:21.720 If I hit the upper stopping threshold, that's, by the way, that's, that's exactly what my doctoral
00:31:28.820 dissertation is on.
00:31:30.100 My doctoral dissertation is on sequential stopping decisions, which is exactly what you're seeing
00:31:35.800 in this curve, okay?
00:31:37.440 So if I hit the threshold of B, I stop and choose B.
00:31:45.080 If I hit the threshold of A, I stop and choose A.
00:31:49.040 So if you notice here, without going into all the details, the first piece of information
00:31:53.480 makes it that B is ahead.
00:31:56.040 But then the second piece of information, A is slightly more ahead.
00:32:01.360 Third piece of information, A is a bit more ahead.
00:32:05.020 Fourth piece of information, it crosses the threshold and therefore I stop.
00:32:10.360 It only took me four pieces of attribute information to stop and make a decision in favor of alternative
00:32:16.860 A.
00:32:17.160 Okay, so you could exactly apply this model by listing all of your attributes that are
00:32:23.560 important to you, every single one.
00:32:25.420 If, if let's say being a woman is one, well, you could put that, right?
00:32:28.900 So that would be a binary variable.
00:32:31.460 She's a woman, so she gets a score of one.
00:32:33.900 He's not a woman, he gets a score of zero.
00:32:35.800 Okay, you could put the scores on everything: immigration policy, fiscal policy, tax
00:32:41.980 policy, criminal justice policy, freedom of speech, the Second Amendment, on and
00:32:48.960 on and on.
00:32:50.100 And you score each of the two candidates on these attributes.
00:32:57.160 You give weights to each of the attributes.
00:32:59.480 So let's say for me, the most important attribute is immigration.
00:33:02.760 So I might give it the highest weight and so on.
00:33:06.260 And then you can literally go through that exercise.
00:33:09.040 Well, in my case, if I were to say, okay, my top five attributes are, you know, immigration
00:33:14.760 policy, economic policy, taxes, first amendment, you know, whatever, five of them.
00:33:23.840 Well, all five of them, Donald Trump would score higher and probably it would, it would make
00:33:31.200 me hit the stopping threshold rather, you know, quickly.
00:33:35.400 And therefore, I would choose Donald Trump.
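As a rough sketch of that criterion-dependent, sequential process applied to two finalists: acquire one attribute at a time, accumulate the weighted difference between the two candidates, and stop the moment the running total crosses either stopping threshold (the plus or minus 2.5 in the figure). Every attribute label, weight, and score below is an illustrative assumption chosen to show the mechanics, not data from the episode; swap in your own and the same loop tells you when you have acquired enough information to stop.

```python
# Criterion-dependent sequential choice between two finalists: examine one
# attribute at a time, add the weighted difference (positive favors A,
# negative favors B), and stop as soon as the cumulative evidence crosses a
# stopping threshold. Thresholds mirror the +/-2.5 in the figure; the
# attributes, weights, and 0-7 scores are purely illustrative assumptions.

THRESHOLD = 2.5

# (attribute, importance weight, score for candidate A, score for candidate B)
evidence = [
    ("is_a_woman",       0.10, 0, 1),   # binary attribute, as in the example
    ("immigration",      0.30, 7, 2),
    ("economy",          0.25, 6, 3),
    ("first_amendment",  0.20, 7, 2),
    ("second_amendment", 0.15, 7, 3),
]

def sequential_choice(evidence, threshold):
    """Stop at the first attribute whose cumulative evidence crosses a threshold."""
    total = 0.0
    for pieces, (attr, weight, score_a, score_b) in enumerate(evidence, start=1):
        total += weight * (score_a - score_b)
        if total >= threshold:
            return "A", pieces
        if total <= -threshold:
            return "B", pieces
    return ("A" if total > 0 else "B"), len(evidence)  # never crossed: pick the leader

print(sequential_choice(evidence, THRESHOLD))  # ('A', 4): B edges ahead on the first
                                               # piece, A crosses on the fourth
```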
00:33:37.440 Again, my point here is not to say, you know, vote for Donald Trump.
00:33:42.080 My point is, engage your cognition.
00:33:46.020 In other words, use the full panoply of information that might be relevant to you in making an informed
00:33:56.080 decision.
00:33:56.640 Now, this doesn't mean that affective-based processing is bad, right?
00:34:01.120 Because as I've said, for example, in the parasitic mind, chapter two, where I talk about
00:34:05.560 thinking versus feeling, there's, we are unequivocally both thinking and feeling animals.
00:34:12.800 The challenge, as I explained in the parasitic mind, is to know when to activate which system,
00:34:18.800 right?
00:34:19.060 There are many cases where it makes perfect sense for your autonomic emotional system to
00:34:25.000 be activated.
00:34:26.440 I'm going down a dark alley to take a shortcut home.
00:34:29.140 I see three suspicious-looking young men loitering.
00:34:31.880 I get a fear-based response.
00:34:33.880 My blood pressure goes up.
00:34:35.240 My heart rate increases.
00:34:37.240 I start breathing in a more shallow manner.
00:34:40.480 That's an autonomic emotional response that is perfectly evolutionary-based and adaptive.
00:34:45.480 On the other hand, using my emotional system to do well on a calculus exam is probably not
00:34:50.940 a good idea.
00:34:51.860 I'm just going to panic and not do well.
00:34:54.160 So therefore, when it comes to choosing the president that's going to lead the free world,
00:35:02.000 it really does not make sense for you to use the emotional-based system, the affective
00:35:08.740 system.
00:35:09.060 So if you use the cognitive system and you arrive rationally at the decision that Kamala
00:35:18.820 is the best one, given how you score the particular attributes, how you weigh them, and so on,
00:35:25.660 hey, more power to you.
00:35:27.020 You made a rational choice.
00:35:29.220 But please don't be voting on completely irrelevant things because she's got positive vibes, because
00:35:36.520 you know, she's fun, because she smiles a lot, because she cackles like a degenerate lobotomized
00:35:43.840 idiot, right?
00:35:45.600 By the way, 1981, I believe.
00:35:49.360 You ready for this?
00:35:50.200 I don't think I've said this publicly.
00:35:53.980 That lobotomized fool went to Westmount High School in Montreal, not far from where I'm currently
00:36:02.740 sitting and doing this X-Spaces.
00:36:05.480 I used to go to Westhill High School.
00:36:09.260 Westhill High School in 19, I think it was 1981, played Westmount.
00:36:17.660 Kamala Harris went to Westmount.
00:36:21.040 Kamala Harris is my age.
00:36:24.780 I was the big soccer star, scoring more goals than the amount of cackling she does in a given
00:36:32.320 day, okay?
00:36:33.720 We were playing on a muddy day, very heavy pitch at Westmount High School.
00:36:41.300 Score was tied 3-3.
00:36:44.420 Penalty kick for us.
00:36:46.620 And it's me who takes the penalty kicks.
00:36:48.340 And my team was already celebrating because the likelihood of me missing a penalty shot
00:36:54.160 was about as high as Occasional Cortex AOC having a brain cell, meaning there was almost zero chance
00:37:05.020 of me not scoring.
00:37:07.220 But the pitch was heavy.
00:37:09.080 And guess what?
00:37:10.100 The impossible happened.
00:37:11.700 I went up for the penalty kick and I slipped and it ended up being like a complete slow pass to the goalie.
00:37:21.220 Everybody went quiet.
00:37:22.820 I am going to blame my missed penalty kick of 1981 prophetically on that cackler having been in the stands watching me.
00:37:38.020 Her negative aura made me miss that penalty shot.
00:37:41.580 So for no other reason than the fact that Dr. Goodlooks missed, I think this is the only penalty shot I might have missed in my entire career and I was always the penalty taker.
00:37:52.220 I'm going to lay this squarely at the feet of the cackler.
00:37:58.980 So there you have it, folks.
00:38:00.520 In conclusion, as you're watching tonight's debate, I don't know if they'll get into any serious substance
00:38:08.100 or whether it's going to be a debauchery of stupidity, but make sure that you engage your cognitive system.
00:38:16.040 I'm glad that it looks like the mic was on the entire time.
00:38:19.700 Thank you so much for coming in such an impromptu manner.
00:38:25.440 I'm going to now go and do a bit of biking because I'm a bit behind on my steps.
00:38:30.880 I'm at about 12,000.
00:38:32.340 I need to get to at least 15,000.
00:38:34.440 If you're going to watch the debate, have a good time.
00:38:36.580 And again, to all my American friends, please vote carefully.
00:38:40.520 Don't turn it into the Justin Trudeau show to the south of the border.
00:38:46.020 Take care, everybody.
00:38:46.920 Thank you so much for your attention.
00:38:48.460 I'll talk to you soon.
00:38:49.140 Cheers.