The Saad Truth with Dr. Saad - April 03, 2024


Nobel Laureate Dr. Daniel Kahneman - Pioneer of Psychology of Decision Making (The Saad Truth with Dr. Saad_655)


Episode Stats

Length: 48 minutes
Words per Minute: 150.29
Word Count: 7,333
Sentence Count: 266
Misogynist Sentences: 7
Hate Speech Sentences: 8


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode, I discuss the passing of Nobel Prize winner Daniel Kahneman and how his work shaped my own thinking about the psychology of decision making and about happiness. I also share a personal story about how I first met Daniel and how he shaped my thinking about money.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 So today we're going to be talking about Daniel Kahneman, both some of my personal connections to him and his work,
00:00:07.940 but more generally about his work in Psychology of Decision Making, which is very much the area of my doctoral work.
00:00:18.860 Okay, so welcome to everybody. Thank you for showing up.
00:00:22.440 I did an X Spaces yesterday, an impromptu one. It was titled Things That Are Pissing Me Off.
00:00:31.220 And people said, please have this as a regular series. I just kind of riffed off the top of my head on things that are getting me angry.
00:00:39.060 I talked about four different things. It should be pretty easy to find on my feed.
00:00:47.040 And today I decided to do this one because, as I started saying a minute or two ago, as I was going through my Twitter feed,
00:00:57.840 I noticed Shai Davidai, the Columbia University professor in the business school who I recently had on my show,
00:01:06.880 who's been having all sorts of problems with anti-Semitism at Columbia.
00:01:10.040 I first read it on his Twitter feed that Daniel Kahneman had passed away.
00:01:17.920 So what I thought I would do is just, as I said, talk about truly what a gigantic psychologist Daniel Kahneman is.
00:01:29.260 It's really, if you don't know anything about him, that's okay.
00:01:32.540 But even if you're not an academic or if you're not a psychologist, it's worth knowing his work.
00:01:41.620 Some of you may have gotten to know his work in his 2011 book, Thinking, Fast and Slow,
00:01:48.440 which refers to System 1 and System 2 processing: fast, automatic processing
00:01:55.140 versus more deliberative, cognitively effortful processing.
00:02:01.020 And in that book, he kind of went over many of the research streams that he had developed,
00:02:09.320 largely with Amos Tversky, who was his colleague.
00:02:12.680 They're both Israeli originally, both Amos Tversky and Daniel Kahneman.
00:02:16.760 And regrettably, Amos Tversky passed away in 1996
00:02:23.840 of an aggressive melanoma, which I think had spread to his kidney.
00:02:31.500 And so, you know, he wasn't around to see Daniel Kahneman win the 2002 Nobel Prize in Economics.
00:02:42.760 But in any case, what I wanted to do first, and then I'll get into, I'll drill down into what the research of Daniel Kahneman is
00:02:51.300 and how it shaped my own thinking and psychology and decision-making and so on.
00:02:56.920 I want to read you a passage, a section in my latest book, The Saad Truth About Happiness.
00:03:04.480 This is the chapter where I'm talking about all sorts of correlates to happiness.
00:03:08.620 How does personality affect happiness? How does political orientation affect happiness?
00:03:12.980 How does, you know, religiosity affect happiness? How does culture affect happiness and so on?
00:03:18.320 And so at one point, I'm talking about the link between money and happiness.
00:03:24.060 And actually, it's a question that I recently posed, for those of you who listened to my chat with Elon Musk.
00:03:32.120 I actually, I mean, obviously, I'm talking to the richest man in the world.
00:03:37.040 It seemed uniquely apropos to ask him what did he think about whether money leads to happiness.
00:03:45.080 The research, the classic study on that issue had found that once your basic needs are met,
00:03:52.560 at the time, I think it was $75,000, but this probably needs to be updated a bit given inflation.
00:04:01.420 But once your needs are met, you don't have to worry about, you know, putting a roof over your head,
00:04:08.300 having food for your kids.
00:04:11.800 The utility that you reap from having an extra million or whatever doesn't really add much to happiness.
00:04:17.400 But anyways, so in this section where I'm talking about does money lead to happiness,
00:04:21.940 I'm talking about a personal story regarding a Nobel Prize winner versus, you know, money.
00:04:29.760 So let me just read it for you because the name Daniel Kahneman comes up.
00:04:34.920 And so I'm just going to read to you.
00:04:35.940 It's about a page long, a bit more than a page.
00:04:39.540 It's on pages 20 and 21 of, as I said, my happiness book.
00:04:43.980 So the title of the section is A Nobel Prize or Money.
00:04:48.460 So I'll first read you that section, maybe comment a bit more about that issue,
00:04:52.760 and then I'll drill into the Daniel Kahneman stuff.
00:04:56.560 So here we go.
00:04:57.140 In my academic career, I have had the honor of meeting and interacting with several Nobel Prize winners
00:05:03.140 or eventual winners.
00:05:04.820 My first such experience was as a first-year doctoral student at Cornell University
00:05:09.260 when I took Richard Thaler's Behavioral Decision Theory course.
00:05:13.000 Thaler went on to win the Nobel Memorial Prize in Economic Sciences in 2017.
00:05:20.640 In 1992, as a young doctoral student, if memory serves me right, I met Professor Daniel Kahneman
and/or perhaps his longtime collaborator Amos Tversky, one of the great thinkers on the topic
00:05:32.680 of decision-making.
00:05:34.180 In 2002, as a visiting professor at the University of California at Irvine, I predicted to my MBA
00:05:40.340 class that Kahneman would win the Nobel Memorial Prize in Economic Sciences.
00:05:45.060 Less than 24 hours later, I learned that he had.
00:05:49.320 I also had brief communications with the other 2002 Nobel Laureate in Economics, Vernon L. Smith,
00:05:55.400 as well as with Paul Greengaard, a 2000 Nobel Prize winner in Physiology or Medicine,
00:06:00.800 and with Kip Thorne, winner of the Nobel Prize in Physics in 2017.
00:06:06.720 While each of these encounters was memorable, none was quite as awe-inspiring as when Herb Simon,
00:06:13.260 winner of the Nobel Memorial Prize in Economic Sciences in 1978, visited Cornell University in 1993.
00:06:21.500 My doctoral supervisor knew him well and hence had met him for lunch.
00:06:28.320 I was miffed at the time that I had not been invited to join them, but my disappointment was
00:06:33.380 attenuated when my supervisor sent me a memo, which I still have somewhere, telling me how
00:06:38.260 Simon had praised some of my work.
00:06:41.140 I shared these brushes with Nobel Prize winners to contrast how I define wealth, the richness
00:06:47.560 of one's life experience, with how my more materialistically inclined family members define
00:06:54.460 wealth.
00:06:55.360 During a trip to Rio de Janeiro, I had shared my excitement at meeting the great Herb Simon,
00:07:00.980 one of the truly great polymaths of the 20th century.
00:07:04.640 Rather than sharing in my excitement, my relative smugly stated, quote,
00:07:10.040 Who the hell is this guy?
00:07:11.480 I can probably buy him 500 times over, close quote,
00:07:15.900 To which I retorted, quote,
00:07:19.400 Perhaps, but while 500 people will wait in line to hear him speak, no one cares what you
00:07:25.420 have to say, close quote.
00:07:27.080 So there you have it.
00:07:29.020 Two ways to accumulate wealth, through amassing brilliant moments or amassing piles of dollars.
00:07:35.540 Both have their points, but the former can enrich your soul.
00:07:39.140 The latter, if it is all you value, can rot it.
00:07:43.120 And so the reason I mentioned that story was because, you know, here was a family member
00:07:48.860 who was so materialistically inclined that he wasn't impressed with the fact that Herb Simon
00:07:55.420 was this great thinker, one of the great polymaths of the 20th century.
00:07:59.540 By the way, Herb Simon won the Nobel Prize in 1978 for his work on bounded rationality.
00:08:06.740 So now I'm going to get into the technical weeds.
00:08:08.840 Okay, so bear with me.
00:08:11.600 This will give you a good sense of what you could expect if you were taking a course with me at university.
00:08:19.720 So Herb Simon won it because he argued that human beings, as decision makers, are boundedly rational.
00:08:27.940 So the concept of bounded rationality means that contrary to what classical economists think,
00:08:34.520 which is that if we are going to maximize utility and arrive at an optimal choice,
00:08:41.280 we should, you know, process all of the relevant and available information prior to making a choice,
00:08:49.500 whether we're choosing between political candidates to vote for or choosing between cars to purchase or houses to buy.
00:08:56.160 We're going to look at all of the available information and go through a very deeply cognitively effortful process
00:09:03.660 because that's the only way that we can make sure that we're going to maximize our utility.
00:09:08.360 And what Herb Simon said is, well, no, we're bounded by the fact that it is cognitively costly to acquire information.
00:09:19.820 There are, you know, search costs. There is tedium. There are, for all sorts of reasons, time pressures.
00:09:26.340 So yes, we are rational in that, you know, we apply certain decision rules in arriving at, hopefully, the best possible option.
00:09:36.440 But we are bounded by all these different things.
00:09:38.780 So that was one of the things I talked about.
00:09:40.340 I'm somewhat simplifying, but that's the general idea.
00:09:42.520 Now, to my point about classical economics, so in classical economics, you have what's called homo economicus.
00:09:52.440 Homo economicus is the idea that human beings, to the extent that they are rational,
00:10:01.220 are rational in terms of abiding by certain axioms of rational choice.
00:10:08.300 Now, what does that mean?
00:10:09.080 That sounds like a mouthful.
00:10:10.140 So let me give you an example.
00:10:12.200 By the way, this is called normative decision making, because normative decision making is the idea that
00:10:17.940 if we're to presume that consumers, for example, are rational beings,
00:10:24.840 they ought to adhere to certain norms of rationality.
00:10:29.920 Hence, that's what we mean by normative decision making.
00:10:32.860 So what is an example of an axiom of rational choice?
00:10:37.240 And then I will come to the outlandishly brilliant work of Kahneman and Tversky.
00:10:43.420 So take, for example, the transitivity axiom.
00:10:46.720 The transitivity axiom from mathematics, in this case as applied to choice theory, says that if I prefer car A to car B,
00:10:55.260 and I prefer car B to car C, it has to be that I prefer car A to car C.
00:11:00.940 So if A is greater than B and B is greater than C, then it has to be that A is greater than C.
00:11:08.220 So if you violate that transitivity axiom, then you are being irrational in a homo economicus sense.
00:11:18.060 So economists would say, no rational decision maker could ever do that.
00:11:23.280 Well, guess what?
00:11:24.460 In a paper, I think it was in the late 60s, Amos Tversky, the eventual collaborator of Daniel Kahneman,
00:11:32.500 showed that no, people do make intransitive choices.
00:11:36.880 And he demonstrated it with a very clever set of experiments.
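As an aside, the transitivity check described above is easy to express in code. A minimal sketch in Python, with invented pairwise choices rather than Tversky's actual stimuli:

```python
from itertools import permutations

# Invented pairwise choices for illustration: (x, y) means the decision
# maker chose option x over option y.
chosen_over = {("A", "B"), ("B", "C"), ("C", "A")}
options = ("A", "B", "C")

def intransitive_triples(options, chosen_over):
    """Return every triple (a, b, c) where a was chosen over b and
    b over c, yet c was chosen over a: a transitivity violation."""
    return [
        (a, b, c)
        for a, b, c in permutations(options, 3)
        if (a, b) in chosen_over
        and (b, c) in chosen_over
        and (c, a) in chosen_over
    ]

# Prints the three rotations of the cycle A over B over C over A,
# showing that these choices violate the transitivity axiom.
print(intransitive_triples(options, chosen_over))
```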
00:11:40.220 By the way, my doctoral supervisor, who obtained his Ph.D. in cognitive and mathematical psychology
00:11:49.080 at the University of Michigan, which is one of the top psychology departments in the world,
00:11:54.640 that's where Amos Tversky also did his Ph.D.
00:11:58.020 And Daniel Kahneman had gone there too, I think for a postdoc or some other reason.
00:12:02.920 Well, the first paper that my doctoral supervisor, Jay Russo, ever published is with Amos Tversky.
00:12:11.680 Again, Amos Tversky being the longtime collaborator of Daniel Kahneman, who won the Nobel Prize.
00:12:16.340 So there is that link.
00:12:18.340 And as I mentioned in the passage that I just read to you, Richard Thaler, who was my professor at Cornell,
00:12:23.500 ended up also winning the Nobel Prize in 2017 for his work in behavioral finance and behavioral economics.
00:12:30.960 Okay, so I just described one axiom of rational choice, which is transitivity.
00:12:39.640 Let me describe another one.
00:12:41.000 This is called the axiom of description invariance.
00:12:45.840 Again, it's a mouthful, but you'll understand it once I give you specific examples.
00:12:50.140 If I tell you that a hamburger is 90% fat-free, or I tell you that a hamburger is 10% fat,
00:12:57.200 those two statements, so the fancy term is they are isomorphically equivalent.
00:13:03.480 Isomorphically equivalent means they are literally the exact same statement, just framed differently.
00:13:09.220 So it's not a matter of being optimistic or pessimistic, of seeing the cup as half full or half empty.
00:13:17.780 Saying that a burger is 90% fat-free is identical to saying that a burger is 10% fat.
00:13:23.720 Okay, now, remember that for a second.
00:13:26.420 Or if I tell you that three out of five dentists recommend this toothpaste,
00:13:32.600 it is isomorphically equivalent to telling you that two out of five dentists did not recommend it, right?
00:13:39.900 Again, these are identical statements, just framed differently.
00:13:44.020 Now, from a rational choice perspective, from a homo economicus perspective,
00:13:50.140 it has to be that a consumer, whether I have him try the burger described as 90% fat-free
00:13:59.540 or described as 10% fat, should arrive at the same final evaluation,
00:14:04.300 because it's the exact same burger described in two equivalent ways.
00:14:09.980 And yet, what Kahneman and Tversky showed, and this became known as the framing effect,
00:14:16.420 is that the way you frame something, even though the two frames are logically identical,
00:14:22.860 can cause people to have preference reversals.
00:14:26.340 Meaning that if you frame it in one way, I choose A over B,
00:14:30.540 but if you frame it the other way, I choose B over A,
00:14:33.960 and hence I'm being axiomatically irrational.
00:14:37.200 Okay, and I'll come back to the framing effect in a second when I talk about a study that I did
00:14:42.740 where I looked at some evolutionary explanations for the framing effect.
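For concreteness, a minimal sketch of how tightly the two isomorphically equivalent frames are linked: both descriptions are generated from the same underlying number, so they carry identical information. The wording is invented for illustration:

```python
def frames(positive_share, positive_label, negative_label):
    """Build the two logically equivalent descriptions of one fact."""
    negative_share = round(1 - positive_share, 10)  # avoid float dust
    return (
        f"{positive_share:.0%} {positive_label}",
        f"{negative_share:.0%} {negative_label}",
    )

# Same burger, same fact, two frames:
print(frames(0.90, "fat-free", "fat"))
# -> ('90% fat-free', '10% fat')
```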
00:14:49.180 So, bear with me.
00:14:50.660 Let me describe a third axiom of rational choice.
00:14:56.040 This is called procedural invariance.
00:14:58.820 So, let's suppose I'm going to ask you to choose between two options, A or B.
00:15:06.680 Okay?
00:15:07.440 Now, I'm either going to ask you to choose between A or B,
00:15:11.880 and I present to you the two choices sequentially or simultaneously.
00:15:16.980 So, let me explain what I mean by that.
00:15:18.520 Let's suppose it's two cars, and I say, let's say your name is John.
00:15:22.740 Hey, John, here's car A and car B.
00:15:24.940 I'm showing you both of them.
00:15:26.560 Tell me which one you prefer.
00:15:27.880 And you say, okay, let me look at both cars.
00:15:30.380 I prefer car A.
00:15:32.220 So, that's called a simultaneous presentation.
00:15:35.860 I'm simultaneously presenting you with both choices, A and B.
00:15:39.560 Now, if I were to show you car A and ask you on a scale of 0 to 100,
00:15:45.680 0 is the worst possible, 100 is the best possible,
00:15:49.600 please give me a score of what you think of A.
00:15:51.740 So, I show you A, and you give it a score of 80.
00:15:55.620 And then I show you B independently, right?
00:15:58.300 It wasn't simultaneous.
00:15:59.660 It was sequential.
00:16:01.020 And now B gets a 90.
00:16:03.200 Well, now you've engaged in irrational behavior.
00:16:05.680 How could it be that when I showed you both cars together, you chose A,
00:16:10.340 but when I showed you first car A, and then separately car B, you chose B.
00:16:16.620 Therefore, you're engaging in irrational choice.
00:16:20.120 Well, what Kahneman and Tversky did is demonstrate that all of these axioms of rational choice
00:16:28.020 that classical economists held up as exactly how we ought to behave normatively are, as a description of actual behavior, complete nonsense.
00:16:35.980 We don't do that.
00:16:37.800 We are not rational in the homo economicus sense of the term.
00:16:43.500 Now, I won't go into all of their incredible research,
00:16:49.280 but I'll just mention one, prospect theory, because it was specifically cited when Kahneman won the Nobel Prize.
00:16:55.280 I won't go into all the technical details,
00:17:00.160 but the thing that you need to remember is this: if I tell you, for example,
00:17:03.020 that you could take a bet that causes you to lose $100,
00:17:08.240 or you could take a bet that causes you to win $100,
00:17:13.120 the pain of losing $100 and the pleasure of winning $100 are not symmetric.
00:17:20.480 In other words, the motivation to avoid a loss is much stronger than the motivation
00:17:28.620 to secure an equivalent win.
00:17:33.940 And therefore, that became known as loss aversion.
00:17:36.440 And the utility function that captures that loss aversion is part of prospect theory.
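For the technically inclined, the asymmetry just described is easy to sketch. The functional form and parameter values below are the standard ones from Tversky and Kahneman's 1992 cumulative prospect theory estimates, not figures quoted in the episode:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect theory value function: concave for gains, convex and
    steeper for losses. lam > 1 is the loss-aversion coefficient;
    parameters are Tversky & Kahneman's (1992) median estimates."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)

# Losing $100 hurts roughly twice as much as winning $100 feels good:
print(prospect_value(100))   # ~57.5
print(prospect_value(-100))  # ~-129.5
```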
00:17:42.040 So, it's actually incredibly powerful research because,
00:17:47.420 so think about the framing effect, which I mentioned a few minutes ago,
00:17:51.040 the one where, you know, 90% fat-free burger versus 10% fat.
00:17:55.940 Now, you might say, okay, well, fine.
00:17:57.220 People behave irrationally, but who cares?
00:17:58.800 They're just, you know, they might make an irrational choice
00:18:01.780 when they're choosing between burgers.
00:18:04.820 Okay, fine.
00:18:05.440 That's true.
00:18:05.960 That's great.
00:18:06.440 But, you know, it's not a big deal.
00:18:08.100 No, it is a big deal.
00:18:09.760 Because let's suppose, God forbid, that you go see your surgeon, your oncologist,
00:18:16.220 and the oncologist has to choose between you having radiation therapy or a radical surgery.
00:18:25.760 It's one or the other.
00:18:27.360 Now, imagine if I were to tell you that if I frame the odds of success in one way,
00:18:35.380 the surgeon will say, oh, we've got to do radiation therapy.
00:18:38.140 But if I frame it in exactly the equivalent, the logically equivalent other way,
00:18:44.480 I can get him to change his opinion.
00:18:47.700 So that he said to you, when I framed it in one way, he said we should do radiation therapy.
00:18:53.880 And when I framed it the other equivalent way, he says, oh, no, no, we should do surgery.
00:18:58.080 Well, that's a problem, because that's showing you that even within the context of expert domains,
00:19:05.620 the surgeon's mind has the exact same architecture as the mind of the rest of us,
00:19:11.760 and he is just as prone, sorry, I should say he or she or they are just as prone to commit that bias.
00:19:22.700 Another way to think about the work of Kahneman and Tversky is the following.
00:19:27.160 So many of you have probably seen, say, in an intro to psychology course,
00:19:32.260 when you're studying perception, how your perceptual system can be tricked.
00:19:40.540 For example, some of you might have heard of what I think is called the Ponzo illusion.
00:19:43.660 If I show you two horizontal lines that are exactly the same length,
00:19:51.620 but they are surrounded by lines that converge, falling from left to right and from right to left,
00:20:00.780 the context around the two equal-length lines
00:20:06.580 can cause you to think that line B is longer than line A,
00:20:12.500 even though the two lines are exactly the same length.
00:20:17.260 I don't know if I've explained this well.
00:20:19.100 I wish I could show it to you visually,
00:20:21.320 and hopefully one day soon we'll have the full visual means to do so on X Spaces.
00:20:27.540 But in the same way that our visual system can succumb to perceptual illusions,
00:20:35.260 our mind can succumb to cognitive distortions.
00:20:40.380 And so what Kahneman and Tversky did is they systematically,
00:20:47.520 through many, many decades of research,
00:20:51.960 showed that completely contrary to the economic, homo economicus model,
00:20:58.640 we often succumb to these biases.
00:21:02.100 And they went through a whole bunch.
00:21:03.260 There's something called the conjunction fallacy, the base rate fallacy.
00:21:08.380 There's the overconfidence bias.
00:21:09.520 There's a million of these, okay?
00:21:10.900 There's the availability heuristic.
00:21:14.760 There's actually, you can go on Google and just enter cognitive biases,
00:21:20.260 and I think that's probably the right search term,
00:21:22.860 and it will list for you all of the cognitive biases that have been documented,
00:21:28.540 not just by Kahneman and Tversky, but by that whole tradition of research.
00:21:33.500 And so I very much was trained within that tradition.
00:21:37.880 So that's why, you know, my original work was in psychology of decision making.
00:21:43.280 As a matter of fact, my doctoral dissertation was in trying to address one specific problem,
00:21:50.680 which is, when is it that a decision maker has acquired enough information
00:21:57.460 to stop acquiring additional information and commit to a choice?
00:22:01.320 So let's say I'm choosing between cars A and B.
00:22:03.720 I could look up to 50 attributes on the two cars,
00:22:07.020 but I won't do that.
00:22:08.160 After maybe seven pieces of information that I've collected,
00:22:12.280 I have now sufficiently differentiated the two alternatives
00:22:15.720 that I don't think I need to look any further.
00:22:18.480 I'll stop, and I'm ready to buy the Toyota.
00:22:22.320 And so I looked at the cognitive stopping strategies
00:22:25.600 that are used in information search,
00:22:28.580 and that's very much part of this whole discussion
00:22:31.560 because classical economists would say,
00:22:33.360 no, no, no, if there are 50 attributes that are relevant for this decision,
00:22:39.080 you have to look at all of the relevant and available information.
00:22:42.580 Otherwise, you might be making a suboptimal choice
00:22:46.300 that doesn't maximize your utility.
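To make the idea of a cognitive stopping strategy concrete, here is a minimal sketch of one simple rule from that general family: acquire attributes one at a time and stop once the cumulative evidence favoring one option crosses a threshold. This is an illustrative rule with invented numbers, not the specific strategies examined in the dissertation:

```python
# Invented attribute-by-attribute comparison of two cars: positive
# scores favor car A, negative scores favor car B.
attribute_diffs = [0.5, -0.2, 0.8, 0.4, 0.6, 0.3, 0.7]

def choose_with_stopping(diffs, threshold=2.0):
    """Stop searching once the running evidence total crosses the
    threshold, rather than examining all available attributes."""
    evidence = 0.0
    for n, d in enumerate(diffs, start=1):
        evidence += d
        if abs(evidence) >= threshold:
            return ("A" if evidence > 0 else "B"), n
    return ("A" if evidence >= 0 else "B"), len(diffs)

choice, n_seen = choose_with_stopping(attribute_diffs)
print(f"chose car {choice} after examining {n_seen} attributes")
# -> chose car A after examining 5 attributes (search stops early)
```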
00:22:48.300 So I was very much trained in that tradition,
00:22:50.080 both because of my doctoral supervisor, Jay Russo,
00:22:52.960 and because of my professor, Richard Thaler,
00:22:55.820 who, as I said, eventually ended up winning the Nobel Prize
00:22:58.540 for work closely associated with that of Kahneman and Tversky.
00:23:01.340 I regret that I never had the opportunity
00:23:03.500 to publish any papers with Kahneman and Tversky.
00:23:07.300 I do remember my first semester at Cornell in my PhD,
00:23:12.060 I had taken a course by Professor Thaler,
00:23:14.560 and I had gone to his office,
00:23:16.300 and he had pitched some ideas for us to work on together.
00:23:20.040 And to this day, I regret that we never ended up
00:23:22.500 doing some of the work that we had discussed.
00:23:25.380 It would have been great to work with him,
00:23:27.940 a truly brilliant man.
00:23:30.580 There was also at Cornell other very accomplished decision theorists.
00:23:37.260 For example, Thomas Gilovich was also one of my professors in psychology.
00:23:41.980 So you may have heard of him,
00:23:44.740 if you've read my latest book, my happiness book,
00:23:48.340 I have a chapter on regret.
00:23:50.860 And what Gilovich did is he pioneered some of the work
00:23:55.460 on the psychology of regret,
00:23:57.940 specifically the idea that there are two sources of regret that we face.
00:24:05.320 It could be either regret due to action or regret due to inaction.
00:24:10.520 So regret due to action would be,
00:24:12.740 you know, I regret that I cheated on my wife,
00:24:15.140 and that led to my divorce.
00:24:17.040 So because of an action that I committed,
00:24:19.280 it led to a bad consequence,
00:24:20.880 and I regret that action.
00:24:22.800 Regret due to inaction would be,
00:24:25.160 you know, I regret that I never pursued my interest in art
00:24:29.580 and art history and architecture,
00:24:31.860 and instead I became, you know,
00:24:34.660 a pediatrician because my dad was a pediatrician,
00:24:38.600 and that's what was expected of me.
00:24:39.800 So I regret that I never took that other road.
00:24:42.280 Now, it turns out, folks, and I don't know if you would have thought so, but over the long haul, the regrets that loom largest in people's minds are those of inaction, of the road not taken.
00:25:02.840 So in the context of my book on happiness, of which I so hope that you get a copy. I mean, it's done reasonably well, but not nearly as well as was expected of it, in part because the publisher was bought out and was going through complete chaos just as my book came out. And so, you know, it basically fell through the cracks.
00:25:22.540 I'm hoping now that the paperback version is going to be released with the publisher who bought my old publisher, so hopefully it will find its appropriate attention. But I really hope that you purchase a copy. It's a really fun book.
00:25:34.640 It's a mix of, you know, ancient wisdom. Here come the ancient Greeks: here comes Epictetus, here comes Seneca, here comes Aristotle, here comes Aurelius and so on, because of course they wrote a lot about the good life and happiness. And it is backed up by contemporary science: positive psychology, happiness studies, hedonic psychology, neuroscience. And then it's peppered with all of my personal stories, just like the one I just read to you about, you know, the Nobel Prize or money. And so I hope that you get it.
00:26:08.120 But in any case, in the book, in the chapter on regret, I talk about how one of the ways that you can forestall regret later in your life is to live an authentic life.
00:26:18.180 Now, authentic doesn't just mean authentic in the sense that you're a real person: you know, I am an authentic guy, I speak my mind, I'm not two-faced. I mean, that's true, you should be authentic. But I mean it in a broader, grander, existential sense of authenticity: be true to yourself.
00:26:35.540 That's why the old Delphic maxim, know thyself, is so powerful, right? It's so powerful because it's so simple and yet so profound. Know thyself. You're going to make mistakes in life if you don't know who you are. But if you know that you're going to be happiest being an artist, then maybe you shouldn't become a pediatrician just because your dad said that's the right thing to do in the contemporary market. That's a sure way to wake up at 55 and say, I'm facing a midlife crisis.
00:27:08.060 So anyway, Daniel Kahneman is unbelievable because anyone who does any research in anything resembling decision-making ends up citing him. It could be accountants. It could be engineers. It could be physicians, right? There's a whole field of medical decision-making, and they all cite their work. It could be political decision-making. It could be consumer decision-making, which is, you know, the area that I'm more in, or behavioral decision theory, which is the area that I'm in.
00:27:37.780 So just earlier, as I was preparing, when I first announced that I was going to have an X Spaces on Danny Kahneman, I went to check his Google Scholar bibliometrics.
00:27:53.460 Now, I'm not sure if many of you understand what the metrics are, but there are several key metrics when you look at the influence of an academic. Number one, it could be just the number of papers that they've published. But the number of papers matters little if they are ultimately not cited. You could have a person who's published 10 papers, and those 10 papers have been cited 10,000 times; you could have another person who's published 50 papers, but they've only been cited 500 times. So even though the latter person has published five times more papers, the influence of his or her research is much smaller, in that it hasn't been cited by others.
00:28:33.580 So usually when you go to Google Scholar, you will check a few metrics, one of which is called the h-index, which is a way to take a snapshot of how good a researcher is. An h-index of over 30 is typically what you would expect of, say, a full professor at a good university. Well, his h-index is 158, I think. And this is not a linear measure, so it is astoundingly high.
00:29:04.020 Let me give you another one, without getting into the weeds of what the h-index is: just the total number of citations. If you're a scholar that has a few thousand citations, you're doing well. Some of the top psychologists might have 30,000, 20,000, 50,000. Daniel Kahneman's total citations in academia are 500,000 plus.
00:29:31.460 So it's basically like Lionel Messi, right? The average top goal scorer will score 200 goals in their career if they are amazing. Wow, that's amazing, he scored 200 goals in his professional career. Well, Lionel Messi is well over 800, right? So it's a level of influence that's difficult to impart to people. Really an important guy.
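For those unfamiliar with the metric, the h-index is simple to compute: it is the largest h such that the author has h papers with at least h citations each (so an h-index of 158 means 158 papers each cited at least 158 times). A minimal sketch with invented citation counts:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):  # rank-th most-cited paper
        if c >= rank:
            h = rank
        else:
            break
    return h

# Invented citation counts for illustration only:
print(h_index([10000, 5000, 120, 80, 3, 1]))  # -> 4
```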
00:30:01.460 Now, let me just mention a few things. So I was very much into Kahneman and Tversky, and, I mean, that's exactly what my doctoral dissertation was all about. But then I became slightly disillusioned. So let me explain. And please don't misunderstand: I've been saying only unbelievably positive things about Kahneman, and nothing that I'm going to say now is going to take away from the fact that I think he's one of the true behavioral science giants ever. So please don't take this as me critiquing him in any forceful way. But let me just explain why I started to slowly deviate away from some of this work.
00:30:48.820 What ended up happening with the behavioral decision making framework is what I, and I think a few others, began calling the violation of the month club. Meaning that every month some really bright psychologists would come up with yet another clever demonstration of how idiotic classical economists are. So basically the thing that was driving the research was: classical economists think that we are these hyper-rational creatures, and our job as behavioral decision theorists is to demonstrate that they are wrong.
00:31:28.160 Well, I got tired of that, because to me it seemed like spending my entire career as an anatomist proving to you that the pancreas of human beings is different from that of the unicorn. But the unicorn doesn't exist. So rather than worry about demonstrating that our pancreas doesn't work the way the unicorn's does, and here the analogy is that homo economicus, the view of decision making as espoused by classical economists, is the unicorn: it only exists in the recesses of their minds. And so, we got it. Human beings don't behave according to how classical economists tell us we should behave.
00:32:16.320 What interested me more, and hopefully now maybe you're going to see where I'm going with this if you know my work, I'm an evolutionary scientist, right? An evolutionary psychologist. I was more interested in understanding why the architecture of the human mind is the way that it is. So if we do succumb to the framing effect, what is the evolutionary reason for us to have that, if you like, cognitive capacity to be swayed in that way? So rather than simply demonstrating that we don't do that which the classical economists expect us to do, let's study why the architecture of the human mind is the way that it is. And that's why I kind of shifted away from psychology of decision making in the traditional sense to more of an evolutionary behavioral science approach.
00:33:10.860 And so now I'm going to come to: how would I study some of these cognitive biases that I've discussed from an evolutionary perspective? So hopefully this will blow your mind.
00:33:24.840 So remember when I mentioned the framing effect? You know, if I tell you that a burger is 90% fat-free or tell you that it's 10% fat, or if I tell you that three out of five dentists recommend this toothpaste, it's the same thing as telling you that two out of five don't? Well, what Kahneman and Tversky did is demonstrate that the framing effect exists, that we succumb to this cognitive distortion. Okay, great. That's wonderful. What I wanted to do is ask: is there a way that I can apply the evolutionary lens to elucidate something a bit deeper about that distortion, the framing effect?
00:34:09.680 And so with one of my former doctoral students, who himself is now a chaired professor, his name is Tripat Gill, in 2014 we published a paper in Evolution and Human Behavior, which is the top evolutionary journal in the behavioral sciences, where we studied the framing effect from an evolutionary perspective. Let me explain how we did that.
00:34:37.400 By the way, people, you're getting all this tonight for free. Imagine you're doing this at university; this course would cost you $15,000. So do the right thing: go subscribe to my exclusive content. For example, if you want to have a Q&A period with me, well, then you go to the subscription.
00:35:01.640 And by the way, I do this, I love doing this, for the mere fact of sharing ideas, and I've spent much of my career giving things away for free, but it's also nice to monetize your time and expertise. So, as I mentioned yesterday, for the price of a latte you could be supporting me. And imagine if I get 10,000, 20,000 people who are subscribing: suddenly, if I decide that it's too dangerous for me to go to the university and I want to quit and just create content all day long for thousands of people, I can do that. So anyways, think about it. I hope that you'll subscribe.
00:35:34.980 So how did I study the framing effect from an evolutionary perspective? Well, rather than talking about three out of five dentists recommending this toothpaste, or two out of five not doing so, what if we looked at an evolutionarily relevant problem or decision? So let's talk about mating.
00:35:58.500 Suppose that I tell you, and I hope that this blows your mind, I know it will blow your mind: what if I told you, I want you to evaluate these prospective mates for you to go out with, and I'm going to describe them to you in one of two ways? Maybe you see where I'm going with this. So let's take intelligence, which is an important mating attribute that both men and women care about. What if I told you that eight out of the ten people who are acquaintances of this person think that he's intelligent? Well, that's exactly like saying that two out of ten don't, right?
00:36:39.180 So for all of these mating attributes, I can either frame the description of the prospective suitor using a positive frame or a negative frame. You follow? I could either say seven out of ten think that she is very good looking, or I could say three out of ten think that she's not very good looking. Those two statements are identical; they're isomorphically equivalent.
00:37:06.760 But here is the kicker, you ready? I won't tell you the full details of the study; we had a whole bunch of manipulations, we looked at short-term mating versus long-term mating. I'm just trying to give you the big story.
00:37:18.440 Within the context of mate choice, who bears the greater cost of making a suboptimal mate choice? Well, it should be easy to answer: of course, it's women. This comes from the theory called parental investment theory, which says that the sex in any species that has to provide the greater minimal obligatory parental investment is the sex that's going to be more sexually choosy, for very obvious reasons: because if they are the ones who bear the greater costs of making a poor choice, then they have to be judicious in their mate choice.
00:37:59.360 This is why, in every culture that's ever been studied, in every religion that has any prescriptions about how men and women should behave, it's not surprising that God seems to be a lot more concerned about the sexual behavior of women than he is about that of men. Apparently, God is an evolutionary psychologist.
00:38:21.340 Okay, so we take that principle and say, well, applying the evolutionary lens that recognizes that there is a differential cost for men and women in making a poor choice in the mating market, we expect, we hypothesized, and that's exactly what we found, that when it comes to the framing effect, specifically in the mating domain, women would be much more likely to succumb to the framing effect, precisely because negatively framed information will loom much larger in their psyche.
00:38:59.840 Do you follow what I'm saying? When I tell you that 7 out of 10 think that he is intelligent, it's the same thing as telling you that 3 out of 10 don't think that he's intelligent. Well, men and women evaluate the positively framed information similarly, but women evaluate the negatively framed information much more harshly. So that cognitive distortion has a sex specificity to it because of an evolutionary reality. This is deep.
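To make the design concrete, here is a minimal sketch of the two-by-two comparison just described: evaluations of the same suitor under a positive versus a negative frame, split by sex of the rater. The numbers are invented for illustration and are not data from the Saad and Gill paper:

```python
# Invented 0-100 evaluations of the same suitor profile under the two
# logically equivalent frames; NOT data from Saad & Gill (2014).
ratings = {
    ("men", "positive"): [72, 70, 74],
    ("men", "negative"): [69, 67, 71],
    ("women", "positive"): [73, 71, 74],
    ("women", "negative"): [55, 52, 58],
}

def mean(xs):
    return sum(xs) / len(xs)

# The framing gap: how much lower the negative frame is rated relative
# to its logically equivalent positive frame.
for sex in ("men", "women"):
    gap = mean(ratings[(sex, "positive")]) - mean(ratings[(sex, "negative")])
    print(f"{sex}: framing gap = {gap:.1f} points")
# The predicted pattern is a larger gap for women in the mating domain,
# i.e. a sex-specific framing effect.
```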
00:39:30.220 Each one of you assholes owes me a thousand dollars for sharing with you this kind of depth and profundity. Now, I'm going to get someone saying, but it wasn't nice when you said the word asshole. I'm kidding, I'm just, I'm just being Gad. Okay.
00:39:47.880 So let's summarize. Yes, the framing effect occurs, but in evolutionarily important domains I can predict a sex difference in the proclivity to succumb to the framing effect because of an evolutionary catalyst. So this is where, if you like, if I can be bold enough to say so, I added to the work of Daniel Kahneman. Because Daniel Kahneman was a lot more concerned with simply highlighting these cognitive biases, and I tried to come along, in this case with my brilliant former doctoral student Tripat Gill, and say: yes, okay, there are these cognitive biases, and there is an evolutionary logic for why these biases exist in the form that they do.
00:40:41.380 Okay, so that's that. I just wanted to maybe share one final personal story that speaks to ...
00:41:18.240 ... a bunch of his doctoral students went down to Ithaca, to Cornell, and, you know, we had a wonderful time with him, to honor him and so on.
00:41:30.000 And Jay remained incredibly productive to the last minute. As a matter of fact, he held a seminar on the last day with all of us, and you would think we were back to being 26-year-old doctoral students. Because you would think, okay, he's going to give a talk about, you know, lessons learned in my long, illustrious career as a cognitive and mathematical psychologist and business school professor. No. He was in the weeds. He was telling you what reviewer two said about the paper that he had just submitted to the journal. He was just going full throttle. A real, a true scientist through and through. Unbelievable guy.
00:42:13.500 I tell
00:42:13.920 I discuss
00:42:14.700 several
00:42:15.260 stories
00:42:17.640 of my
00:42:18.020 time
00:42:18.440 at
00:42:19.020 Cornell
00:42:19.380 in the
00:42:19.820 happiness
00:42:20.280 book
00:42:20.620 Anyways, the reason I'm mentioning him right now is because of something he told me one day while we were sitting around. Jay is someone who is a very austere guy. He could be intimidating. You don't mess around with Jay Russo.
00:42:33.060 He was a fantastic mentor, but you better know your stuff. You better not mess around. You better do the work. You better be creative. You better be hardworking. He expected the world of you, and of course I thank him for that mentorship.
00:42:49.420 So one day we were sitting, and he says to me, you know, Gad, it isn't very often that I'm the dumbest guy in a room. Jay knew that he was obviously a very bright guy.
00:43:02.280 Now let me set up the story of which room he was talking about.
00:43:06.560 So Jay had served on the doctoral committee of an individual who subsequently himself became a very accomplished professor and decision theorist.
00:43:20.560 And on that committee of that student, who's now a very senior professor, there was Jay, who in most rooms would be the smartest guy, and the other committee members were Herb Simon, who won the 1978 Nobel Prize, and Amos Tversky, the gentleman who would have won the Nobel Prize with Daniel Kahneman had he not passed away in 1996.
00:43:50.600 And so Jay says to me, you know, Gad, it isn't very often that I am the dumbest guy in a room, but whenever we met in that room, you know, with the doctoral committee of that student, I was always the dumbest guy in that room.
00:44:10.940 I mean, I love the story because ultimately it demonstrates humility for him to say that, notwithstanding that it's probably true, even though obviously Jay is an incredibly bright professor.
00:44:22.480 But it always reminded me of the fact that, you know, life is truly relative, right?
00:44:28.940 So if you walk into a prison as a super tough guy and you think, I've got nothing to worry about in prison, I mean, people know what a damn tough guy I am, guess what? There are guys in prison that are going to be tougher than you.
00:44:45.100 I mean, short of you being Lionel Messi, right, where you know, by definition, that you're the greatest soccer player ever, no matter how good you are in mathematics, how accomplished you are as a professor, and how eloquent you are as an orator, there's always going to be some room where you're going to eat humble pie.
00:45:06.540 Which actually is a beautiful thing, because it causes us to always want to aspire to be better and so on, right?
00:45:13.980 I mean, Lionel Messi had won everything short of the World Cup, and he didn't stop until he eventually did win the World Cup.
00:45:22.380 So it is a beautiful thing to compare yourself to relevant others and to have the humility to say, yeah, I may be a great scholar, but guess what, there are these other folks that I'd like to emulate that are even crushing it 10 times more than I am, and I want to be, hopefully, as good as them. And that never stops, right?
00:45:40.840 I mean, I could easily stop now and say, hey, I'm probably better known as a professor than 99.9% of professors anywhere.
00:45:49.580 But I don't look at it that way. I look at my bibliometric score compared to Amos Tversky's, and I go, god damn, I gotta find a way to be more productive. What the hell, I'm not at 1% of what this guy's accomplished in terms of academic bibliometrics.
00:46:06.840 And so, yeah, that's called social comparison theory, right? Compare yourself to relevant others, and hopefully that can motivate you.
00:46:16.660 So there you have it. Today was certainly a sad day from the perspective of an unbelievable academic, Daniel Kahneman, passing away. May he rest in peace.
00:46:30.320 I know that he will certainly be immortal in the sense that his work will undoubtedly be cited for centuries to come.
00:46:38.960 You know, there are some researchers that ride a fad, right? Michel Foucault, the bullshitter postmodernist, was cited a lot when postmodernism, you know, was one of those mind viruses that was spreading, you know, what I call idea pathogens in The Parasitic Mind.
00:46:57.940 I don't expect that Michel Foucault in 200 years is going to be a blip on anybody's radar, but the work of Amos Tversky and Daniel Kahneman will, because there's no way that anyone who wants to study the psychology of decision making can ever avoid seeing their work.
00:47:17.080 And so he is immortal. He may physically be gone, but his memetic immortality is guaranteed.
00:47:24.320 So there you have it, folks. I hope that you enjoyed this, wow, almost 50-minute lecture on Danny Kahneman and the psychology of decision making.
00:47:31.080 I'm heading out to spend some time with the family. As I said, if you wish to interact with me, the best way to do so is by subscribing.
00:47:41.780 What I'll try to do, and that's what I've been doing the last few times I've done X Spaces, except yesterday I didn't do that, is once I finish the general X Spaces, then I go into a subscriber-only X Spaces, where I actually do take the questions of people, and we end up having these incredible conversations.
00:47:59.420 Because, as you might expect, the people who typically subscribe to my stuff tend to be very intellectually curious. Oftentimes they're other academics or they're graduate students, and so we really end up having these incredible organic conversations.
00:48:16.060 So thank you so much for your attention. This will stay up as a recorded X Spaces, but I will probably take it and upload it on my YouTube channel and my podcast.
00:48:30.720 And please, if you haven't done so, please consider (this is free, so I'm not pleading for any of your latte money) subscribing to my YouTube channel and/or my podcast.
00:48:43.260 Thank you so much, guys. Have a great evening, and I'll see you soon. Cheers, everybody.