The Jordan B. Peterson Podcast


390. The Prisoner's Dilemma, Tit-for-Tat and Game Theory | Robert Sapolsky


Summary

Dr. Robert Sapolsky is a primatologist, neuroendocrinologist, and author of multiple books, including the upcoming Determined: A Science of Life Without Free Will. In this episode, we discuss game theory and how it applies to human behavior, the unexpected success of the tit-for-tat negotiating principle, the role of the neurochemical dopamine in reward, reinforcement, and the anticipation of the future, and the potentially objective reality of transcendent ethical structures operating within the biological domain.

Dr. Jordan B. Peterson has created a new series that could be a lifeline for those battling depression and anxiety. With decades of experience helping patients, he offers a unique understanding of why you might be feeling this way and provides a roadmap toward healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward. If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better. Go to DailyWire+ now and start watching Dr. Jordan B. Peterson's new series on depression and anxiety. Let this be the first step toward the brighter future you deserve.


Transcript

00:00:00.940 Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.480 Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.740 We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:20.100 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.420 He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.360 If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.780 Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.460 Let this be the first step towards the brighter future you deserve.
00:00:57.420 Hello everyone watching and listening.
00:01:11.240 Today I'm speaking with primatologist, neuroendocrinology researcher, and author of multiple books, including the upcoming Determined, A Science of Life Without Free Will, Dr. Robert Sapolsky.
00:01:26.240 We discuss game theory and how it applies to human behavior, the unexpected success of the tit-for-tat negotiating principle, the role of the neurochemical dopamine in reward, reinforcement, and the anticipation of the future, and the potentially objective reality of transcendent ethical structures operating within the biological domain.
00:01:51.400 So I was reading Behave in some detail.
00:01:55.400 I've read a number of your other books. I've followed your career for a long time.
00:01:58.880 I'm very interested in primatology and in neuroscience, so that makes for interesting reading as far as I'm concerned.
00:02:05.800 The thing that really struck me in Behave are the sections on game theory, and I wanted to start talking about game theory because, first of all, the terminology is strange because game theory, I mean, you could hardly imagine something that might sound more trivial than that.
00:02:25.820 I mean, first of all, it's games, and second of all, it's theory, but there's absolutely nothing whatsoever that's even minimally trivial about game theory.
00:02:33.900 It's unbelievably important, you know, and I kind of stumbled across it sideways.
00:02:37.820 I was reading work by Jaak Panksepp, who did a lot of work with rats, and Panksepp showed that if you paired juvenile male rats repeatedly together and allowed them to play, the little rat, who had to do the inviting once dominance had been established, would stop inviting to play if the big rat didn't let him win 30% of the time in repeated bouts, say.
00:03:03.460 And I thought, oh my God, that's so cool, because what you see there is something like an emergent morality of play in rats merely as a consequence of the repeated pairing of the same individuals, you know, across an indeterminate landscape.
00:03:19.160 And that's an unbelievably compelling and stunning discovery, because it indicates something like the emergence of a spontaneous morality.
00:03:30.100 Now, you talk about game theory. Do you want to review for everybody, first of all, what game theory is, and then what the major findings of the field are?
00:03:40.200 We can talk about tit for tat and the variations, but please let everybody know what game theory is and why it's so important.
00:03:47.260 Sure. Maybe, well, just to emphasize the point you made right from the start: this is not fun and games.
00:03:53.560 But game theory was mostly the purview of war strategists and diplomats and people planning, you know, mutually assured destruction.
00:04:04.920 So this was rather serious stuff.
00:04:08.200 At some point, the biologists got a hold of it, and especially zoologists.
00:04:13.840 And the sort of rationale was, like, you look at a giraffe, and you're some cardiovascular giraffe person, and you do all these calculations about, like, if you're going to have a head that's that far above your heart, and you're going to have this body weight and blah, blah, whatever, you're going to have to have a heart with its walls that are this thick or this, like, vascular properties.
00:04:40.460 And then the scientists go and study it, and that's exactly what you see.
00:04:46.000 Isn't that amazing?
00:04:46.920 Isn't nature wonderful?
00:04:48.260 Or, like, you look at desert rats, and you do all this theoretical modeling stuff and figure out if they're going to survive in the desert, their kidneys have to retain water at this unbelievable rate.
00:05:00.580 And then people would go and study it, and that's exactly how the kidneys work.
00:05:04.520 Isn't that amazing?
00:05:05.240 And it's not so amazing, because, like, if you're going to have giraffes shaped like giraffes, the heart has to be that way.
00:05:12.360 There is an intrinsic logic to how it had to evolve.
00:05:16.340 And if you're going to be a desert rodent, there's an intrinsic logic to how your kidneys go about living in the desert.
00:05:22.840 And the whole notion of game theory, as applied to evolution, animal behavior, human behavior, et cetera, is there's an intrinsic logic.
00:05:33.380 The logic of our behavior has been as sculpted by evolutionary exigencies as the logic of our hearts and the logic of our kidneys and everything else in there.
00:05:46.380 And by the time it comes to behavior, a lot of it is built around when is the optimal time to do X, and when do you do the opposite of X?
00:05:55.760 So, you talk about, all right, so let's review that for a minute.
00:06:03.000 So, your point, as I understand it, is that there's going to be necessary constraints on the physiology of an organism.
00:06:14.960 And those constraints are going to be reflective of its environment and the peculiarities of its morphology.
00:06:21.920 And you can predict that a priori, and then when you match your predictions against observation, at least some of the times they match,
00:06:30.120 there's an analogy between that and behavior in that you can analyze the context in which behavior occurs and the physiology of the organism.
00:06:41.780 You do that in particular in Behave as you map out the nervous system from the hypothalamus upward toward the prefrontal cortex.
00:06:51.500 There's going to be an interaction between context and physiology that's necessary.
00:06:59.760 The context of behavior isn't the mere requiting of primordial and immediate needs.
00:07:08.960 The context of behavior is, in part, the reciprocal interactions that occur in a very large social space between many individuals, many of whom will interact repeatedly.
00:07:20.900 And there's something about repeated interactions that's absolutely crucial.
00:07:25.140 So, one of the things you point out, for example, is that, and this was also true of Panksepp's rat studies.
00:07:34.060 If you just put two rats together once, geez, the big rat might as well just eat the little rat because what the hell?
00:07:42.340 You know, maybe he's hungry and the little rat can be a meal.
00:07:45.560 And there are circumstances under which that occurs.
00:07:47.900 But if the rats are going to be together in a social environment, and they're also surrounded by relative rats and friend rats, then the landscape of need gratification starts to switch dramatically.
00:08:03.360 Because you don't just have the requirement of satisfying the immediate need of the single individual right now.
00:08:11.820 You have the problem of iterated needs across vast spans of time in a complex social environment.
00:08:18.240 And a wonderful jargon for it is the shadow of the future.
00:08:23.780 Right, right.
00:08:25.040 Let's talk about that, which is a wonderful, poetic way of, yeah, exactly that notion.
00:08:29.920 Yeah, well, and the future has a shape too, right?
00:08:33.460 Because the farther out you go into the future, the more unpredictable it is.
00:08:37.600 But it doesn't ever deteriorate exactly to zero predictability.
00:08:41.140 And I know there's a future discounting literature that's associated with time preference that also calculates the degree to which people regulate their behavior in the present in accordance with likely future contingencies.
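The future-discounting literature mentioned here works with a couple of standard functional forms. A minimal numerical sketch; the function shapes are the standard ones, but the parameter values (delta = 0.9, k = 0.5) are arbitrary illustrative assumptions, not figures from the conversation:

```python
# Illustrative sketch of temporal discounting: the present value of a
# future reward under the two standard functional forms. The parameter
# values (delta = 0.9, k = 0.5) are arbitrary examples.

def exponential_discount(value, delay, delta=0.9):
    """Classical form: each unit of delay multiplies value by a constant delta."""
    return value * (delta ** delay)

def hyperbolic_discount(value, delay, k=0.5):
    """Behavioral form: steep early devaluation that then flattens out."""
    return value / (1 + k * delay)

# A reward of 100 valued at increasing delays under each scheme:
for t in (0, 1, 5, 10):
    print(t, round(exponential_discount(100, t), 1),
             round(hyperbolic_discount(100, t), 1))
```

The farther out the reward, the less it is worth now, but with either form the value never quite deteriorates to zero, which matches the point about the shape of the future.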
00:08:54.880 One of the things you point out, and this is one of the ways your book is integrated, I believe, is that as you move upward in the hierarchy of the nervous system towards the more recently evolved brain areas, let's say, towards the prefrontal cortex,
00:09:12.300 the more you get the constraint of immediate behavior by future, what would you say, future contingencies, right?
00:09:22.500 And you describe that in Behave as difficult.
00:09:25.880 It's very easy to fall prey to an immediate impulse.
00:09:29.020 Anger is a good example of that, or maybe fear, right?
00:09:31.660 That grips you and forces you to act in the moment.
00:09:34.560 But you want to constrain your impulses, which would be manifestations of brain circuits that are much more evolutionarily ancient.
00:09:43.520 You want to constrain those with increased knowledge of multiple future possibilities in a complex social landscape.
00:09:51.040 And those are also somewhat specific to the circumstance.
00:09:53.880 So the prefrontal cortex also is more programmable because the relationship between the future and the present varies quite substantially with the particularities of the environment.
00:10:06.560 But the fundamental point is that in game theory is that the consequences of your immediate action have to be bounded by the future and by the social context.
00:10:17.000 So I was thinking about something here recently.
00:10:19.020 You tell me what you think about this.
00:10:20.300 Because you write a little bit about religious issues in your book, too, although not a lot, but some.
00:10:25.660 So I was thinking about this notion that you should love your enemy as yourself and that you should treat your neighbor as if he's yourself.
00:10:33.520 I mean, one of those is an extension of the other.
00:10:36.380 And I think there's actually a technical reason for that.
00:10:39.440 Tell me what you think of this logic.
00:10:41.880 So the first question might be, what is yourself, the self you're trying to protect?
00:10:48.120 And one answer to that is it's what you want right now and what would protect you right now.
00:10:53.880 But another answer is, yeah, fair enough, you know, now matters.
00:10:58.060 But there's going to be you tomorrow and you next week and you in a month and you in a year and five years.
00:11:03.040 And what that implies is that you yourself are a community that stretches across time.
00:11:10.400 And as that community, you're also going to be very varied in your manifestation.
00:11:15.380 Sometimes you're going to be like top lobster and dominant as hell.
00:11:18.620 And sometimes you're going to be sick and in the hospital.
00:11:20.960 And there's going to be a lot of variation in who you are across time.
00:11:24.660 And so if you're treating yourself properly in the highest sense, you're going to treat yourself as that community that extends across time.
00:11:34.420 And then I would say there's actually no difference technically.
00:11:37.500 And maybe this is a game theory proposition.
00:11:39.280 There's no difference between that technically and treating other people well.
00:11:43.960 Is that you're a community across time, just like the community is a community.
00:11:49.600 And the ethical obligation to yourself as an extended creature is identical with the obligation that you have all things considered to other people.
00:12:02.160 So I'm wondering what you think about that proposition, if that makes sense to you, if you think there might be exceptions to that.
00:12:07.380 That makes perfect sense because that immediately dumps you into the, are there any real altruists out there?
00:12:15.380 Scratch an altruist and watch a narcissist bleed sort of thing.
00:12:19.840 That anything within the realm of self-constraint and forward-looking pro-sociality and all of that, somewhere in there, running in between the lines, is the golden rule.
00:12:35.020 And in the long run, this will be better if I do this.
00:12:40.380 And what defines the species is, you know, two lobsters can do game theory dominance displays.
00:12:48.200 But we are the species that is dominated by the concept of in the long run.
00:12:54.660 Or the more frontally regulated among us.
00:12:58.780 But that's absolutely the heart of it.
00:13:02.260 And which has always struck me, it's very easy to, like, dump on utilitarian thinking.
00:13:10.080 And because it's always easy to say, oh, my God, so would you push your grandmother in front of the runaway trolley and it just feels wrong?
00:13:18.640 And would you convict an innocent person if that's going to make society better in all of those scenarios where utilitarian thinking just sticks in your throat?
00:13:30.020 It just doesn't feel right.
00:13:31.860 And where the resolution always is, is utilitarian thinking in the long run.
00:13:37.740 If it's okay to do this, what are we going to decide is okay to do tomorrow?
00:13:42.620 And what's the slope where we're going to be heading down?
00:13:46.380 And it requires a sort of deep, distal, not just proximal, utilitarian mindset.
00:13:55.100 And when you work in shadow of the future and in the long run, suddenly what winds up being, you know, the easiest possible solution to maximizing everyone's good looks a whole lot more palatable.
00:14:09.240 Yeah, well, those strange questions that come up when people, they pick these contexts where utilitarian thinking seems to involve a paradox.
00:14:21.660 I mean, those are paradoxes of duty and they do come up.
00:14:24.520 But all that indicates, and I think this is what you're pointing out, all that indicates is that there are often conflicts between what seems morally appropriate immediately and what seems morally appropriate when it's iterated.
00:14:35.660 And sometimes those conflicts are going to be intense.
00:14:38.880 And, of course, those are the ones that we have a very difficult time calculating, and no wonder.
00:14:43.400 But I would also say those are also the times when intense negotiation is necessary.
00:14:48.180 You know, like if you and I are in a situation where my immediate good and our long-term good are in conflict, then I better talk to you a bunch to find out what at least, you know, what the most livable solution is, even if we can't do it perfectly.
00:15:04.680 And the fact that there's going to be conflicts doesn't invalidate the general necessity of having to consider iteration.
00:15:11.020 Now, you talk a lot in the book about tit for tat.
00:15:14.500 And so why do you outline that for people, too?
00:15:16.520 Because lots of people listening, again, this is one of these things that just sounds, it sounds trivial when you first encounter it, especially the computer simulations.
00:15:24.040 But it's absolutely, it's of stunning importance once again.
00:15:27.960 So do you want to outline the science behind these iterative game competitions and the fact that tit for tat emerged as a solution and then the variations around that, too?
00:15:39.280 Let's get into those.
00:15:40.200 Well, first off, just to sort of build on one of your points there, that repeated rounds, repeated rounds, repeated rounds of an unpredictable number.
00:15:51.380 If you're going to have an interaction with someone, do you stab them in the back or do you cooperate?
00:15:57.200 And your starting point is you're never going to see this person again and they have no means of telling anyone else on earth if you were a jerk or whatever.
00:16:05.120 The only realpolitik thing that anyone could ever do is don't cooperate, stab them in the back, if you have only one round that you're going to interact with.
00:16:15.420 And then you get this horrible regressive thing that if you're going to interact with them for two rounds, what's the logical thing to do on the second round?
00:16:22.580 Stab them in the back.
00:16:23.880 So you've already defaulted into knowing that the second round is going to be non-cooperation.
00:16:28.440 So what do you do in the first round?
00:16:29.980 You already know the second round is a given, so you might as well stab them in the back on the first one.
00:16:34.520 And if there's three rounds, you go backwards.
00:16:37.140 And at every one of those points, if you're hyper-rational, no matter how many rounds ahead of you there are, if you know how many there are going to be, the only like uber spocky and logical thing to do is to never ever cooperate.
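The backward-induction argument laid out above can be captured in a short sketch. The payoff values below are the conventional illustrative ones (temptation > reward > punishment > sucker), not numbers from the conversation:

```python
# Sketch of the backward-induction argument: with a KNOWN number of
# rounds and standard prisoner's-dilemma payoffs (T > R > P > S),
# defection unravels from the last round back to the first.

T, R, P, S = 5, 3, 1, 0  # temptation, reward, punishment, sucker (illustrative)

def rational_move(rounds_left):
    """The hyper-rational move with a known number of rounds remaining."""
    if rounds_left == 1:
        # Final round: no shadow of the future, and T > R and P > S,
        # so defecting strictly dominates.
        return "defect"
    # Since everyone already knows the final round will be mutual
    # defection, the current round is effectively a final round too.
    return rational_move(rounds_left - 1)

print([rational_move(n) for n in (1, 2, 10)])  # ['defect', 'defect', 'defect']
```

The recursion bottoms out the same way no matter how many rounds remain, which is exactly why the known-horizon game never selects for cooperation.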
00:16:53.580 Where the breakthrough comes in is when you don't know how many rounds there are in the future.
00:16:59.840 And that's where you get selection for cooperation.
00:17:04.520 And that's where you see a world of differences in social species who are migratory versus ones who are not.
00:17:11.640 If I do something nice to this guy, is he going to be around next Tuesday to help me out?
00:17:16.200 Not if he's like a Syrian golden hamster.
00:17:19.200 He's migratory.
00:17:20.180 He's going to be gone.
00:17:21.060 On the other hand, if he's a human living in a sedentary settlement, yeah, maybe if I could trust him or not.
00:17:28.460 So, yeah, key point of an unknown number of rounds in the future, because you never know, you know, putting it most cynically, how much of a chance they're going to have in the future to get back at you if you were a jerk right now in the present.
00:17:43.400 So, that emphasis on unknown number of rounds, what you allude to is like the poster child, the fruit fly of people who do game theory studies, the prisoner's dilemma.
00:17:56.180 Where essentially, there's a whole story that goes with it, but you have to decide, are you going to cooperate with someone or are you going to stab them in the back?
00:18:06.720 And the way it works is, if you both cooperate, you both get a decent reward.
00:18:12.820 If you both stab each other in the back, you both get punished to a certain extent.
00:18:17.620 But if you manage to get them to cooperate with you, but you stab them in the back, they get a tremendous loss and you get a huge number of brownie points.
00:18:34.500 And conversely, if they've suckered you into being cooperative and then they stab you in the back, your loss is way bigger.
00:18:45.520 So, this whole world of when do you cooperate and when do you do anything other than that, always within this realm of multiple rounds, but an unknown number of them.
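The payoff structure described above is conventionally written as a small table. The 5/3/1/0 numbers below are the standard textbook values, assumed here purely for illustration:

```python
# The prisoner's dilemma payoffs as described: mutual cooperation pays
# decently, mutual defection punishes both, and a lone defector gains
# hugely at the cooperator's expense. 5/3/1/0 are conventional values.

PAYOFF = {
    # (my move, their move): (my points, their points)
    ("C", "C"): (3, 3),  # both cooperate: a decent reward for each
    ("D", "D"): (1, 1),  # both defect: both punished
    ("D", "C"): (5, 0),  # I stab a cooperator: big gain for me, big loss for them
    ("C", "D"): (0, 5),  # I'm the sucker
}

def play(my_move, their_move):
    """Score one round for both players."""
    return PAYOFF[(my_move, their_move)]

print(play("D", "C"))  # (5, 0)
```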
00:18:45.520 So, this guy, Robert Axelrod, who's like this senior major figure in sort of political science, teamed up with this evolutionary biologist, W.D. Hamilton, one of the gods in that field.
00:19:00.800 And they said, well, let's talk to a whole bunch of our friends, a whole bunch of our friends who think seriously about this stuff and tell them about the prisoner's dilemma and have each one of them tell us what would their strategy be when playing the prisoner's dilemma?
00:19:15.900 How would you do an unknown number of rounds and maximize your wins at the end?
00:19:20.860 And they asked, like, Nobel Peace Prize winners and Mother Teresa and prize fighters and warlords and mathematicians.
00:19:32.060 And they collected just a zillion people's different strategies.
00:19:35.940 And then they ran this round-robin tournament on this, like, ancient 1970s computer of just running each strategy against all the other ones, a gazillion rounds, to see which one worked best, which one won.
00:19:51.640 Or in the terms that evolutionary biologists quickly started using, which strategy drove all the others into extinction?
00:19:58.700 Going online without ExpressVPN is like not paying attention to the safety demonstration on a flight.
00:20:05.260 Most of the time, you'll probably be fine.
00:20:07.240 But what if one day that weird yellow mask drops down from overhead and you have no idea what to do?
00:20:12.940 In our hyper-connected world, your digital privacy isn't just a luxury.
00:20:16.760 It's a fundamental right.
00:20:18.060 Every time you connect to an unsecured network in a cafe, hotel, or airport,
00:20:22.320 you're essentially broadcasting your personal information to anyone with a technical know-how to intercept it.
00:20:27.260 And let's be clear, it doesn't take a genius hacker to do this.
00:20:30.580 With some off-the-shelf hardware, even a tech-savvy teenager could potentially access your passwords, bank logins, and credit card details.
00:20:37.960 Now, you might think, what's the big deal?
00:20:40.080 Who'd want my data anyway?
00:20:41.620 Well, on the dark web, your personal information could fetch up to $1,000.
00:20:46.040 That's right, there's a whole underground economy built on stolen identities.
00:20:50.280 Enter ExpressVPN.
00:20:52.040 It's like a digital fortress, creating an encrypted tunnel between your device and the internet.
00:20:56.320 Their encryption is so robust that it would take a hacker with a supercomputer over a billion years to crack it.
00:21:02.400 But don't let its power fool you.
00:21:04.200 ExpressVPN is incredibly user-friendly.
00:21:06.580 With just one click, you're protected across all your devices.
00:21:09.580 Phones, laptops, tablets, you name it.
00:21:11.760 That's why I use ExpressVPN whenever I'm traveling or working from a coffee shop.
00:21:15.900 It gives me peace of mind knowing that my research, communications, and personal data are shielded from prying eyes.
00:21:21.620 Secure your online data today by visiting expressvpn.com slash jordan.
00:21:26.620 That's E-X-P-R-E-S-S-V-P-N dot com slash jordan, and you can get an extra three months free.
00:21:33.000 Expressvpn.com slash jordan.
00:21:34.880 And the thing that flattened everybody was you had these people putting in these algorithms and probabilities and fuzzy logic and God knows what.
00:21:48.420 And the one that beat all the others was the simplest one out there, tit for tat.
00:21:54.180 You start off by cooperating.
00:21:56.000 If the other guy is a jerk at some point and stabs you in the back, the next round, you tit for tat him back.
00:22:03.020 You stab him back.
00:22:04.520 If he goes back to cooperating, then you go back to cooperating.
00:22:08.420 You've forgiven him.
00:22:09.720 If he keeps on being a jerk, you keep on being a jerk.
00:22:13.020 And even though what you see is by the person being a jerk, they're always one round ahead of you.
00:22:20.380 And that seems pretty disadvantageous.
00:22:22.620 You're always going to be one step behind the individual who stabbed you in the back.
00:22:26.440 When you get two jerky cheaters together, all they do is constantly stab each other in the back and make it the worst possible outcome.
00:22:36.420 And what you see with something like that is with tit for tat, if you're a nice cooperative guy and start off with that assumption, you lose the battles with the jerks, but you win the wars.
00:22:48.460 Right, right, right.
00:22:49.540 And cooperators find each other.
00:22:52.680 And this strategy outcompeted everyone.
00:22:55.200 And everyone couldn't believe it because of how simplistic it was.
00:22:58.280 And that was exactly, it was straightforward.
00:23:01.160 It was easy to understand.
00:23:03.560 Its starting point was one of cooperation, giving somebody the benefit of the doubt from the start.
00:23:09.920 But it was nonetheless, not a sucker.
00:23:13.560 It was punitive.
00:23:14.580 It was capable of retribution.
00:23:16.100 And if the other player, who had, like, sinned against them, corrected their ways, they were forgiven.
00:23:25.600 And it was that simple, and this outcompeted all of the other ones.
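A miniature Axelrod-style round-robin along the lines just described can be sketched as follows. The four stand-in strategies and the fixed 200 rounds are illustrative simplifications; the original tournament had far more entries:

```python
# A miniature Axelrod-style round-robin: every strategy plays every
# other (and itself) for many rounds; each strategy sees the opponent's
# history of moves. Payoffs are the conventional 5/3/1/0 values.

PAYOFF = {("C", "C"): (3, 3), ("D", "D"): (1, 1),
          ("D", "C"): (5, 0), ("C", "D"): (0, 5)}

def tit_for_tat(my_hist, their_hist):
    # Open by cooperating, then mirror the opponent's last move.
    return "C" if not their_hist else their_hist[-1]

def always_defect(my_hist, their_hist):
    return "D"

def always_cooperate(my_hist, their_hist):
    return "C"

def grudger(my_hist, their_hist):
    # Cooperate until crossed once, then defect forever (unforgiving).
    return "D" if "D" in their_hist else "C"

def match(a, b, rounds=200):
    ha, hb, sa, sb = [], [], 0, 0
    for _ in range(rounds):
        ma, mb = a(ha, hb), b(hb, ha)
        pa, pb = PAYOFF[(ma, mb)]
        ha.append(ma); hb.append(mb)
        sa += pa; sb += pb
    return sa, sb

strategies = {"tit_for_tat": tit_for_tat, "always_defect": always_defect,
              "always_cooperate": always_cooperate, "grudger": grudger}

totals = {name: 0 for name in strategies}
for na, a in strategies.items():
    for nb, b in strategies.items():
        if na < nb:  # each pairing exactly once
            sa, sb = match(a, b)
            totals[na] += sa
            totals[nb] += sb
for name, s in strategies.items():  # each strategy also plays itself once
    totals[name] += match(s, s)[0]

print(sorted(totals.items(), key=lambda kv: -kv[1]))
```

In this tiny version, always_defect wins its individual encounters with the nice strategies but finishes last overall, while the nice-but-retaliatory strategies end up on top: the "lose the battles, win the wars" point.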
00:23:31.020 And what everyone sort of in the zoology world went about saying at that point is, oh my God, do animals go about tit for tat strategies when they're in competitive circumstances where they've got to decide, am I going to cooperate or am I going to cheat?
00:23:47.100 And that sort of thing, has evolution sculpted optimal competitive cooperative behavior in all sorts of species to solve the prisoner's dilemma problem?
00:23:59.400 And people went and looked, and it turned out, like, what do you know?
00:24:04.280 Evolution had sculpted exactly that in all sorts of species.
00:24:07.620 It was, like, phenomenal, interesting findings where if you, like, experimentally manipulate one animal to make it look like they're not reciprocating in something that somebody else just did for them, and everybody punishes them one round afterward, and they go back to cooperating again, and everyone forgives them.
00:24:26.200 That's tit for tat.
00:24:27.340 All sorts of species out there were doing tit for tat.
00:24:31.020 Fabulous example of this.
00:24:34.020 I am forgetting his name.
00:24:36.520 Wilkinson studies bats.
00:24:38.980 Bats, some bat species, they do communal nesting stuff.
00:24:44.400 All the female bats have all their nests together, and they're communal in this literal sense.
00:24:49.420 They're vampire bats, which means they fly out at night, and they, like, get blood from some cow or some victim.
00:24:57.340 And they're not actually drinking the blood.
00:25:00.940 They're storing it in their throat sacks.
00:25:03.480 And they come back to their nest, and what they do is they disgorge the blood then to feed their babies.
00:25:08.580 And the hugely cooperative cool thing about the species is it's cooperative feeding, not just among, like, sisters, but throughout: everybody feeds each other's kids.
00:25:19.720 That's great.
00:25:20.460 So they've got this whole collaborative system, and it buffers you against one animal's failure to find food one night, and, like, everyone scratches each other's back, and it works wonderfully.
00:25:30.900 Now make the bats think that one of them is cheating.
00:25:35.340 One of them has violated the everybody-feeds-each-other's-kids social contract.
00:25:40.180 When the bat comes out of the cave or whatever, you, like, net it and get a hold of the bat, and you pump up the throat sack with air.
00:25:48.580 And you put her back there in the nest, and she doesn't have any blood, but everybody's looking at her saying, oh, my God, look at how big her throat sack is.
00:25:56.160 Look at all the blood she has.
00:25:57.280 And she's not feeding my kid.
00:26:01.220 She's reneging on our social contract here.
00:26:04.780 And the next round, nobody feeds her kids for one round.
00:26:09.280 Oh, my God.
00:26:10.520 Have you evolved the optimal prisoner's dilemma strategy of tit-for-tat?
00:26:16.440 This was, like, phenomenal.
00:26:19.400 What people then began to see was out in the real world, straight tit-for-tat is not quite enough.
00:26:25.760 Suppose you get a signal error.
00:26:28.860 And this is straight out of, I don't know, we're roughly the same age.
00:26:32.720 I don't know if you grew up reading all those, like, Cold War terrifying novels.
00:26:37.320 There's a glitch.
00:26:38.320 Oh, yeah.
00:26:38.700 What was it, Fail-Safe or something?
00:26:42.760 We're going to drop an atomic bomb on Moscow by accident, and the only way to prove to them it was an accident, they get to drop one on New York and tit-for-tat and all of that.
00:26:53.220 And what that introduced was the possibility of a signal error.
00:26:59.260 You're cooperating, but there's a glitch in the system, and the other individual believes you just stabbed them in the back.
00:27:06.360 Yeah, I think virtualization probably increases signal error, by the way.
00:27:12.180 You know, I've noticed that, well, I've noticed that when I've put together business enterprises that you can virtualize the cooperation, but if any misunderstanding emerges, it tends to cascade very rapidly.
00:27:25.960 And you don't have, you know, one of the things you also point out in Behave is that it isn't only that you're playing a sequence of iterated games with people, it's you're playing multiple sequences of multiple different iterated games.
00:27:37.880 And so one of the things that happens if you're face-to-face with people, as opposed to virtual, is that when you're face-to-face with them, this is probably the key importance of the issue of hospitality, which is very much stressed, for example.
00:27:52.300 And, well, it's stressed in the Old Testament, but it's stressed in traditional communities, is that if you're actually in an embodied space with people, you can play multiple games with them.
00:28:03.160 Games of humor, games of food exchange, games of music, dance, celebration.
00:28:08.340 And so you can test out their capacity for reciprocity in multiple situations.
00:28:13.380 And so then if there's a signal error, you can mitigate against it because you know that you've tested the person out in all sorts of different circumstances.
00:28:21.120 But when you virtualize things, it's very narrow.
00:28:24.240 The channel's now very narrow.
00:28:25.760 Yeah, yeah.
00:28:26.260 And so I'm very concerned about a lot of virtualization, too, because the other thing I think that virtualization is doing is enabling the psychopaths.
00:28:34.260 Because you can do a lot of one-off exchanges online with no reputation tracking.
00:28:39.420 And that seems to me that that enables the people who use, what did you call that in your book?
00:28:45.700 There's a particular kind of strategy.
00:28:47.420 Well, it's the stab you in the back strategy, essentially.
00:28:50.300 And if you can't track people's reputations across time, then you enable the people who are essentially the psychopathic manipulators.
00:28:57.940 And there's actually an emergent literature on online trolling and dark tetrad traits.
00:29:05.420 So I'm afraid we're enabling the psychopaths with the virtualization of the world.
00:29:09.180 And that's a terrifying possibility because they can take everybody out.
00:29:13.640 So, yeah.
00:29:14.020 So now you were talking about modifications of tit for tat.
00:29:17.680 Well, you bring up sort of the artificiality and the dangers there.
00:29:22.880 Okay, somebody suddenly from out of nowhere stabs you in the back.
00:29:26.960 Is this for real or is this a signal error?
00:29:29.960 And one way of getting out the other end of it is a vertical one.
00:29:35.480 Have you just had a gazillion rounds in common with that person and things have gone okay?
00:29:40.300 Have you built trust with them out of this game?
00:29:44.180 But what you outlined is instead the horizontal one.
00:29:47.220 Okay, I haven't had a gazillion rounds of this game with them, but we're also breaking bread together.
00:29:51.920 And we also did this together.
00:29:53.460 And we also have our shared culture.
00:29:55.940 These are lateral examples of iterated games that you could build trust on.
00:30:01.640 That's another way of solving it.
00:30:03.300 And the virtual world collapses both of those.
00:30:07.180 So what you wind up seeing is that as soon as you put in a signal error, it can collapse the entire system.
00:30:15.080 So people then had to figure out how to evolve protection against signal errors as soon as that's possible in your game theory universe.
00:30:23.360 And what you have to bring in is this radical, like, upending notion of forgiveness.
00:30:32.600 Should it be like forgiveness automatically turning the other cheek?
00:30:37.280 Absolutely not.
00:30:38.360 It should be based on your prior history.
00:30:40.420 And all these algorithms of the more rounds in the game you've gone in the past with cooperation without the person doing something jerky, the faster you were willing to forgive them for what seems to have been a betrayal on their part and possibly a signal error instead.
00:31:01.120 And building up of trust, building up of social capital.
00:31:04.500 And, of course, what that opens you up to is exactly what you bring up, which is a good sociopath knows exactly how many inches they need to push it and still get under this umbrella of, well, that's a little bit worrisome, but forgivable, forgivable.
00:31:22.880 At that point, when you have a reciprocal system, that's a wolf in sheep's clothing.
00:31:27.680 A sociopath can exploit it like mad.
00:31:30.160 But at least that was the way of protecting yourself against that to some extent.
00:31:36.040 Built in.
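The forgiveness modification being described here can be sketched in a few lines. This is an illustrative simulation in the spirit of the noisy tit-for-tat tournaments, not code from Behave; the 5% signal-error rate, the Axelrod-style payoffs, and the flat 30% forgiveness probability are all assumptions for illustration.

```python
import random

def flip(move, noise, rng):
    """With probability `noise`, a move is misread by the other player."""
    if rng.random() < noise:
        return "D" if move == "C" else "C"
    return move

def play(strategy_a, strategy_b, rounds=1000, noise=0.05, seed=0):
    """Iterated prisoner's dilemma with signal error.

    Each strategy sees only a possibly-corrupted report of the
    opponent's last move. Payoffs (assumed, Axelrod-style):
    both cooperate 3/3, both defect 1/1, sucker 0 vs. temptation 5.
    """
    rng = random.Random(seed)
    payoff = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
              ("D", "C"): (5, 0), ("D", "D"): (1, 1)}
    seen_by_a = seen_by_b = "C"   # both start assuming goodwill
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(seen_by_a, rng)
        move_b = strategy_b(seen_by_b, rng)
        pa, pb = payoff[(move_a, move_b)]
        score_a, score_b = score_a + pa, score_b + pb
        seen_by_a = flip(move_b, noise, rng)   # the signal-error channel
        seen_by_b = flip(move_a, noise, rng)
    return score_a, score_b

def tit_for_tat(opp_last, rng):
    """Strict mirror: retaliate against every apparent defection."""
    return opp_last

def generous_tft(opp_last, rng, forgive=0.3):
    """Forgive an apparent defection 30% of the time (illustrative rate)."""
    if opp_last == "D" and rng.random() < forgive:
        return "C"
    return opp_last
```

Without noise, both pairings cooperate perfectly; with a 5% signal-error rate, two strict tit-for-tat players fall into long retaliation spirals while two generous players recover. A richer version would scale `forgive` with the length of the prior cooperative history, which is the algorithmic refinement described above.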
00:31:37.040 You know, a shared culture might actually be the abstracted equivalent of a multi-situational, like an abstracted multi-situational game.
00:31:46.820 Because, like, if I live in your neighborhood, let's say, and I don't know who you are, but I know you live in my neighborhood, and nothing has happened that's untoward in the 10 years that we've been living near each other.
00:31:59.040 Then I can reasonably presume that you're pretty much like all the other people in my neighborhood, including the people I know, because if you weren't, you would have caused trouble.
00:32:09.380 And so, you know, you also talk in your book about the fact that we have a proclivity to demonize the foreign, let's say.
00:32:15.200 Right?
00:32:15.420 To fail to differentiate the foreign into the individual, which is a better way of thinking about it.
00:32:20.180 But one of the ways that we probably circumvent that with regards to shared culture is that we presume that people who are like us, which means they share our culture, are playing the same game as us.
00:32:31.520 And because nothing has gone wrong when they've been in the vicinity, we can assume that they're individuals rather than the dragon of chaos itself, let's say.
00:32:39.440 We can extend to them the a priori luxury of being individuated instead of being treated like the barbarian mob, right?
00:32:47.780 And so that's not prejudice precisely.
00:32:50.620 It's just the extension of the inclusion of a game into everybody who shares our culture.
00:32:56.620 And it would make sense that the thing is, the less someone is part of your culture, let's say, the less abstracted evidence you have that they're direct participants in a reciprocal game rather than stab you in the back psychopaths, which they could be, right?
00:33:15.260 Because that's about 3% of the population and maybe higher under some circumstances.
00:33:19.660 So you also talk in your book about something very interesting, which is something that's really puzzled me is I've not been able to figure out how honest cultures get a toehold, right?
00:33:32.720 Because as you point out that, first of all, there's some evidence that the default response of very immature individuals, 2-year-olds, let's say, isn't cooperative.
00:33:44.120 2-year-olds are not cooperative.
00:33:45.520 They are in some very bounded circumstances, but they can't play shared games very well.
00:33:49.840 That doesn't mature until the age of 3.
00:33:52.100 And so it's sort of a Hobbesian landscape among 2-year-olds.
00:33:55.120 I know there's exceptions to that.
00:33:56.880 But then as the brain matures, then the capacity for shared games starts to emerge, right?
00:33:55.120 But the fundamental question, and you do point to this in Behave, is, well, if you have a whole society of cheaters and backstabbers, which is maybe the default Hobbesian situation,
00:34:13.520 how the hell do you ever get a cooperative landscape started, much less a landscape where the default response between strangers is honest and trusting?
00:34:24.860 Now, you point out a little bit, I think maybe what you were pointing to in Behave is the initiation of low-risk trading games.
00:34:33.620 Like, I read about this jungle tribe, I think it was in South America, and they initiated trade with a foreign tribe on their border in the following manner.
00:34:44.460 They knew where the territorial boundaries were, just like wolves know, just like chimpanzees know.
00:34:49.580 You know, there's a rough fringe and boundary that's sort of no man's land.
00:34:53.180 They used to go there and leave some of their arts and crafts or their tools.
00:34:58.500 They'd just leave them on the ground, and then they'd retreat, knowing that the other people were watching them.
00:35:04.660 And then the other people would go and grab some of these cool things.
00:35:07.980 And then the other people, being not completely dim, would leave some of their trinkets and tools lying on the ground.
00:35:15.380 And that's, you know, kind of low-cost.
00:35:17.720 They weren't going to leave their most treasured possession to begin with.
00:35:21.560 They'd leave something that's sort of interesting.
00:35:23.540 Yeah, exactly that.
00:35:25.180 Exactly that.
00:35:27.120 Yeah, yeah.
00:35:28.220 So, but what's cool is that that requires, and you pointed this out, that requires that initial movement of faith, right?
00:35:35.980 You have to presume the possibility of humanity on the other side.
00:35:40.820 Then you have to take a sacrificial risk.
00:35:43.420 And it can be small, you know, not a stupid sacrificial risk, but a reasonable one.
00:35:47.840 And that can get the ball rolling in an upward spiraling cooperative direction.
00:35:52.260 That's kind of what kids do, by the way, when they come together to start to initiate play when they're about three years old.
00:35:58.200 They'll play a real simple game to begin with, you know, one that you could maybe play with a one-year-old.
00:36:03.040 And then they ratchet up the complexity of the game right to the level where it's, what would you call, maximizing their adaptive progress.
00:36:10.520 And if they find a kid that they can do that with, then that kid becomes a friend.
00:36:14.520 And that friend is reciprocal, iterative interactions.
00:37:28.840 Great.
00:37:29.320 I mean, you've homed in on the central question, which is, in a world in which there's nothing but backstabbers, how do you jumpstart it?
00:37:39.960 Because if somebody suddenly, like, stands up and, like, recites the Sermon on the Mount and says,
00:37:45.340 I am going to start cooperation, everybody else is going to say, you know, what a schmuck, and stab him in the back after that,
00:37:52.460 and he will forever be one step behind. So how do you jumpstart it?
00:37:55.780 One of the ways that you point out is the, like, tiny, tiny incremental uppings of the investment and the chance you're taking.
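Those incremental uppings of the investment can be modeled as a raising-the-stakes exchange. A minimal sketch, where the doubling rule, the cap, and the betrayal threshold are purely hypothetical parameters:

```python
def raising_the_stakes(partner_returns, start=1.0, factor=2.0, max_stake=100.0):
    """Hypothetical trust-building exchange.

    Risk a small stake; if the partner reciprocates, double it next
    round. Stop at the first failure to reciprocate. Returns the
    highest stake both sides successfully exchanged (0.0 if the very
    first offer was betrayed).
    """
    stake, trusted = start, 0.0
    while stake <= max_stake:
        if not partner_returns(stake):
            break                   # betrayal: lose this stake, stop playing
        trusted = stake             # reciprocated: trust established here
        stake *= factor             # risk a bit more next time
    return trusted

# A cooperator reciprocates anything; a cheat defects past a threshold.
cooperator = lambda stake: True
cheat_at_8 = lambda stake: stake < 8
```

The design point is that early betrayals are cheap: the cheat who defects at a stake of 8 walks away with almost nothing, while two cooperators ratchet their exchange up to the cap.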
00:38:07.420 Another one, like evolutionary biology, people love this, founder populations.
00:38:13.800 Founder populations, this is old population ecology term.
00:38:17.160 A land bridge disappears, something where you get a population that gets isolated.
00:38:24.940 They get cut off from the main population.
00:38:27.900 And what happens over time is they get kind of inbred.
00:38:31.480 And thus, you get a lot of, like, cooperative stuff built around all being relatives and such.
00:38:36.840 And they establish a high degree of cooperation.
00:38:39.740 And then, I don't know, whatever the land bridge comes back, they go back and they join the general population.
00:38:45.780 And at that point, they are this cohort of cooperators who have figured out how to do reciprocity, how to do trust, how to do all that stuff,
00:38:56.420 which means they're a cluster of optimized tit-for-tatters, meaning they're going to out-compete everybody else.
00:39:04.060 And so, everybody else signs up on now becoming good guys.
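That founder-population argument can be checked with a back-of-the-envelope model. This is a sketch, not the population-ecology literature's formulation: tit-for-tat cluster members earn the cooperative payoff on the fraction `p_within` of their interactions that stay inside the cluster, while the surrounding all-defect population earns the mutual-defection baseline.

```python
def iterated_score(strat_a, strat_b, rounds=20):
    """Iterated prisoner's dilemma score for "TFT" (cooperate first,
    then mirror) vs. "ALLD" (always defect). Payoffs: CC=3 each,
    DD=1 each, sucker 0 vs. temptation 5."""
    payoff = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
              ("D", "C"): (5, 0), ("D", "D"): (1, 1)}
    def move(strat, opp_last):
        return "D" if strat == "ALLD" else opp_last   # TFT mirrors
    last_a = last_b = "C"   # TFT opens with cooperation
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = move(strat_a, last_b), move(strat_b, last_a)
        pa, pb = payoff[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        last_a, last_b = a, b
    return score_a, score_b

def founder_invades(p_within, rounds=20):
    """Does a reunited founder cluster of tit-for-tat players out-earn
    the surrounding all-defect population?"""
    tft_vs_tft, _ = iterated_score("TFT", "TFT", rounds)      # 3 * rounds
    tft_vs_alld, _ = iterated_score("TFT", "ALLD", rounds)    # one sucker round
    alld_vs_alld, _ = iterated_score("ALLD", "ALLD", rounds)  # 1 * rounds
    cluster_payoff = p_within * tft_vs_tft + (1 - p_within) * tft_vs_alld
    # Defectors almost never meet the rare cluster, so they earn the baseline.
    return cluster_payoff > alld_vs_alld
```

With 20-round games, the cluster out-earns the defectors once only about 2.5% of its interactions stay within the cluster, which is why even a small cohort of optimized tit-for-tatters can take over the general population.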
00:39:06.860 Okay, so let me ask you about that.
00:39:09.300 So, I've got a proposition for you, and this is relevant to your speculations on the religious front.
00:39:14.520 And I want to bring Sam Harris into this, too.
00:39:17.800 So, I was reading, for example, I was reading the book of Abraham, because I'm writing a book on biblical stories.
00:39:23.940 And God promises Abraham that if he abides by the central covenant, that his descendants will outnumber everyone else's descendants.
00:39:33.420 And I have a sneaking suspicion that that's a narrativization, that's a terrible word.
00:39:41.520 It's a translation into story of the tit-for-tat reciprocal altruistic motif, which is that if you abide by this higher order sacrificial principle,
00:39:52.600 and I'll return to that sacrifice idea, if you abide by this higher order sacrificial principle,
00:39:58.060 all things considered across the longest possible span of time, your descendants will out-compete all other descendants.
00:40:06.400 And one of the things that's very cool about that story, so when God reveals this truth to Abraham,
00:40:12.080 who's decided to act in a proper sacrificial manner, right?
00:40:17.640 He's sacrificing the present to the future in the optimized manner.
00:40:22.600 Then God says, look, don't be thinking that this is going to be straightforward,
00:40:26.480 because your descendants are actually going to struggle for a number of generations.
00:40:30.900 But if you can hold out for the long run, and it's four generations in this particular story,
00:40:36.460 then you can be certain that the pattern of adaptation that you've chosen is going to work well for you,
00:40:42.260 but also very, very well for your descendants.
00:40:45.260 And so, you know, I know that Sam Harris, who's very concerned about the problem of evil,
00:40:50.340 has been trying to ground a transcendent morality in objective fact.
00:40:55.220 And I think I can admire Sam's motivation and his concern with great evils,
00:41:04.180 like the evils of the Holocaust, for example.
00:41:06.180 I think his attempt to ground morality in objective fact is misdirected,
00:41:14.100 partly because I think a much more fruitful place for an endeavor like that is actually in game theory,
00:41:21.620 because there is something there, right?
00:41:23.740 I mean, what we're basically pointing out is that there is a structure of iterated interactions, right?
00:41:32.560 There's an emergent reality, and as you said, you could model that with tit-for-tat competitions
00:41:37.460 in a computer landscape, and that turns out to be ecologically generalizable.
00:41:42.560 So there's an actually underlying ethos in iterated interactions.
00:41:48.160 Now, you can imagine that as the human imagination observed interactions over vast stretches of time,
00:41:55.620 it started to aggregate imaginative representations of that ethos and to extract it upward.
00:42:03.040 And it seems to me that that would dovetail with the maturation and domination of the prefrontal cortex,
00:42:09.340 because what's starting to happen is that you're using long-term strategies to govern short-term exigencies.
00:42:17.660 And that's a very difficult thing to do, because, of course, the short-term sometimes screeches and yells extraordinarily loudly.
00:42:23.160 But part of what the religious enterprise seems to be doing, as far as I can tell,
00:42:28.120 is mapping this pattern of sacrifice of the present to the future,
00:42:31.640 and making the proposition that that is the, all things considered, that is the optimal adaptive strategy.
00:42:37.820 So I don't know what you think about those sorts of suppositions.
00:42:41.220 I think that's perfect.
00:42:43.880 I mean, when you look at, like, dopamine, its role in gratification, postponement,
00:42:49.540 and dopamine is anticipatory, all of that, there's a whole literature built around lab rats and lab monkeys,
00:42:55.780 and wow, it works just like in us in terms of being able to sustain behavior in anticipation of reward.
00:43:03.000 Isn't that amazing?
00:43:04.060 Just like us.
00:43:05.280 But we do it for an entire lifetime in anticipation of reward in the afterlife.
00:43:11.740 Like, that's on a scale that's very, very human.
00:43:15.880 It has always struck me, and I could not be on thinner ice than getting into comparative religion stuff here,
00:43:23.560 but it has always struck me that with the sort of Abraham and the covenant and the whole stick-with-us,
00:43:33.900 it's-going-to-be-great thing, you've got this dichotomy between religions where something amazing has happened,
00:43:41.740 and it's so amazing that you just have to join, and everything is about recruitment.
00:43:49.820 And then you have the religions that are about retention,
00:43:54.380 because the reward is going to be amazing if you stick it out with us.
00:43:59.260 And, like, traditional nomadic, pastoralist religions are about retention,
00:44:09.040 because you get a big problem, because you're wandering all over the back of beyond,
00:44:13.320 because you're nomadic and passing all these other tribes,
00:44:17.100 and maybe the grass seems greener with them,
00:44:19.840 so maybe it's a good time to decide to sort of switch over to those folks there.
00:44:23.940 Stick with us, stick with us,
00:44:25.180 because it's going to be amazing when the Lord finally comes through with all his promises.
00:44:31.020 That's like an ecological adaptation to nomadic pastoralism,
00:44:38.140 which is where the Old Testament came from.
00:44:41.300 And what you also get from that is,
00:44:43.600 and we're going to throw in something extra,
00:44:45.240 so you can't decide to, like, slip away at night and become like a Canaanite or something.
00:44:50.580 We're going to mark you in a very fundamental way that you could never pass yourself off as one of them.
00:44:56.100 We're going to invent circumcision.
00:44:58.140 So you can't fake them out on that either.
00:45:01.040 You'd better stick with us.
00:45:02.880 Retention, because there was a great reward coming.
00:45:05.760 And everything about the New Testament is something phenomenal happened.
00:45:10.780 There's really good news.
00:45:12.740 And isn't this so cool that you want to join us?
00:45:15.240 And I think the whole, like, developing a frontal cortex for it's going to come,
00:45:22.920 it's going to come if you hold your breath.
00:45:24.300 Yeah, yeah, yeah.
00:45:25.460 It's much more a product of religions of retention rather than religions of recruitment.
00:45:31.520 Yeah, well, that bridge that you're drawing between the long view and dopaminergic function is extremely interesting.
00:45:43.000 I want to go back to that part in your book because you're pointing out that the dopaminergic system doesn't just signal reward.
00:45:51.780 It signals the presence of, what would you say?
00:45:55.980 It signals that your theory that reward is likely to occur under these conditions is correct.
00:46:01.520 Yeah.
00:46:02.160 So it's reinforcing, what it's doing is actually reinforcing the potency and integrity of a predictive system that's actually predicting positively.
00:46:13.260 And you would want that reinforced.
00:46:16.320 I'm curious about this issue of sacrifice in relationship to cortical maturation.
00:46:23.080 Because one of the things, this is like a definition of maturity, you might say, is that the more mature you are,
00:46:29.780 the more you are able to forego comparatively immediate gratification for probably larger but deferred gratification.
00:46:40.140 Right?
00:46:40.300 So you start to tilt in the direction of the future rather than the present.
00:46:45.020 Okay.
00:46:45.300 So in the story of Cain and Abel, for example, so Cain and Abel are the first two human beings, right?
00:46:51.720 Really, because Adam and Eve are made by God.
00:46:54.060 So forget about them.
00:46:55.060 Cain and Abel are the first actual human beings.
00:46:58.160 That's when work is invented, right?
00:47:00.720 Because sacrifice and work are the same thing.
00:47:03.500 When you work, you're not doing what you want to in the moment.
00:47:06.100 And when you work, what you're doing is not doing what you want to in the moment so that the future will be better or so that your family can thrive, right?
00:47:13.880 It's deferred and social.
00:47:16.220 It's deferred and communal.
00:47:17.840 It's like the definition of work.
00:47:19.920 And then the idea is that if you work properly, whatever that means, and that's what Abel does, then your sacrifices are going to be rewarded by God.
00:47:26.900 Whereas if you hold back and you take the psychopath route and you pretend, then you're going to be deeply punished.
00:47:34.240 But the fundamental issue there, and this is the question that I have for you, is that it seems to me that there's a very tight relationship between the insistence that sacrifice is necessary and maturation and the emergence of the prefrontal cortex as a predictor of deferred future reward out of the landscape established by, say, the limbic system that's much more concerned with immediate gratification.
00:48:03.560 So it's sacrifice compared to immediate gratification.
00:48:07.340 And then there's a discussion of what constitutes proper sacrifice.
00:48:11.540 Exactly.
00:48:12.460 And that's where, like, you study dopamine neurochemistry and this receptor subtype of the dopamine receptor, blah, blah, all of that.
00:48:23.140 And when you really look at the system, what you have to come away with is we humans have the exact same neurochemical system as every animal out there.
00:48:33.760 And we have a totally unrecognizably different one because we mobilize the same damn molecule and the same, like, mesolimbic cortical pathways.
00:48:46.040 And we do it so that our great-grandkids will have a better planet.
00:48:50.860 And we do it for an after.
00:48:52.780 Like, we do it on a...
00:48:54.180 Well, do you think there's any difference between that and the idea of an afterlife?
00:48:57.480 Like, I mean, if I'm thinking six generations into the future, why wouldn't that be represented symbolically as something like an afterlife?
00:49:06.260 Because it is an afterlife.
00:49:07.760 I'm dead.
00:49:08.360 And if I'm trying to conduct my behavior in a manner that's so moral that it's actually echoing properly a thousand years into the future, I don't really see any difference between that, practically speaking, and my conception that my behavior should be governed by something like infinite regard for the potential future.
00:49:28.640 I mean, it's tricky, right, because you have to discount the future to some degree to survive.
00:49:33.800 But all things considered, you're still trying to set up a situation where your behavior in the present maximizes the utility of your behavior across all possible iterations out into the future.
00:49:43.780 And as soon as you allow for the possibility of, like, your footprints lasting longer than your lifespan, this is a whole new ballgame, either in the form of there's an afterlife or in the form of I want to leave a planet for my great-great-grandchildren.
00:50:04.060 That's going to be a more peaceful, wonderful one.
00:50:06.860 Or even in the form of, like, every time you sit at, like, a typical funeral where everybody's going through the usual eulogies of, like, distortively amplifying the good traits of someone and ignoring the bad, what's going through your head is, how do I want to be remembered?
00:50:26.260 Whoa, that's a whole other world of, like, what you're doing now.
00:50:33.020 The footprints you leave after you are going to matter.
00:50:36.500 And, like, all the versions we have: we would like to think the students we train will carry it on.
00:50:41.020 We would like to think people 300 years from now
00:50:44.180 are going to think we've composed the most amazing, like, Mass in B minor, and that's satisfying.
00:50:49.820 Yeah, we've invented a whole weird world of being able to have anticipatory motivation built around stuff that's going to last longer than us.
00:51:00.300 And in some ways, you could be like Paul Ehrlich and think about what's going to happen to the planet in a century from population growth.
00:51:08.320 Or you could think about the afterlife.
00:51:10.600 But any of these are, like, radically human domains.
00:51:14.900 Mm-hmm, that's that extension of knowledge out indefinitely into the future, right, which is something that seems to characterize human beings.
00:51:24.680 And that might also be a consequence of cortical expansion, right, the discovery of that infinite future.
00:51:30.540 Yeah, yeah.
00:51:31.440 And so, okay, so let me ask you a question.
00:51:35.220 Let me ask you a question about that, too.
00:51:37.820 Yes, I'm not exactly clear.
00:51:40.980 I've spent a fair bit of time studying the dopaminergic system and its relationship to reward and reinforcement.
00:51:47.660 But I wasn't as clear as I would like to be about the role of dopamine in anticipation of future reward.
00:51:56.140 And like I said, I read that in your book, and I started to understand it, but I don't completely understand it.
00:52:01.800 And so, now, dopamine will signal if you lay out a structure of behavior, and that structure of behavior produces the desired outcome.
00:52:09.760 You get a dopamine kick.
00:52:11.220 That feels good, which is sort of the generalized element.
00:52:15.180 But the dopamine also preferentially encourages the neural structures that were active in the sequencing of that behavior to grow and flourish.
00:52:25.080 And that's the distinction between reward and reinforcement.
00:52:28.200 But you talk about anticipation.
00:52:30.900 And I know I'm missing something there.
00:52:32.400 So, will you walk me through in a little bit more detail how the dopamine system works in relationship specifically to anticipation of the future rather than just responding, say, to successful behavior?
00:52:43.800 So, you know, unpacking this a bit, exactly what you were referring to, like, take a rat, take a monkey, take a college freshman and psych 101, whatever, and give them a totally unexpected reward from out of nowhere.
00:53:00.420 And you can show that there's activation of dopaminergic reward pathways in the limbic system.
00:53:09.620 And you can do that with functional imaging.
00:53:12.140 You could do that with something invasive with your lab animal, whatever.
00:53:15.760 Okay, dopamine's about reward.
00:53:17.620 It's completely about reward.
00:53:19.280 Give somebody cocaine, and they will release more dopamine than any vertebrate in all of history has ever been able to do.
00:53:25.800 And, yeah, it's about reward until you then get a little bit more subtle with your paradigm.
00:53:32.100 And now you take that, you know, human rat monkey and put them in a setting where you've trained them in a contingency.
00:53:41.120 A little light comes on, which means now if they go over to this lever and hit the lever 10 times, they'll then get a reward.
00:53:49.800 Signal, work, reward, signal, work, reward.
00:53:52.140 And as soon as they've learned it, when does dopamine go up?
00:53:56.540 And what we think we just learned from the first example is when you get the reward.
00:54:01.060 Not at all.
00:54:02.420 It goes up when the signal turns on.
00:54:05.800 Because that's you sitting there saying, I know how this works.
00:54:10.620 I know what that light tells me.
00:54:12.720 I'm on top of this.
00:54:14.240 I know that lever pressing.
00:54:15.560 I'm really good at it.
00:54:17.120 I'm in familiar territory.
00:54:19.940 Exactly.
00:54:20.440 And I have agency.
00:54:22.500 And this is going to be great.
00:54:24.500 It's about the anticipation.
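The shift being described here, dopamine firing at the predictive signal rather than at the reward itself, is the classic signature of a temporal-difference prediction error. A minimal textbook-style sketch, not Schultz's actual model; the state layout (cue, lever work, reward) and the learning rate are assumptions:

```python
def train_td(trials=300, alpha=0.2, reward=1.0):
    """Minimal TD(0) model of the cue/reward finding.

    Each trial: baseline -> cue -> lever work -> reward. Because the
    cue arrives at an unpredictable time, the baseline prediction is
    pinned at zero, so the prediction error at cue onset (a common
    proxy for the phasic dopamine burst) grows as the cue comes to
    predict the reward, while the error at reward delivery shrinks.
    """
    v_cue = v_work = 0.0
    trace = []
    for _ in range(trials):
        d_cue = v_cue - 0.0          # surprise when the cue appears
        d_work = v_work - v_cue      # cue -> lever-work transition, no reward
        v_cue += alpha * d_work
        d_reward = reward - v_work   # reward delivery, terminal state
        v_work += alpha * d_reward
        trace.append((d_cue, d_reward))
    return trace
```

On the first trial the error (the dopamine proxy) fires entirely at reward delivery; after training it has migrated to cue onset, which is exactly the "dopamine goes up when the signal turns on" result described above.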
00:54:26.600 So why "I have agency"?
00:54:28.260 Why use that phrase?
00:54:29.480 Because that's very interesting, right?
00:54:31.000 Because agency implies that, well, it implies now that you're master of the situation, right?
00:54:36.860 Is that you said you're on top of it.
00:54:39.340 So is it the signaling that you're in it?
00:54:41.740 It's got to be something like the signaling in a domain.
00:54:43.760 The signaling that you're now in a domain where your behavioral competences are matched to the environmental demands, right?
00:54:49.860 And that's like being on sacred ground in a very fundamental sense, right?
00:54:55.020 Because you know what to do there.
00:54:56.720 And it seems profoundly logical.
00:54:59.240 And then you see this gigantic piece of vulnerability and illogic in the system.
00:55:04.380 Okay, so the light comes on, dopamine goes up, it's about anticipation.
00:55:09.800 Really significantly, if you block the dopamine rise, you don't get the lever pressing.
00:55:15.500 It's not just about anticipation.
00:55:17.340 It's about the work you're willing to do, driven by the anticipation.
00:55:21.420 So that's motivation, that's goal-directed behavior, all of that.
00:55:25.220 Now you throw in this extra wrinkle.
00:55:26.940 Like, well, we've been talking about our circumstances, the light comes on, you do the work, you get the reward.
00:55:32.700 You do the work, you get the reward, 100% predictability, and you have a complete sense of mastery and agency over the game.
00:55:38.760 Now the grad student switches things to you do the work, you press the lever, you do the work on that, and you get the reward only 50% of the time.
00:55:50.340 It's not guaranteed.
00:55:51.800 And beautiful work from Wolfram Schultz at Cambridge, who, like, pioneered all of this, showing at that point, as soon as the buzzer or the light comes on, signaling it's one of those circumstances,
00:56:04.860 Again, you get a much bigger rise of dopamine than you got before.
00:56:10.580 Now, let me ask you about, okay, so let me ask you about that.
00:56:13.980 So what that seems to me to indicate is that you've now entered an environment where that's quasi-predictable, but now there's novelty.
00:56:22.680 And the advantage to having the dopamine signal kick in when novelty makes itself manifest is that it signals that there's also more to be learned here through exploration that might signal extreme future reward if you can just map the territory properly.
00:56:38.500 Right, because it's good to have a good thing, but it's even better to have a potentially better thing.
00:56:43.980 And novelty does contain, is that what's happening?
00:56:47.380 That's exactly it.
00:56:48.380 The most proximal thing that's going on in your head when suddenly dopamine goes 10 times higher is you've just introduced this word into the neurochemistry.
00:56:59.240 You've introduced the word maybe.
00:57:02.020 Right, right.
00:57:03.320 Maybe is intermittent, you know, that's incredible.
00:57:07.520 Yeah, yeah, yeah.
00:57:08.300 And what's always between the lines with maybe is exactly what you're outlining.
00:57:13.480 If I keep pressing the lever, I'm going to figure out what the maybe is about and be able to turn it.
00:57:19.640 Yeah, yeah.
00:57:21.120 I'm going to master this.
00:57:23.460 I'll be the new master of a new territory then.
00:57:25.840 Exactly, and the longer they can dangle the maybe in front of you, the more they can manipulate you into thinking you have a 50% chance of getting the reward when in reality it's a tenth of a thousandth of a percent chance, because they understand your signaling sufficiently.
00:57:43.940 So that's intermittent partial reinforcement, and that's why it grips you, because it falsely signals novelty treasure, and you can manipulate that.
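One way to see why the "maybe" condition is so potent: if the sustained dopamine response tracks reward uncertainty, as Schultz's group reported, then a 50% schedule is the peak tease. A sketch using Bernoulli variance as a stand-in for uncertainty (my framing for illustration, not the lab's model):

```python
def reward_uncertainty(p, reward=1.0):
    """Variance of a Bernoulli reward: a simple stand-in for the
    'maybe' signal. Zero when the outcome is certain (p = 0 or 1),
    maximal at p = 0.5."""
    return p * (1 - p) * reward ** 2

# Sweep the probability of reward: certainty at either end is boring,
# and the 50-50 schedule maximizes the uncertainty signal.
curve = {p / 4: reward_uncertainty(p / 4) for p in range(5)}
```

Both a guaranteed reward and a guaranteed nothing produce zero uncertainty; the intermittent schedule in the middle is what keeps the lever pressing going.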
00:57:56.200 Now, you pointed out something extremely dangerous in your book, right, because I had thought about this in terms of building the ultimately addictive slot machine.
00:58:04.680 You showed that if you're playing a slot machine and the tumblers line up, almost line up, two out of three or four out of five, then you're much more likely to get a dopamine kick.
00:58:16.720 So you could imagine a digital slot machine where you have multiple tumblers, where you code it to the player so that the machine knows that it's the same player playing, and that the proportion of almost lined up tumblers increases with gameplay.
00:58:33.100 So then you'd have intermittent partial reinforcement combined with a novelty indicator signaling that you were obtaining false mastery over the damn game.
00:58:43.540 God, you'd have old people glued to that nonstop.
00:58:46.460 Because as soon as you switch from just going with maybe, incredibly powerful though that is, you switch over to almost.
00:58:55.260 Yeah, right.
00:58:56.260 Almost.
00:58:56.740 And yeah, do that, like, asymptotically, and people will lever press till, like, they die of starvation at their slot machine in Las Vegas.
00:59:09.500 Right, right.
00:59:10.180 And the casino goes over and feeds them for free.
00:59:12.400 Yeah.
00:59:13.220 Right.
00:59:13.500 The notion that not only...
00:59:14.780 Okay, and so as far as you're concerned, so that's so cool.
00:59:18.240 So imagine that, so I was thinking mythological terms too, because, so there's a hero element that's emerging there, because the hero in mythology is the person who goes into unknown territory and masters it, right?
00:59:30.100 And the hero is a broad symbol, character, because the hero isn't just the person who goes into unknown territory and masters it, but also gains what's there and then distributes it reciprocally.
00:59:42.480 That's the whole hero mythology, essentially.
00:59:45.560 And so your point is that the dopamine system kicks in, in part, as a consequence of predictability.
00:59:54.140 So that shows that you know what you're doing when you're in a place that's going to give you reward.
00:59:59.480 So you're in a garden that's fruitful.
01:00:01.440 But it's even better if there's an intermittent element of the reinforcement, because it shows you that there's fruit there that you have left to discover.
01:00:09.780 And if you go down that pathway, you're going to be hyper-motivated to go down that pathway.
01:00:15.100 So you want to be in a garden where there's fruit, but where the possibility of more fruit beckons, and where that possibility is dependent on the morality and, what would you call it, daring of your actions.
01:00:28.620 Now, I would say that that pattern, if a female is observing that pattern of interaction in a male, that male is going to be maximally reproductively attractive.
01:00:37.480 Well, I think that probably depends on what species we're talking about, just to become as...
01:00:42.880 Oh, sorry, I meant people.
01:00:44.060 I meant human beings.
01:00:45.600 Okay, so just to be a pain and now come back and say, well, I think that probably also depends on the culture.
01:00:50.800 But yes, that is heroism.
01:00:56.120 That's, I mean, the key thing about the path of the hero is they have setbacks.
01:01:01.260 You press the lever 10 times, and you don't get the food pellet.
01:01:05.320 And what the dopamine system is about is then saying, I'm going to press the lever twice as much, 10 times as much, more fervently.
01:01:14.640 I'm going to cross my toes.
01:01:15.880 I'm going to wear my lucky socks and underwear.
01:01:18.320 I'm going to chant, you know, ritualistic whatever, orthodoxy, because I'm willing to come back and try even harder.
01:01:26.200 And then you surmount your setback, and that's your path of the hero, and, you know, that's what dopamine is doing there.
01:01:37.080 That's why you don't give up at the first setback, and that's why ultimately getting a reward predictably every single time you press the lever gets boring after a while and gets...
01:01:52.280 Yeah, well, it shows you that there's nothing left to discover, eh?
01:01:55.880 So that's interesting, because you imagine if the optimal garden is one that's fruitful, but where the possibility of more future fruit also lurks,
01:02:08.240 then when it's reduced to merely being fruitful, there's an element of it that's dull, right?
01:02:15.820 Because there's no more future possibility.
01:02:18.100 There is predictability, and that's fine.
01:02:20.960 It's better than privation.
01:02:23.460 But it's not as good as an infinite landscape of future possibility.
01:02:28.120 Right, right.
01:02:29.040 So, you know, Dostoevsky...
01:02:31.220 Oh, sorry, go ahead.
01:02:32.420 If, in addition, not only does sticking it out get you more mastery and eventually the almost becomes definite and all that,
01:02:40.420 but you're also set up so that your sense of self becomes more solidified because you're sticking with it,
01:02:49.020 because of your metaphorical ability to look at yourself in the mirror and all,
01:02:54.660 if that's an added layer of what you've been, like, acculturated into, whoa, that's...
01:03:02.780 Yeah, yeah, you bet.
01:03:03.960 That oxygen...
01:03:04.520 Hey, so there's an analogy there.
01:03:06.160 There's an analogy there with what you might describe as the, what would you say, the admirability of fair play.
01:03:15.140 So imagine that you have a son who's playing a hockey game or a soccer game, and he's like, he's the star.
01:03:22.800 But then when he scores a goal, he celebrates a little too narcissistically, and he hogs the ball on the field, right?
01:03:29.560 And then if his players, his fellow players make a mistake, he gets pissed off and has a little tantrum.
01:03:34.220 And you take him off the field and you say, look, kid, you know, it doesn't matter whether you win or lose.
01:03:40.680 It matters how you play the game.
01:03:42.360 And he says, what the hell do you mean?
01:03:44.580 I'm clearly the best player on the team.
01:03:46.620 If people send me the ball, I score, we win.
01:03:50.240 I'm not passing the ball to these losers because then we lose.
01:03:53.160 What the hell are you talking about, dad?
01:03:56.520 And you don't know what to say, but what you should say is, look, kid, the reason it doesn't matter whether you win or lose, but how you play the game is because life is a sequence of never-ending multiple games.
01:04:06.900 And you're a winner if people want to play with you, and if you're a little prick when you win any given game, and if you whine and complain because you've lost, even if you're an expert at that game, no one's going to want to play with you, and you're a loser, right?
01:04:20.320 And I think that's analogous to, I think it's analogous in a very profound sense to that prefrontal maturation that puts the future above the present.
01:04:29.700 But I also think it's analogous in a deep way to the pattern of behavior that we talked about, and I don't know exactly why this is, but I know it's there somewhere, that's characterized by this wanting to be in the place where future reward beckons as well as present reward.
01:04:47.720 You know, those things are going to stack.
01:04:49.200 They have to stack on top of each other, right?
01:04:51.620 Because otherwise there's going to be an intrinsic contradiction in the ethic.
01:04:54.920 So there has to be a concordance between that fair play ethos and that exploratory ethos.
01:05:03.580 Maybe it is, maybe that's in play, right?
01:05:05.520 If you're a good player and you're out there on the field, you're not just trying to score the goal.
01:05:10.460 You're also trying to play with various ways of scoring the goal.
01:05:13.380 You're playing with your teammates.
01:05:14.980 And so maybe it's in that play that you optimize exploration plus reward seeking at the same time.
01:05:23.160 And you do that communally.
01:05:24.500 And maybe that's signaled by the system of play.
01:05:27.100 You know, Jack Panksepp, the other thing he did that's so damn cool is Panksepp outlined the neurocircuitry of play.
01:05:33.380 He was the first scientist to do that, to show that there's actually a separate circuit in mammals for play.
01:05:38.460 And so, and it's not exploration exactly, right?
01:05:43.540 It's not exactly the same circuit that mediates exploration, but it's allied with it.
01:05:48.740 So I don't know how that fits into dopaminergic reinforcement, but I know that play is intrinsically reinforcing.
01:05:54.560 So, well, there are sort of two threads from obviously completely different universes showing the power of exactly the point you bring up, which is that with multiple games and multiple players in formal game theory, you foster cooperation.
01:06:14.860 If there's third-party punishment, if you can be punished for failing to be a third-party punisher, all these different layers.
01:06:21.380 But one of the things that really, really chooses and selects for cooperation is if people have the option to opt out of playing with you.
01:06:32.320 Yeah, yeah.
01:06:32.840 That's freedom of association, man.
01:06:35.160 That's why that's a fundamental freedom.
01:06:37.300 Exactly.
01:06:37.900 And every mother is a good game theorist in that regard when she's saying, if you do that, you won't have any friends.
01:06:44.520 Like, that's incredibly, like, that's one of the best lessons your dopamine system can get, either from game theory or from your mother: that the long-term goals look very different when you're simultaneously involved in umpteen different games at once with very different time courses.
01:07:03.620 Yeah.
01:07:04.720 Well, that's also relevant to that bat story you told, eh, because one of the things I've been thinking about, too, so there's a gospel phrase that says that you should store up your treasure in heaven.
01:07:14.000 And not where rust and moths and so forth can corrupt it on earth.
01:07:18.920 And so, and here's what it means, as far as I can tell, and I want you to tell me what you think about this in light of our conversation.
01:07:24.660 So, the bat that has the pouch full of blood has that blood right then and there, and that's a form of treasure.
01:07:34.040 Now, the problem with that blood is that it's a finite resource, and hunting, which is what the bats are doing, is sporadically successful.
01:07:44.820 So, even if you're a great hunter, and this is true with hunter-gatherer tribes for human beings, even if you're the best hunter, you're going to fail a fair bit of time when you're out, especially if you're on your own.
01:07:54.140 Hunting is collective, and your success is erratic.
01:07:57.200 So, even if you're a great hunter, then you might say, well, what would make you the best of all possible hunters as far as your family was concerned?
01:08:05.800 And that wouldn't be your skill at hunting.
01:08:07.820 It would be your skill at distributing the fruits of your hunting among the other hunters.
01:08:13.300 So, they're so goddamn thrilled with what a wonderful guy you are, that every time they hunt, you get some food for your family.
01:08:20.880 And so, what you do is you store your treasure in your reputation.
01:08:25.440 And your reputation is actually the open book record of your reciprocal interactions across hunts.
01:08:32.520 Right?
01:08:32.800 So, you know that.
01:08:34.400 Go ahead.
01:08:35.800 Open book.
01:08:36.680 That's a small community.
01:08:38.620 If you're the one who hangs back and pretends to have to tie your shoes right at the scariest part of the mammoth hunt, they're going to know about it.
01:08:47.860 People are going to be talking about it over the fire.
01:08:50.080 Open book.
01:08:51.920 And with the agricultural transition and industrialization, one of the biggest consequences is that you can have anonymous interactions.
01:09:03.080 You lose all the open-book enforcing of reciprocity because you're anonymous.
01:09:15.500 You can get away with it.
01:09:16.820 But in like a setting like that, that's absolutely the constraining thing.
01:09:22.160 And, you know, what's the term?
01:09:24.380 Among hunter-gatherers, the best insurance is somebody else having a full stomach.
01:09:30.100 Yeah, right.
01:09:30.700 Precisely.
01:09:31.320 Precisely.
01:09:31.840 Well, then you use, well, so then you use other people's bodies as your bank of future food.
01:09:38.380 But even more abstractly, it isn't even their bodies.
01:09:41.180 It's their mental representation of you as a reciprocal player.
01:09:45.360 And so if that's associated with, imagine that's a reputation.
01:09:49.240 So that's actually associated with your ethos and with the tracking of that ethos.
01:09:54.080 And if that ethos is something like generous, long-term-oriented, sacrificial player of multiple reciprocal games,
01:10:02.880 then all of a sudden you're protected against the exigencies of fate.
01:10:07.220 Because even if there's local failure in the food supply, people are so thrilled about your generous reciprocity that you're going to be provisioned even under the worst of all possible circumstances.
01:10:18.780 So, you know, those economic exchange games where you identify two people, you say, look, you're going to give this person some of $100, but they can reject the offer if they don't believe it's fair.
01:10:32.800 You play those cross-culturally, and the typical offer is $50, right?
01:10:37.060 It's about $50, $50.
01:10:39.160 But, you know, I've wondered, too, if the best offer isn't $60, especially if you're doing it in front of a crowd.
01:10:47.080 Because if you, imagine you err, and the best graduate supervisors do this, by the way.
01:10:52.800 If you err continuously slightly on the side of generosity, then my suspicion is that the accruing long-term reciprocal reward would pay off better than just a 50-50 arrangement, right?
01:11:07.940 And you can maybe see that with your, yeah, yeah, exactly that.
01:11:11.160 Well, I think you see that with your wife, too, right?
01:11:13.920 Is maybe you want to treat the people around you slightly better on average than they treat you.
01:11:19.380 Because that way you're doing this.
01:11:22.160 You're making the whole pie expand, including your own reputation.
01:11:26.840 Then some interesting cultural stuff comes in, because they've done all sorts of cross-cultural studies of, like, ultimatum gameplay and all of that.
01:11:36.080 And you see tremendous cultural differences in whether it's 50-50, 60-40, 90-10.
01:11:41.980 Then you see there's a handful of cultures out there where you get punishment of generosity.
01:11:53.220 Somebody makes what is viewed as an overly generous offer, and you punish them for it.
01:11:59.280 Oh, my God, what is that about?
01:12:01.800 And that's this, like, pathological sort of retribution sort of thing.
01:12:08.160 You're punishing them, because if they get away with being generous like that, people are going to start expecting you to do this.
01:12:15.780 Yeah, yeah, yeah.
01:12:16.680 I see that in families that are pathological all the time.
01:12:19.800 If someone makes a positive gesture, they'll get punished to death because of what that implies for the potential future behavior of all the other miscreants.
01:12:27.640 And what are those cultures? Like, some of the ex-Eastern bloc countries, God help you if you wind up being part of one of those, have the highest rates of this paradoxical punishment for generosity.
01:12:44.160 Oh, this guy's just going to make us look good.
01:12:46.580 And then everybody's, whoa, that is a troubled society.
01:12:51.640 Well, that's a vision of hell.
01:12:54.500 That's for sure.
01:12:55.100 Where you're punished for, that's what Nietzsche said about punishment.
01:12:58.200 It's such a brilliant line.
01:12:59.520 He said, and it was, look, if you're punished for breaking a rule, there's actually a form of relief in that, eh?
01:13:07.460 Because when you're punished for breaking a rule, that validates the entire rule system, and that's what you use to predict the world.
01:13:14.540 So there's a relief in being justly punished.
01:13:17.460 So what Nietzsche pointed out was if you really want to punish someone, you wait until they do something virtuous.
01:13:23.520 And they punish them for that, right?
01:13:25.120 And that's a good definition of hell.
01:13:26.760 Hell is the place where people are punished for doing what's truly virtuous.
01:13:30.940 Yeah, and like you said, you don't want to be in a society like that.
01:13:34.080 Maybe that's not as bad as it gets, because things could get pretty bad, but it's pretty bad.
01:13:40.860 Well, that's a pretty good predictor of societies with incredible rates of child bullying and spousal abuse and substance abuse and social capital that's gone down the drain.
01:13:53.120 And that's what those cultures are like.
01:13:57.980 Yeah, that's a pretty bad world in which generosity is explicitly and enthusiastically punished by the crowd of yahoo peasants who arrive with pitchforks at that point.
01:14:12.400 Yeah, you know, one of the things that I've talked to my clinical clients about and my family members, too, and a little bit more broadly lecturing, maybe it has to do with this initiation of an expanding and abundant tit-for-tat reciprocity, is that if you're really alert in your local environment, you can see people around you playing with the edge of additional generosity.
01:14:38.560 So they'll, people will make these little offerings, that's a good way of thinking about it, where they just go out of their way a little bit in a sort of secretive manner, you know, they'll sort of sneak it.
01:14:49.880 It's like a student who writes you an essay and dares to sneak in one original thought just to see what the hell happens, you know, but if you jump on that and you notice and you reward people for staying on that edge where they're being a little more generous and productive than they usually are,
01:15:08.560 you can encourage people around you to get, to be just doing that like mad and they like you a lot for it, too, because actually people are extremely happy when they're noticed for doing something that puts them on the edge of that generous expansiveness and then rewarded for it.
01:15:25.220 So even if you're not in a society that punishes that, you can actually act as an individual to differentially reward it.
01:15:31.880 That's what a good mentor does.
01:15:34.680 And it's always a cost-benefit analysis of how much am I willing to incrementally risk to start ratcheting things even further.
01:15:43.640 That's exactly it.
01:15:44.900 One of the most fascinating wrinkles in terms of like accounting for like the world's miseries and stuff is when you think about like dopamine, what are the things we anticipate?
01:16:00.960 Well, if you're a baboon, and I spent like 33 years of my life studying baboons in the wild during summers, if you're a baboon, your world of pleasures and anticipation is pretty narrow.
01:16:15.060 Like you get something to eat that you want, you get to mate with someone that you want, or you're in a bad mood and there's somebody smaller and weaker who you can take it out on with impunity.
01:16:26.780 Like that's basically the realm of pleasures for a baboon.
01:16:30.960 And then you get to us, and we have all that, but we also have like liking sonnets, and we also have taking cocaine, and we also have solving Fermat's last theorem.
01:16:43.680 And we also have, you know, we've got this ridiculously wide range of pleasures.
01:16:52.040 Like we can, we're the species that can both secrete dopamine in response to cocaine or winning the lottery or multiple orgasms, and also secrete dopamine in response to smelling the first great flower in spring.
01:17:06.360 And it's the same dopamine.
01:17:07.360 And it's the same dopamine neurons in all those cases.
01:17:12.020 And what that means is, we have to have a dopamine system that can reset incredibly quickly.
01:17:19.920 Because some of the time, going from zero to 10 on the dial is, you've just gone from no nice flower smell to nice flower smell.
01:17:30.320 And some of the time, going from zero to 10 is, you've just like conquered your enemies and gone over the Alps with your elephants or something, and this is fabulous.
01:17:40.360 We have to constantly be able to reset the gain on our dopamine system.
01:17:45.920 Well, you point to something else there that's really cool, too, is that, so now you could imagine a garden that has fruit in it, and then you could imagine a garden that could even have more fruit in it.
01:17:57.780 But then you could imagine refining your taste so that you can now learn to take pleasure in things that wouldn't have given you pleasure before.
01:18:06.560 That's what artists do, eh?
01:18:08.900 Is they offer people a differentiated taste.
01:18:12.320 So, you know, if you think of a landscape painting, it's like there are certain visual scenes now that we regard as canonically beautiful.
01:18:20.720 But it's virtually certain, I mean, I know there's an evolutionary basis to that to some degree, but it's virtually certain that our taste for beauty is at least in part informed by the brilliant geniuses of the past,
01:18:34.000 who are able to differentiate the world more and more carefully and say, look, here's actually a new source of reward, right?
01:18:42.020 People do that when they invent a new musical genre or a new form of dance, right?
01:18:47.200 So not only can we multiply the rewards indefinitely if we're pursuing the proper pathway,
01:18:52.380 but we can differentiate the landscape of potential rewards, I would say, virtually indefinitely.
01:18:59.360 Now, that would be part of that prefrontal flexibility that can modify our underlying limbic responses, too.
01:19:05.040 Even though we're, you know, running down the same dopaminergic trackways, let's say, that the poor baboons run down.
01:19:10.740 Which is totally cool and so human and all, but has like this massive tragic implication,
01:19:19.500 which is the only way you could use the same dopamine neurons and same dopamine range from zero to maxing out
01:19:27.220 for like both haikus and like lottery is the system resets.
01:19:33.980 It's got to keep resetting as to what the scale is and what the gain is on the system.
01:19:38.640 What that means is it constantly resets.
01:19:42.740 It constantly habituates.
01:19:45.900 And what that means is like the most tragic thing about the human predicament.
01:19:52.060 Whatever was a great unexpected reward yesterday is going to feel like what you're entitled to today
01:19:59.920 and is going to feel insufficient tomorrow.
01:20:03.260 So, Dostoevsky, in Notes from Underground, he wrote one of the world's most compelling critiques
01:20:12.580 of, what would you call it, satiating utopianism.
01:20:17.860 So, Dostoevsky said, essentially, if you gave people everything they wanted,
01:20:23.020 nothing to do but eat cakes, lie around in pools of warm water,
01:20:26.820 and busy themselves with the continuation of the species.
01:20:29.540 So, sort of ideal baboon life that people would purposefully, eventually, purposefully rise up
01:20:36.620 and just smash all that to hell just so something interesting would happen
01:20:40.840 because that's the sort of crazy creatures we are.
01:20:43.820 But, you know, you said that's a tragedy and you can understand that, right?
01:20:47.700 Because it means that today's satiation is tomorrow's unhappiness.
01:20:51.600 But, by the same token, it's also the enabling precondition for the impetus to discover new
01:20:59.600 landscapes of reward and new forms of reward, right?
01:21:02.300 Because if you didn't habituate to what you already had, you'd, well, I think you'd fall
01:21:07.680 into a kind of infantile satiation and maybe you'd just fall asleep, right?
01:21:12.240 Because if you're completely, this is the difference between satiation and incentive reward.
01:21:16.940 If you're satiated, then you just fall asleep.
01:21:19.860 Right. Consciousness isn't for satiation.
01:21:22.980 Consciousness is for expansion, something like expansive exploration.
01:21:27.280 If we didn't habituate to reward, we would just satiate and then we wouldn't need to be conscious.
01:21:33.040 It's something like that.
01:21:34.180 I mean, this is a huge, like, half-full, half-empty thing.
01:21:40.060 And we're the species that's always hungry because yesterday's excitement is not enough tomorrow.
01:21:51.320 And that means it's never going to be enough.
01:21:54.340 And we're the species that yearns in that way and is never satisfied.
01:22:02.140 And thus, among other things, we're the species that then invents, you know, technology and poetry
01:22:10.060 and the Macarena and wheels and fire and everything.
01:22:15.520 Yeah.
01:22:16.500 It's like, it's this double-edged...
01:22:19.840 Okay, so I'm going to go back to this Abrahamic story because it's very interesting in this regard, right?
01:22:27.000 We talked about it already in relationship to the possibility of a particular ethos coming to dominate an evolutionary landscape.
01:22:33.920 But something very interesting happens at the beginning of the Abrahamic story.
01:22:38.800 And Abraham is the father of nations.
01:22:40.700 So this is a good classic narrative example.
01:22:44.460 So Abraham is actually fully satiated at the beginning of that story because he's like 75 and he has rich parents.
01:22:52.240 And all he's done his whole life is, like, lay in a hammock and eat peeled grapes.
01:22:56.000 And, like, he has everything he needs, absolutely everything.
01:23:00.500 And then this voice comes to him and says, this isn't what you're built for.
01:23:05.740 You should get the hell out there in the world, right?
01:23:08.540 And Abraham hearkens to that voice, so to speak.
01:23:12.180 He leaves his satiated surrounding and he goes out into the world.
01:23:16.040 And actually, what happens is quite catastrophic.
01:23:19.600 It's certainly not a simple comedy, the story,
01:23:23.240 because he encounters war and famine and Egyptian tyranny and the aristocrats conspire to steal his wife.
01:23:32.280 And he's called upon by God to sacrifice his only son.
01:23:37.940 And it's like, it's quite the bloody catastrophe.
01:23:40.040 But the idea in the story is that the path of maximal adventure is better than the path of infantile satiation.
01:23:49.960 And so you might say, human beings are eternally dissatisfied.
01:23:53.700 I mean, that's one way of looking at it.
01:23:55.440 Or you could say, well, there's an abstract form of meta-satiation.
01:24:00.480 Let's put it that way.
01:24:01.840 That's the same as being on, it's like a bloodhound being on the trail.
01:24:06.300 It's the pleasure of the hunt.
01:24:07.840 It's the pleasure of the adventure.
01:24:09.240 It's the pleasure of that forward-seeking, right?
01:24:11.920 And I like to think about it like Sisyphus, you know,
01:24:14.060 except that what Sisyphus is doing is pushing a sequence of ever-larger boulders up,
01:24:19.060 a sequence of ever-higher mountains.
01:24:21.100 It's not the same.
01:24:22.120 It's, you know, it's this continual movement upward towards some unspecified positive goal.
01:24:27.340 And then the ultimate satiation isn't the top of any of those mountains.
01:24:31.340 It's the sequential journey across that sequence of peaks.
01:24:36.040 And I suspect that's what that dopamine system is actually signaling when it's,
01:24:41.080 because that would make sense with regards to anticipation.
01:24:44.100 It's the happiness of pursuit rather than the other way around.
01:24:50.600 And that's incredibly addictive in that regard.
01:24:55.120 You know, you can't get rats in a normative social environment addicted to cocaine.
01:25:02.900 You have to put them in a, you have to isolate them in a cage.
01:25:06.040 So if you have a rat that's going about his rat business, you know,
01:25:09.340 he's got his rat friends and his rat family and his rat adventures,
01:25:12.760 he won't succumb to cocaine like an isolated rat in a cage.
01:25:17.240 So one of the things that's also worth contemplating,
01:25:21.260 and this is relevant to your last book and maybe your next one,
01:25:24.780 is that because you're looking for a solution to something like the human propensity for violence,
01:25:29.500 you know, you might say, well, if we're not on the true adventure of our life,
01:25:34.880 which would be signaled by optimal dopaminergic function, let's say,
01:25:38.220 then we're going to look for all sorts of false adventures.
01:25:41.160 And some of those false adventures are going to be addictive.
01:25:43.920 And some of them are going to be downright pathological.
01:25:45.240 You know, you talked about the baboons who take pleasure in pounding the hell out of this,
01:25:50.080 you know, the, the weak guy that's sitting beside him.
01:25:52.660 It's like, if you're not on the track with your nose to the ground,
01:25:57.160 optimizing the firing of those exploratory and playful dopaminergic circuits,
01:26:01.360 you're going to be searching everywhere for a false adventure.
01:26:03.960 And that can come in all sorts of pathological forms.
01:26:07.960 And often like one of the falsest ones is getting what you were yearning for.
01:26:14.600 Yeah, right.
01:26:15.240 In terms of that, I mean.
01:26:17.780 Why, why do you say that?
01:26:19.340 Why do you say that?
01:26:20.380 Why did that come to mind?
01:26:23.260 Because.
01:26:27.160 Like, may, may, may you live in interesting times.
01:26:31.940 Right.
01:26:32.580 One of the greatest curses you can place on someone is to give them precisely what they've always thought they wanted.
01:26:39.620 Yeah.
01:26:40.620 And things get a little more nuanced than that.
01:26:46.640 I mean, I love Borges's stories.
01:26:51.180 There's one, "The Immortal," where a traveler is journeying through the deserts and the jungles and all of that,
01:27:00.640 searching for this mythic tribe of immortals.
01:27:03.300 And he eventually finds them because they found this river that you drink from it and you're immortal.
01:27:08.260 And they've been immortal.
01:27:09.940 And how cool is that?
01:27:11.660 And they're perpetually on the move because what they're doing is they're now looking for the fabled river that will give them mortality.
01:27:21.720 Immortality turned out to be a total drag.
01:27:24.200 And they're going out of their minds with how pointless this all is.
01:27:27.240 So, this is their new quest because it turns out, like, what they wanted wasn't quite what they really wanted.
01:27:34.200 Well, you know, there's an old Jewish story about God.
01:27:37.560 It's a koan.
01:27:38.180 It's like a Zen koan, except it was the ancient Jews that came up with it.
01:27:43.600 What does the God who is omniscient, omnipresent, and omnipotent lack?
01:27:50.620 And the answer is limitation.
01:27:53.100 Yeah.
01:27:53.300 And so, one of the corollaries of that, and this is why God and man are, in a sense, twins, is that the absolute lacks limitation.
01:28:01.240 And so, for there to be totality, the absolute has to be paired with limitation.
01:28:06.040 And that's because limitation has advantages.
01:28:08.820 It's very paradoxical, eh?
01:28:10.120 That limitation has advantages that totality lacks.
01:28:13.420 And you can see that even in the creativity literature because the creativity literature shows quite clearly
01:28:18.340 that creativity is enhanced by the placing of arbitrary limitations.
01:28:23.780 Like, there's an archive online.
01:28:26.500 This is very funny.
01:28:27.680 There's an archive online of haiku devoted to nothing but the luncheon meat Spam.
01:28:36.280 There's like 50,000 haikus written about spam.
01:28:39.220 I think, of course, MIT engineers set this up because, of course, they would.
01:28:43.520 But it's such a comical example because it shows you that, paradoxically, when you impose limitations,
01:28:50.000 and that might even include the limitations of mortality, that you produce a plethora of creative consequences emerging out of that.
01:28:57.820 And it isn't obvious, and this is what you were pointing to, it isn't obvious that if you transcended that,
01:29:04.220 absolutely, that you would be better rather than worse off.
01:29:08.140 I mean, it's a tricky question because we're always looking to be healthier and to live longer, and no wonder.
01:29:12.740 But there is something to be said for limitation.
01:29:15.780 And the fact that you have to transcend that in an adventurous manner, right, it gives you,
01:29:20.680 maybe life is the game that a particularly daring God would play, you know, because it has an infinite cost.
01:29:28.040 That's death.
01:29:28.820 And God only knows what that enables.
01:29:31.140 At the same time, it constrains.
01:29:33.660 I mean, so what's it like working with baboons, sir?
01:29:37.640 I mean, they seem like a particularly dismal primate species.
01:29:40.660 So what's it been like spending the time out there in the baking sun,
01:29:46.220 watching these, like, pretty brutal animals go at each other for 30 years?
01:29:50.000 They're, they're perfect.
01:29:54.640 They're perfect for what I study.
01:29:56.360 My roots as a scientist were as a stress physiologist, trying to understand what stress does to the brain,
01:30:05.660 not good things, what stress does to vulnerability to mental illness, not good things,
01:30:10.700 what stress does to your body, all sorts of stuff.
01:30:14.060 And what stress does depends on who you are and your society and social rank and all of that.
01:30:20.660 So in my lab, I spent forever studying the effects of stress on molecular biology of neuron death and all that.
01:30:29.060 But out in the field, it was, okay, trying to make sense of these baboons.
01:30:34.360 Who's got the rotten blood pressure?
01:30:36.600 Who's got the bad cholesterol levels?
01:30:38.460 Who's got the immune system that isn't working?
01:30:40.280 What does it have to do with their rank and patterns of social stress and patterns of affiliation, and basically the whole health profile, if you're a baboon?
01:30:49.160 And why them?
01:30:52.020 They were the perfect species to study because they're out in the Serengeti, which was my field site, which is an amazing ecosystem.
01:31:01.940 Like, if you're a baboon, you live in these troops of 50 to 100 animals or so out in the savannah, and nobody messes with them.
01:31:10.220 Once a year, maybe, a lion picks off someone.
01:31:13.860 Most of the time, nothing can touch them.
01:31:15.240 Infant mortality is lower than among the neighboring humans.
01:31:20.340 And you only spend three, four hours a day doing your day's foraging.
01:31:23.960 And what that means is you've got like eight, nine hours of free time every day to devote to generating psychological stress for everybody else.
01:31:33.740 They're exactly like us.
01:31:36.320 None of us get ulcers because we're fighting for canned food items in bombed-out supermarkets.
01:31:42.660 We have this luxury of generating psychosocial stress because we're westernized privileged humans.
01:31:47.960 And baboons are one of the only other models out there because they've got nine hours of free time every day.
01:31:54.640 And if you're a baboon and you're miserable, it's because another baboon has worked very intentionally to bring that about.
01:32:01.820 They're all about psychosocial stress.
01:32:04.500 Like, bloody in tooth and claw has nothing to do with them.
01:32:07.740 It's all psychological; they're just, like, awful to each other.
01:32:11.500 They're perfect models for westernized psychosocial stress.
01:32:15.660 So they're not nice guys.
01:32:18.940 Like I did not grow to love a whole lot of them over the decades.
01:32:23.620 But wow, they're Machiavellian backstabbers.
01:32:28.340 Their highest calling in life is to make some other baboon miserable.
01:32:34.960 Right, right.
01:32:35.760 So communal psychopaths.
01:32:36.660 Psychopaths.
01:32:37.300 So you did point out in your book that you studied a baboon troop where, because of a historical accident, there was a plethora of females.
01:32:52.420 And then that took a lot of the competition stress away from the males and they actually started to become more civilized.
01:33:00.240 And so I have two questions about that.
01:33:02.200 It's like, why did the baboons take the psychopathic prick route on the evolutionary highway?
01:33:08.400 And what does the fact that that's modifiable, and it's quite strange, really, that it's modifiable,
01:33:19.120 What does that have to say, let's say about free choice in the baboon world, about whether or not it's necessary to organize your whole society on the grounds of, you know, tit for tat psychopathy?
01:33:30.460 It tells you it takes some pretty special, unique circumstances to jumpstart cooperation past all the barriers.
01:33:44.260 Right, right.
01:33:45.220 Okay, you can have one person who's willing to gamble and show a bit of vulnerability to see if somebody reciprocates, or you can have a founder effect of an inbred cooperating group,
01:33:56.060 or you can have, you know, a whole bunch of ways of jumpstarting it, but then you get a totally quirky, unpredictable event, which was the thing that happened with my baboon troop.
01:34:12.620 This was a troop my wife and I studied for years, and they had an ecological, unprecedented disaster thing that happened at one point.
01:34:23.500 There was an outbreak of tuberculosis, not among my baboons, but among the neighboring baboons.
01:34:30.760 One troop over, a troop that was living off of the garbage dump at a tourist lodge, and it was tubercular meat coming from the tourist lodge.
01:34:42.480 And tuberculosis in humans, you know, takes so long that Thomas Mann would have enough time to write hundreds of pages of a novel before TB kills somebody.
01:34:51.700 TB kills a non-human primate in a couple of weeks.
01:34:55.040 It's like, it's a wildfire in terms of how destructive it is.
01:34:58.860 So you had this neighboring troop that had, you know, pig heaven.
01:35:05.720 They had this garbage dump from a tourist lodge, and every day a tractor came and dumped all the, like, leftover desserts and stuff from the tourist dinners and banquets.
01:35:15.500 So they were living off of that.
01:35:17.120 I actually did some studies on that troop and showed they were starting to get metabolic syndrome.
01:35:21.540 They had elevated cholesterol, they had borderline diabetes, like, yeah, like us, the same, but they had better infant survival.
01:35:32.660 The same pluses and minuses of, like, a westernized, overly indulgent diet.
01:35:37.320 But they had the greatest spot on earth, and every morning a subset of my guys would go over there to try to eat the food, and they'd have to fight their way in against, like, twice as many resident males who were pissed off at this outsider coming in.
01:35:57.440 These were only the most aggressive males in my troop who were willing to go and spend their mornings trying to fight for the garbage next door.
01:36:05.320 Mm-hmm, mm-hmm.
01:36:35.320 It was the most aggressive, jerky, least socialized 50%. Some of them were high-ranking, but some of them were, like, hyper-androgenic, jerky adolescent males who were spending all day starting fights they couldn't finish.
01:36:49.820 It wasn't just a rank thing.
01:36:51.200 You didn't lose the dominant 50%.
01:36:52.660 Right, right.
01:36:53.220 You lost the 50% with the aggressive, unsocialized personalities, and that left, like, a completely different cohort of males.
01:37:06.220 It left you twice as many females as males, for one thing, which you don't normally see in a baboon troop.
01:37:12.020 So all these females suddenly had a whole lot to gain from not having male baboons doing the jerky displacement aggression that characterizes them, where they're in a bad mood and, if you're a smaller female, watch out.
01:37:27.420 But most of all, the guys who were left were nice guys.
01:37:32.000 They were socially affiliated.
01:37:34.760 They didn't take it out on someone smaller.
01:37:38.260 They still competed for rank, but they weren't displacing aggression on innocent bystanders at anywhere near the rate.
01:37:44.580 And this brought an entirely new culture into the troop, which was great and totally amazing, and isn't that cool?
01:37:54.500 And what was also cool was stress hormone levels, which is what I was able to study, and these guys' levels were way down, and their immune systems were working better.
01:38:04.080 Yay, baboon utopia, all of that.
01:38:06.460 So at that point, reality intervened, and I couldn't look at that troop for about a decade, game park politics or whatever.
01:38:17.500 But a decade later, I was finally able to get back to this troop, and it was the same culture, the same wonderful culture.
01:38:23.680 Wow, wow.
01:38:24.740 And not with any of the same males.
01:38:27.300 So that's another example in principle of how cooperation could initiate, right, is that you could have a circumstance at one point where the real pricks get wiped out for somewhat random reasons, and then you get a cooperative community starting.
01:38:43.000 You know, I've also read, and I don't remember who wrote about this, who suggested that over time, human beings, we really domesticated ourselves by using third-party enforcers to wipe out most of the psychopathic males.
01:38:59.240 And that also might have been a contributor to the initiation of something like a cooperative tit-for-tat reciprocating community.
01:39:06.720 Exactly, exactly, and long before we figured out that you pay third-party enforcers by hiring them as police or something, third-party enforcers gain prestige and trust.
01:39:18.520 Yeah, yeah.
01:39:19.240 And presumably, that's the payoff for it.
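The tit-for-tat reciprocation discussed here is the strategy from Axelrod's iterated prisoner's dilemma tournaments: cooperate on the first move, then mirror whatever the opponent did last. A minimal sketch of how it plays out, using the standard textbook payoff values (the numbers are illustrative, not taken from the conversation):

```python
# Iterated prisoner's dilemma with the standard textbook payoffs
# (temptation 5, mutual cooperation 3, mutual defection 1, sucker 0).

PAYOFF = {  # (my_move, their_move) -> my points; 'C' cooperate, 'D' defect
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate on the first round, then copy the opponent's last move."""
    return 'C' if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return 'D'

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b = [], []  # what each player has seen the *other* do
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(hist_a), strategy_b(hist_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # -> (30, 30): mutual cooperation
print(play(tit_for_tat, always_defect))  # -> (9, 14): exploited once, then it retaliates
```

Tit-for-tat never beats its opponent head to head, but across a population of strategies it accumulates more points than chronic defectors, which is roughly the result Axelrod reported.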
01:39:22.820 But the thing that was most remarkable there is that male baboons grow up, obviously, in their home troop, and around puberty they get totally itchy, they get ants in their pants, and they pick up and leave.
01:39:36.540 And they transfer to their adult troop, which could be next door, could be 60 miles away.
01:39:42.600 They wind up being this, like, snivelly little parasite-riddled kid who shows up and faces five years of working their way up the ranks and all of that.
01:39:51.800 And so it's this transfer business.
01:39:54.060 A decade later, when I went back to look at this troop, all of the males who had been there at the time of the TB outbreak and survived it because of their personalities had long since died.
01:40:08.200 All of the adult males were ones who had joined the troop since then as adolescents.
01:40:14.620 They had joined in.
01:40:16.820 And they were still civilized.
01:40:18.400 And they had learned, we don't do stuff here like that.
01:40:23.000 Wow, that's amazing.
01:40:24.280 That's really amazing.
01:40:25.520 Cultural transmission.
01:40:27.260 And what became, like, so damn interesting to look at is, how were they doing it?
01:40:34.940 How were they transmitting this culture?
01:40:37.200 And the best we were able to figure out, it wasn't observational.
01:40:45.080 It wasn't that these new horrible kids show up and just watch all these other male baboons being nice, because there was zero evidence for observational learning as the mechanism of that sort of cultural transmission.
01:40:58.200 Whoever discovers that is going to be, like, the king of non-human culture stuff.
01:41:03.220 So it wasn't that.
01:41:05.460 So then you wonder if there's self-selection.
01:41:08.540 Like, it's only the nice guys who transfer into that troop.
01:41:11.780 The males typically, they spend a few months, they check out this troop, they check out that one.
01:41:16.140 Maybe it was self-selected.
01:41:17.780 I always thought of this as the, well, who would choose to go to Reed College model?
01:41:23.860 Right, it's the hippie.
01:41:25.060 It's the hippie baboons.
01:41:26.620 Yeah, but as it turned out, when these new guys joined the troop,
01:41:30.360 they were just as aggressive and displacing as adolescents showing up in any other troop.
01:41:37.300 It was not self-selection.
01:41:40.100 And what it was, was males.
01:41:44.700 Males, adult males, were not dumping on females anywhere near as much as in the normal troop.
01:41:50.500 As a result, females were much less stressed, and their hormone levels showed this.
01:41:58.060 As a result, females were much more willing to chance a pro-social interaction, reaching out to someone,
01:42:06.380 than they would have been in a normal troop, because the odds were better.
01:42:10.720 And what you saw was in a typical troop, it would be 70 to 80 days before one of these new transfer males
01:42:17.040 would be groomed by a female in this troop.
01:42:20.100 Is that equivalent to offering a fruit?
01:42:23.100 Yes.
01:42:24.140 And in this troop, instead, it was in the first week.
01:42:28.760 Females were much more relaxed and were willing to take a chance.
01:42:32.340 And what you saw was, like, in a world in which, like, females were grooming you,
01:42:38.060 and big adult males weren't dumping on you,
01:42:40.720 and you could sit under, like, olive trees and all of that,
01:42:44.460 over the course of the first six months after the transfer,
01:42:48.220 these guys dropped the aggressiveness.
01:42:51.620 It was not an inevitable state for them.
01:42:54.820 It was a default.
01:42:56.080 They defaulted.
01:42:57.380 Yeah, yeah, yeah.
01:42:58.120 They were not stressed and dumped on, because the females weren't stressed and dumped on,
01:43:03.180 because the resident adult males were nicer guys.
01:43:06.540 There was this trickle-down decrease of stress, and they would default.
01:43:11.780 And six months into it, they were like one of the regular old, like, commune hippies there.
01:43:17.540 It was transmission.
01:43:19.480 That's insanely cool.
01:43:20.900 That's an insanely cool story.
01:43:22.640 And so positive and optimistic.
01:43:24.760 It's amazing that, you know, given the multi-generational proclivity,
01:43:29.480 let's say, of the baboon tribes to be relatively psychopathic,
01:43:33.820 it's amazing that there is that much behavioral variation left in this species
01:43:37.760 to be transformed that rapidly.
01:43:40.700 That's a single generation, essentially.
01:43:42.600 I mean, you get a bit more than one generation there,
01:43:44.880 but that's transformation within a single generation.
01:43:47.620 It's amazing.
01:43:48.880 Anyone who says, like, humans don't have that much cultural malleability hidden in them,
01:43:56.220 what, baboons are more sophisticated in their potential variety of social systems?
01:44:02.840 Anyone who says, like, humans are not capable of having a radical transformation, blah, blah,
01:44:10.040 like, if baboons can do it, so can we.
01:44:12.800 And they were literally, I studied in college with this guy, Irv DeVore,
01:44:18.400 I think you overlapped with him when you were at Harvard,
01:44:21.440 who was, like, the king of baboon field biology.
01:44:24.840 And I've been writing fan letters to him from the time I was 12 or so,
01:44:28.560 and went to study with him.
01:44:30.000 And he was the person who literally wrote the textbook about baboons
01:44:35.320 and made them the textbook example of the inevitability of stratified,
01:44:41.160 male-dominated societies with high degrees of aggression.
01:44:43.700 Right, right, right, right.
01:44:45.320 And, like, ridiculous.
01:44:47.380 The inevitability, because they go out and hunt, inevitably, aggression.
01:44:51.720 Yeah, yeah.
01:44:52.600 Patriarchy.
01:44:53.300 Evil patriarchy.
01:44:54.520 Exactly.
01:44:55.720 Dawn of man, territorial, 1960s Robert Ardrey stuff.
01:45:00.760 And, like, baboons were the textbook example,
01:45:04.540 and in one generation, it could be transformed.
01:45:09.740 That's amazing.
01:45:10.820 Then the question is, what does in that culture?
01:45:15.120 Were there vulnerabilities built into it?
01:45:18.240 Right, right, right.
01:45:19.240 Like, are they as good at defending themselves against lions, for example?
01:45:23.300 Probably, though.
01:45:24.120 You know, they probably are.
01:45:25.080 I doubt if it's that simple.
01:45:27.480 It's that you get rid of the aggressive guys and the, you know,
01:45:30.580 the hyper-aggressive guys,
01:45:31.820 because they're not exactly heroic, aggressive defenders.
01:45:34.980 They're more like impulsive psychopaths.
01:45:36.820 So I doubt very much that that would constitute a downside.
01:45:40.620 We have to stop.
01:45:41.980 We're 106 minutes in.
01:45:43.960 I don't want to stop,
01:45:45.140 because I didn't get to talk to you about stress,
01:45:47.400 which I really did want to talk to you about.
01:45:49.320 And we just barely touched on your field work.
01:45:51.660 And so maybe we would have a chance to continue this discussion,
01:45:55.540 because there's lots of other avenues we could walk down,
01:45:58.160 especially on the stress front,
01:45:59.560 because there's, like, a lot more there,
01:46:01.220 and there's more on the dopamine front, too.
01:46:03.200 I talked to Carl Friston about the fact, for example,
01:46:06.080 that dopamine also signals incremental progress
01:46:09.320 towards a valid goal and reduction in entropy.
01:46:12.480 So positive emotion signals reduction in entropy,
01:46:15.100 and negative emotion signals increase.
01:46:18.420 And that's like, you can talk about that for like five decades.
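The idea that dopamine signals incremental progress toward a goal maps loosely onto the reward-prediction-error of temporal-difference learning. A toy sketch, with all states, rewards, and parameters illustrative rather than anything from the episode:

```python
# Toy temporal-difference (TD(0)) learning: the prediction error "delta" is
# a crude stand-in for a dopamine-like signal of progress toward a goal.
# States, rewards, and parameters are all illustrative.

def td_update(V, s, s_next, reward, alpha=0.1, gamma=0.9):
    """One TD(0) update; returns the prediction error (the 'dopamine' signal)."""
    delta = reward + gamma * V[s_next] - V[s]
    V[s] += alpha * delta
    return delta

# Four states on a path to a goal; only reaching the goal (state 3) pays off.
V = {0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0}

# The first time the goal is reached, the error spikes at the final step...
print(td_update(V, 2, 3, reward=1.0))            # -> 1.0

# ...but on later passes the positive signal appears earlier along the path,
# marking *anticipated* progress rather than the reward itself.
print(round(td_update(V, 1, 2, reward=0.0), 6))  # -> 0.09
```

With repeated episodes the positive error keeps migrating back toward the start of the path, which is the usual cartoon of dopamine shifting from the reward to the cues that predict it.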
01:46:21.220 And so I would love to talk to you again.
01:46:23.720 I am going to talk to Dr. Sapolsky for another half an hour.
01:46:26.840 For those of you who are watching on the YouTube side,
01:46:29.680 we usually delve into more autobiographical issues.
01:46:32.080 So I'm very curious to know, for example,
01:46:34.000 how the hell he ended up on the Serengeti surrounded by baboons.
01:46:37.740 You know, he must have done something terrible in a previous life.
01:46:40.120 That's my theory.
01:46:41.020 So we'll find out about that when we switch over to the Daily Wire Plus side.
01:46:45.240 Thank you to the film crew here in Florence
01:46:47.240 for facilitating this conversation,
01:46:49.500 and to the Daily Wire Plus folks for making this possible.
01:46:51.960 And thank you very much.
01:46:53.220 I've been trying to get you on this podcast for a long time.
01:46:56.700 I'm a great admirer of your work.
01:46:58.340 I learned all sorts of things from you over the years
01:47:01.200 that have been extremely useful to me.
01:47:02.920 So it's a pleasure to talk to you.
01:47:04.660 And to everyone watching and listening,
01:47:07.540 thank you very much for your time and attention.
01:47:09.800 Thank you, sir.
01:47:10.780 Huge pleasure at this end.
01:47:12.300 I feel giddy with intellectual stimulation.
01:47:17.660 Hey, we got the dopamine circuits mutually entangled, man.
01:47:21.660 We'll talk very soon.
01:47:23.780 And for everyone else, bye.
01:47:25.400 And we'll see you on the Daily Wire Plus side.
01:47:27.320 Thank you.
01:47:27.380 Bye.