TRIGGERnometry - July 09, 2018


Dr Diana Fleischman on Evolutionary Psychology, Men & Women & Effective Altruism


Episode Stats

Length

1 hour and 5 minutes

Words per Minute

193.58

Word Count

12,691

Sentence Count

458

Misogynist Sentences

63

Hate Speech Sentences

51
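The words-per-minute figure above is derived from the word count and the episode runtime. As a minimal sketch, here is how such a figure could be computed; the 65-minute duration used below is illustrative round input, not the episode's exact runtime.

```python
# Minimal sketch: deriving a words-per-minute stat like the one listed above.
# The duration used in the example is an illustrative round number.

def words_per_minute(word_count: int, duration_seconds: float) -> float:
    """Return average words per minute over the given duration."""
    return word_count / (duration_seconds / 60.0)

# Example: 12,691 words over exactly 65 minutes.
wpm = words_per_minute(12_691, 65 * 60)
print(f"{wpm:.2f}")  # 195.25 for these illustrative inputs
```

With the episode's true (sub-minute-precision) runtime, the same calculation yields the 193.58 shown in the stats.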


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Hello and welcome to TRIGGERnometry. I'm Francis Foster.
00:00:13.640 I'm Konstantin Kisin.
00:00:14.740 And this is the show for you if you're bored of people arguing on the internet over subjects they know nothing about. At TRIGGERnometry, we don't pretend to be the experts. We ask the experts.
00:00:24.420 Our amazing expert guest this week is an evolutionary psychologist at the University of Portsmouth, Dr. Diana Fleischman.
00:00:32.380 Welcome to TRIGGERnometry.
00:00:33.400 Hello.
00:00:34.060 Hello.
00:00:34.720 Thank you so much for coming on.
00:00:36.240 For anyone who doesn't know what an evolutionary psychologist is, could you just tell us a little bit about who you are and what you do?
00:00:41.660 Sure.
00:00:42.400 So evolutionary psychology looks at the human mind in a similar way that you would look at any other kind of adaptations in physiology, right?
00:00:52.140 So birds have wings, and those wings are specially designed to fly at certain speeds and to catch certain kinds of prey.
00:00:58.400 And in the same way, the human mind has certain adaptations that have evolved to help us survive and reproduce.
00:01:05.300 And so, for example, I study disgust a lot, and disgust has an evolutionary history that's kind of human-specific.
00:01:12.200 And the idea there is that disgust evolved to distance us away from things that could potentially contaminate us, both toxins and disease.
00:01:20.640 And so there's a reason why things are disgusting,
00:01:22.880 and it's an adaptation to help us survive and reproduce, for example.
00:01:27.100 Isn't it true that women feel disgust more than men?
00:01:29.840 It's very true, yeah.
00:01:30.840 Women are more disgust-sensitive than men overall,
00:01:32.520 especially in the sexual domain.
00:01:34.860 So, no, I was going to say,
00:01:36.560 it's because they hate a lot of men.
00:01:38.900 That's what it is.
00:01:40.000 So, yeah, if you look at,
00:01:41.340 there's one very well-known disgust inventory
00:01:43.700 and it says that there's three kinds of disgust,
00:01:45.800 pathogen disgust, sexual disgust and moral disgust.
00:01:48.800 It's a big question as to whether or not moral disgust actually exists for real, or if it's just
00:01:53.360 a kind of extension of anger. But yeah, women are much more disgust-sensitive in the sexual
00:01:59.500 domain, so there's a huge difference. And one of the studies that I did was I looked at whether
00:02:04.540 or not... so if you make men sexually aroused, so if you have them look at pornography and
00:02:10.240 then you have them think about disgusting possible sexual scenarios, they're much less sexually
00:02:15.040 disgust-sensitive when they're sexually aroused than when they're not. Obviously, everybody knows
00:02:19.640 that from being alive. And so I did this for my PhD. I did a study where I looked at women
00:02:29.100 and we showed them pornography and then we showed them disgusting things like corpses, people
00:02:33.700 vomiting, stuff like that. Or we showed them disgusting things and then we showed them
00:02:38.360 pornography. And we had also a probe in the women's vaginas. This has escalated very quickly.
00:02:47.140 And so the probe shoots out light into the vaginal canal. And the more aroused the woman is,
00:02:55.660 the less light comes back because those blood vessels take up that light. So what we found is
00:03:00.120 that women who saw disgusting stuff first, they became less sexually aroused. But we don't find
00:03:04.420 that sexually aroused women, I didn't find, that sexually aroused women are less disgust
00:03:08.600 sensitive. So while men become less disgust sensitive when they're aroused, it doesn't
00:03:12.600 really seem that women have that same effect as much. And there's a variety of reasons
00:03:16.680 why women are more disgust sensitive than men. One is just that, you know, women have
00:03:21.300 a certain like five or six chances on average to have offspring, and they have to be choosier
00:03:26.460 about who they have offspring with. But another reason is that women are just much more likely
00:03:30.600 to get sexually transmitted infections, and I've tried to come up with a way to talk about this
00:03:35.380 delicately. It's very difficult, but women are, we have a pocket, and you don't have a pocket,
00:03:40.860 I mean, you have pockets too, but women have a pocket, and so basically if a woman has sex,
00:03:47.460 she's much more likely to get the sexually transmitted infection of her partner than she
00:03:50.700 is to pass something on to her male partner if they have regular, whatever, penis and vagina
00:03:55.120 intercourse. So we don't know how long sexually transmitted infections have been around,
00:03:59.000 but women are much more likely to get them,
00:04:01.020 and they have a much worse disease burden,
00:04:03.100 again.
00:04:03.460 So, yeah,
00:04:05.360 those are a couple of reasons why women are more disgust-sensitive than men.
00:04:08.360 And how did you get into studying disgust? As a little girl,
00:04:10.920 were you curious about evolutionary psychology?
00:04:13.300 How did you become who you are?
00:04:15.340 So many reasons.
00:04:16.140 So I'm kind of an unusual...
00:04:19.260 I'm just...
00:04:20.180 I'm a strange woman generally,
00:04:21.720 but I was kind of a tomboy growing up.
00:04:25.100 And my mother had, like, incredibly,
00:04:26.720 incredibly obsessive-compulsive disorder. She would only wash me in mineral water that she had boiled
00:04:32.540 and put alcohol in. Wow. My father had to bring home water from the shop. She wouldn't wash
00:04:37.480 me in tap water, and she was just incredibly disgust-sensitive about everything. Was this in
00:04:43.600 Brazil? Because I know that you have Brazilian heritage. Yeah, no, I was only in Brazil when I was...
00:04:46.800 yeah, when I was three months old I left Brazil. But yeah. And so my mom was really, really
00:04:51.160 obsessed with cleanliness and with contamination. And she got better over time. And then I also
00:04:56.980 spent my weekends, my dad is big into riding horses, and my grandpa was too. So I spent all
00:05:01.660 my weekends on this like filthy farm, interacting with mangy animals. And then at home, my, you know,
00:05:07.420 my living room is all white, everything was completely pristine. And so I became really
00:05:11.300 interested in individual differences in disgust sensitivity. And you often find that unpalatable
00:05:16.760 sorts of areas of psychology are the last to be, you know, they're the frontiers, basically.
00:05:23.480 And so disgust is pretty new. Nobody really wanted to study disgust until, you know... so disgust is
00:05:27.900 a relatively new area of research compared to other things like, I don't know, happiness or
00:05:32.820 fear or whatever else, you know, fluffier topics. So it was really a niche that I thought I could
00:05:38.400 get into. And it's really important for all kinds of reasons for health psychology, you know,
00:05:43.500 for sexual psychology and sexual health. And also dietary stuff, you know, people
00:05:49.180 crystallize around what they eat when they're very young, and people often can't change their diets
00:05:53.280 because of disgust, even if, you know, they'll end up dying of a heart attack if they don't change
00:05:58.160 their diets. So that's all super interesting, and what I've worked on for some time. I mean, I
00:06:03.940 work on different stuff now, but yeah. Well, we'll get into that. But we started talking about
00:06:08.020 men and women at a level that I was not quite expecting, so let's broaden it out a little bit.
00:06:13.880 We mentioned before we started the show the James Damore memo and things like that. As an
00:06:19.320 evolutionary psychologist, what can you tell us about the psychological differences between men
00:06:24.540 and women, on average, that are kind of scientifically verified? Because I think most people understand
00:06:29.460 the biological differences between men and women. Yeah, well, increasingly less so. But there
00:06:35.780 are kind of basic things that we all know. But psychologically, it seems like we're kind of always
00:06:40.260 veering between offensive stereotype and useless generalization. So what are the differences
00:06:45.500 between men and women? Um, well, so you might expect, you know, one kind of very conservative claim
00:06:50.960 you could make from an evolutionary perspective is that men and women should only differ in domains
00:06:55.040 in which they've had different selection pressures. And that doesn't sound like a big deal, but it will
00:07:00.360 make a big deal, you know. So some of the big differences between men and women are in the
00:07:04.460 sexual domain. And I studied with an evolutionary psychologist called David Buss. And he did some
00:07:10.300 really interesting stuff where he asked men and women, for example, how long would you have to
00:07:14.040 know somebody before you would reasonably be likely to have sex with them? And how many sex
00:07:19.580 partners would you like over your lifetime and things like that? And some of the biggest sex
00:07:22.720 differences are in that domain. And you also see that, you know, kind of lesbians and homosexual
00:07:29.080 men are kind of interesting in this regard too, because gay men don't really have a limit in
00:07:34.700 terms of female coyness on how much sex they can possibly have. And so you see gay men have a lot
00:07:39.240 more sex partners than straight men can reasonably have because they're not limited by women's
00:07:44.800 choosiness as much. So I think the idea that women are choosier has even come into question
00:07:49.760 recently. There was a really famous study by Clark and Hatfield back in the early 90s where
00:07:55.940 they had an attractive confederate. A confederate is like somebody who's in cahoots
00:08:00.540 with the experimenters. And they went up to a random undergraduate on the street and they said,
00:08:05.060 "Hi, I've noticed you're on campus. I'm really attracted to you. Would you either, A, go on a
00:08:11.500 date with me, B, come back to my apartment, or C, have sex with me?" And zero percent of the women
00:08:16.720 said yes to sex. And what percentage of the men said yes to sex from a random woman on the street?
00:08:21.240 100%
00:08:21.960 75%
00:08:23.280 Yeah.
00:08:23.460 I'm not surprised.
00:08:23.900 And the ones who said no...
00:08:25.180 You're surprised?
00:08:25.860 I'll be honest with you,
00:08:28.460 right,
00:08:28.680 and maybe it's my
00:08:29.660 innate Britishness,
00:08:30.800 but a woman coming up to me
00:08:32.260 who I've never met before,
00:08:33.440 I don't know her name,
00:08:34.260 and going,
00:08:34.480 "Do you want to have sex with me?"
00:08:35.960 That would be,
00:08:36.820 excuse my language,
00:08:37.980 fucking terrifying.
00:08:39.100 That would be...
00:08:42.080 I would feel my virginity
00:08:44.180 grow back,
00:08:44.680 I'll be honest with you.
00:08:45.620 So...
00:08:46.600 And they also included
00:08:48.800 men who said,
00:08:49.540 "I would love to,
00:08:50.480 but, you know,
00:08:50.840 my mother's in town or my girlfriend at home or whatever the case may be.
00:08:54.820 But something interesting is that 50% of women said yes to the date.
00:08:58.780 Wow.
00:08:59.740 Yeah, it's interesting, yeah, but very few said yes to the apartment.
00:09:03.560 And there was these people who said, you know, that if you change the rules of the game,
00:09:10.440 like if you tell women they're definitely never going to be slut-shamed,
00:09:13.280 the sex is going to be good, and they will be in no personal danger,
00:09:16.400 then you can get the averages a little bit closer together.
00:09:19.340 But certainly, like you said, you'd be terrified if a woman came up randomly to you on the street
00:09:23.960 and said, "Would you like to have sex with me?" You'd think she was some kind of bunny boiler.
00:09:26.700 And so I think men and women would similarly, you know, be afraid in that situation.
00:09:32.860 The idea that there's a huge difference there, I don't know if that necessarily makes sense.
00:09:37.180 So that's the one domain. But you also see differences in other kinds of domains. So women
00:09:42.560 have, on average, taken care of offspring much more than men have. And in some cultures, you know,
00:09:48.860 in kind of modern Western secular cultures, what's called WEIRD, that's Western, educated, industrialized, rich democracies, men and women tend to converge and take care of offspring more together.
00:10:00.700 But in most cultures, men hold their kids like whatever, 15 minutes a day.
00:10:06.100 Women hold their kids nine times as long as men do.
00:10:09.540 And they always know that they're the mother of their offspring, for example.
00:10:13.260 And so you should see a lot of differences also in domains that have everything to do with child rearing.
00:10:19.440 And there's also this difference in the kinds of roles that men and women have had in terms of hunting, gathering, and seeking out mates, warfare.
00:10:29.060 It's never been the case that a band of women have gotten together with spears and hand axes to go and kidnap the men from a neighboring group and have sex with them.
00:10:37.500 Like, that's never, ever happened.
00:10:38.780 God damn it.
00:10:40.560 That's another fantasy.
00:10:41.840 You know, this idea, there's kind of a fantasy, and I think it's, the more I've learned about kind of leftist ideology, the more I think it comes from a really good place.
00:10:53.520 It would be wonderful if human nature was so malleable that literally women could be socialized to go to war, to whatever.
00:11:01.840 I mean, that wouldn't be a great outcome, but it would be great if human nature was that malleable.
00:11:05.960 So I think it comes from a really nice place, but I don't really think that human nature, especially sex differences, are that malleable.
00:11:13.980 And so you see these kinds of differences.
00:11:16.600 So if you look at the research, what you see is that there's this kind of minimization of personality differences.
00:11:24.900 So, for example, if you look at extroversion, which is one of the big five personality factors, men and women don't really differ much on extroversion.
00:11:32.320 But if you look at the internal facets of extroversion, you'll see women are more warm than men are, and men are more assertive and sensation-seeking than women are.
00:11:41.480 And so there's this scientist named Marco del Giudice at the University of New Mexico.
00:11:45.620 He's my evolutionary psych colleague, and he did a really great paper called The Distance Between Mars and Venus.
00:11:50.800 And he said these characteristics that men and women have, they don't occur in a vacuum.
00:11:55.160 These personality characteristics, they converge and they correlate together.
00:11:58.180 And so if you look at a cluster of all the masculine personality characteristics and all the feminine personality characteristics, there actually is a big difference between them.
00:12:06.320 And that's averaging across gay men and gay women as well.
00:12:09.360 So there's still a big difference between these two groups.
00:12:12.720 And if you look at the personality factors as they are, like extroversion, agreeableness, et cetera, then you'll see smaller differences.
00:12:19.000 Or if you look at very specific things, like if you, for example, what's her name?
00:12:24.140 Cordelia Fine wrote Testosterone Rex, and she said, yes, it's true that in these personality factors
00:12:30.220 they say men are much more risk-taking than women are, but they didn't ask about certain kinds of
00:12:34.940 risk. Like, what about the risk of cooking a soufflé that you've never cooked before? I'm like,
00:12:40.500 nobody's going to kill you if you cooked a bad soufflé, right? So one thing that I've
00:12:47.220 always been fascinated with is, you know, the premise of, well, women like a bad boy. Is that
00:12:53.440 scientifically true, and if it is, why? So, um, yeah, I think that is true. But, well, it depends
00:13:04.160 on the kind of environment you live in. So there's some idea that you're actually malleable in
00:13:09.960 the, you know, in the beginning years of your life as to how much you like these kind of what
00:13:15.140 are called dark triad characteristics. So the dark triad is narcissism, Machiavellianism, and psychopathy.
00:13:21.240 And those are, you know, how great you think you are.
00:13:23.440 That's an accurate description.
00:13:27.480 Narcissists are actually considered more attractive than other people who aren't narcissists.
00:13:31.240 Really?
00:13:31.540 Probably because their narcissism is somewhat informed by their actual self, their actual mate value, what we call mate value.
00:13:37.960 So I think more attractive people are more likely.
00:13:39.680 I mean, attractiveness and narcissism do correlate.
00:13:42.560 Like, objectively, if you think somebody's more attractive, they're more likely to be narcissists.
00:13:46.520 And then whenever I fill out a narcissistic personality inventory, they're like, would things be better if you ruled the world?
00:13:51.860 I'm like, well, yes, objectively.
00:13:54.400 That's true.
00:13:56.120 So these kinds of personality characteristics.
00:13:58.120 So one idea kind of floating around is that if you are in a volatile environment, so if you're in an environment where, let's say, you don't have a father around,
00:14:06.340 then you're getting cues from the environment that either your dad's dead or that this is not the kind of environment where men invest in their kids, for example.
00:14:14.740 or also if you're in an environment where you see people deceiving each other or cheating each other.
00:14:19.780 So what's the best thing to do in an environment like that is to have babies that are also going to be able to be good liars
00:14:25.740 and good at deceiving and manipulating other people.
00:14:28.540 You don't want to find the most agreeable nice guy in the world to have kids with
00:14:32.900 if you're in an environment that is volatile in this way.
00:14:36.180 And you see this kind of dichotomy about what women like in romance novels.
00:14:41.720 So in romance novels, which are the best-selling books on the planet, what happens is the woman falls in love with a man who's this bad boy rogue, one of the top, you know, they're pirates or thieves or whatever, right?
00:14:54.660 And then he becomes so devoted to her that he can express this deep devotion to her and he's nice to her and agreeable to her where he isn't to anybody else.
00:15:04.260 And that is sort of this feminine fantasy, is to be with somebody who has the capacity to manipulate and fuck over and deceive other people, but doesn't to her, is so in love with her that he treats her really well and her family and her kids really, really well.
00:15:18.260 And that's what really optimally, I think, women want, because that's the kind of phenotype, that's the kind of man who could succeed in any environment.
00:15:26.260 And he would make the kind of kids that could succeed in any environment.
00:15:29.980 So he would be nice to her, but he would also be able to protect her from external danger and provide or whatever.
00:15:35.320 Is that the evolutionary?
00:15:36.600 Yeah, that kind of thing.
00:15:37.720 But if you're really in a really volatile environment, you want somebody who can, yeah, like fuck over other people.
00:15:43.900 So that's like what you optimally want.
00:15:45.600 And even women who say like, oh, I want a man who's generous.
00:15:48.020 Yeah, you want a man who's generous, but you don't want to come home and find a tramp sleeping on your sofa because he decided to bring one home.
00:15:53.680 You want a man who's generous to you.
00:15:55.440 You want a man who's generous to your family.
00:15:57.780 You want a man who always tells you potentially that you're right.
00:16:00.940 But there's like a whole manosphere, red pill thing about how much women need pushback
00:16:05.040 and how women actually test to make sure that men that they're with actually still have these kind of bad characteristics
00:16:11.560 that they can still fend for themselves in that kind of way.
00:16:13.940 Oh, really? Tell us about that. That sounds interesting.
00:16:16.660 I haven't heard about this stuff.
00:16:18.980 So I'm writing, I'm sort of writing a book and I probably won't go into it very much,
00:16:23.780 but I've started reading some of this red pill stuff because I just...
00:16:27.920 So if anyone doesn't know, including me, what is red pill?
00:16:30.220 Oh, you don't want to know what red pill is.
00:16:31.760 I'm familiar with red pill in the sense of people who might have been left-leaning becoming less left-leaning.
00:16:37.320 There's a lot of different red pills that you can possibly take.
00:16:40.220 I'm starting to get the sense that's not the one you're talking about.
00:16:42.720 No, that's not the one I'm talking about.
00:16:43.880 But red pill is a, it's men, so I can think the common wisdom is that women have it worse than men.
00:16:54.340 Women are oppressed compared to men.
00:16:56.640 And the red pill, like for example, in this red pill documentary, says that actually women do have some advantages.
00:17:02.920 And these are underplayed relative to men's advantages.
00:17:05.860 And not only that, but that women also do manipulative things to men in order to get their way.
00:17:12.780 that women have kind of secret power. Wait, is this the Cassie Jaye documentary? Yeah. I have seen it, and
00:17:18.240 I've just asked you a question about something I've seen as if I've not. But there's
00:17:22.880 extra red pill stuff. There's also, like, red pill stuff about, you know, what's called shit testing.
00:17:27.260 So, yeah, when women... and you're, I guess, blissfully ignorant of the way...
00:17:33.660 That's good. I've been happily married for a long time now, so...
00:17:42.380 He doesn't even realize he's getting played.
00:17:44.700 I've repressed all of this information just not to know what's happening
00:17:48.540 and just to be happy with my life.
00:17:50.140 So please tell me what's wrong with that.
00:17:51.700 So women, whatever, do something called shit testing,
00:17:55.940 and they do a variety of things to see kind of what the boundaries are,
00:17:58.540 like how much they can get away with, how generous a man is going to be with them,
00:18:02.320 how compliant he is to her demands, especially if they're unreasonable.
00:18:07.980 So if you make an unreasonable demand of a man,
00:18:11.060 is he just going to say, "Yes, I'll do that," or will you experience some kind of pushback?
00:18:15.700 So kind of testing a man to see if he has the kind of status, whatever, balls, whatever you want
00:18:23.080 to call it, in order to push back against you and not give in to your every whim.
00:18:29.280 And I find that women's kind of self-help books are very often like, you go girl, and you're amazing
00:18:36.300 as you are, and there's nothing you need to do to change in any way, shape, or form,
00:18:40.600 and somebody will love you just as you are.
00:18:42.420 Whereas I find men's self-help books are very much saying you should work out, you should
00:18:46.900 be playing chess, you should be trying to sharpen your mind, and you also need to play
00:18:52.980 this game that women are playing.
00:18:54.180 Whereas I really haven't read very, you know, and I've been trying to read some women's
00:18:57.560 self-help books that say, no, you are a woman and you are playing a game, but you're deceiving
00:19:01.400 yourself about these games that you are playing.
00:19:03.100 And I find that I'm very self-aware about the kinds of ways that I am testing things out and seeing where men are at with me.
00:19:14.080 So coming back to the broader picture of men and women then, I'm sure you've seen the Kathy Newman, Jordan Peterson interview.
00:19:20.720 Actually, I started watching it and then it was very frustrating.
00:19:25.820 It was incredibly frustrating.
00:19:27.280 I mean, he's definitely said some things that I know that the way to get famous in the kind of intellectual landscape is to say things without nuance.
00:19:35.740 If you say things with nuance, it's never going to be a soundbite because nobody cares.
00:19:39.980 So I know that I've seen Jordan Peterson definitely say things in a less nuanced way.
00:19:45.300 But I mean, otherwise he wouldn't be famous if he was totally nuanced, right?
00:19:49.660 Probably. I'm not a huge fan of Jordan Peterson.
00:19:51.900 But in that interview, actually, part of the conversation was very much the James Damore thing.
00:19:56.140 He was trying to explain that the gender pay gap is not simply down to discrimination,
00:20:00.900 that there are factors like the big five personality trait differences between men and women.
00:20:04.940 So what can you tell us about the impact of the differences between men and women
00:20:08.120 on things like the real world stuff, like the gender pay gap, for example?
00:20:13.800 So this talk wasn't recorded.
00:20:16.680 I probably should record it at some point.
00:20:18.080 But I gave a talk for the Adam Smith Institute back in December.
00:20:21.460 And I talked a little bit about where this kind of wage gap comes from.
00:20:24.380 And I hope I get all the details of the study right.
00:20:27.000 But they were looking at people who scored very high on quantitative measures, like the SAT, men and women, who scored very, very high.
00:20:34.980 And they were looking at this kind of elite intellectual group.
00:20:37.900 And they asked the men and the women in this group, how many hours a week would you work optimally if you were, you know, if you had your druthers, which I don't know what druthers are, but they are a thing that you can have.
00:20:48.920 So if you had your druthers, how much, you know, how many hours a week would you work?
00:20:52.380 And I think far fewer women, it's like 30 or 40% fewer women,
00:20:57.040 said that they would work more than 40 hours a week.
00:20:59.680 And so if you ask women, I think this is the major difference here.
00:21:02.600 If you ask women, how much would you like to work in a week
00:21:06.580 and how much time would you like to spend with your family
00:21:08.220 and how much time would you like to spend with your kids,
00:21:10.260 women say, on average, I would like to spend more time with my family
00:21:13.400 and my kids and less time working than men say that they would like to.
00:21:19.740 And so that's kind of a, it's not very well appreciated, but I do think it comes down a lot to personal preferences.
00:21:29.420 And men are also more willing to kind of work on call.
00:21:33.220 You know, there was this study that was done which showed that men were making 9% more as Uber drivers than women were
00:21:39.580 because they were willing to work more when the surge came.
00:21:42.200 They were willing to make more short trips.
00:21:43.660 They were essentially showing that they were more willing to take risks and do jobs that were less appealing.
00:21:50.560 Another really interesting thing about men and women in terms of the pay domain is that lesbians make more than straight women do.
00:21:59.600 There's something called the lesbian wage premium.
00:22:01.820 Lesbian privilege.
00:22:03.660 And lesbians make 9% more than straight women do.
00:22:07.220 And, you know, if
00:22:09.180 you were kind of going with the kind of typical story,
00:22:11.460 and you said, okay, well, you know, lesbians
00:22:13.540 obviously don't present as
00:22:15.400 gender-typical, they often
00:22:16.940 come across as more butch, obviously.
00:22:19.000 I think everybody knows that.
00:22:20.540 Not on the internet.
00:22:22.240 Not on the internet.
00:22:23.440 So, you know, this is what...
00:22:29.200 This idea, like, oh, you know, women are assertive,
00:22:31.580 and they act... and I think that this is true
00:22:33.380 to some extent, that women who act assertive,
00:22:35.280 they're called bitches, you know.
00:22:37.080 in whatever American black slang, you say a man who acts dominant, he's bossed up, whereas a woman
00:22:43.460 who acts dominant is called a bitch, right? I do think that that happens. But you would think that
00:22:48.820 a woman who didn't act in a gender typical manner, like some lesbians do, would make less money
00:22:53.280 because they would be oppressed for not fitting in with the larger gender roles. But no, they make
00:22:57.620 more money. And I think it's because lesbians have some of these characteristics, like status
00:23:03.060 seeking, risk-taking kind of behavior, that more feminine straight women are less likely to have.
00:23:09.380 Do you think it's also because men are more aware of status? It's more that men,
00:23:15.240 by their nature, we compete with one another for everything: for partners, for whatever it may be.
00:23:23.760 We're more obsessed with status, therefore we're more obsessed with working more, generating
00:23:27.640 more money, so that we can present ourselves as being more successful. Yeah, absolutely. So, I mean,
00:23:32.580 men, throughout whatever evolutionary history, have gotten sex partners and gotten laid by
00:23:37.940 increasing their status. So it's not about getting laid? Yeah, yeah. So men have increased
00:23:44.620 their status in order to get sex partners. And, you know, women who are high in status, I think that
00:23:50.140 they're desirable. I think men also like intelligent women who are high in status. But I think that
00:23:57.340 if you have a selection pressure that... and status really differs from culture to culture.
00:24:02.340 So this is one area that I think is malleable. If you made being a stay-at-home dad a really high-status thing, then you would see more men being stay-at-home dads. I think that women should date stay-at-home dads whenever possible, right? Because I do think that all of this stuff is really driven by female sexual choice. And so that is a malleable thing.
00:24:21.600 In some cultures, there's not really any cultures where being a stay-at-home dad is considered the most high-status thing that you can do.
00:24:28.480 And men chase status, whatever it is, money, power, land, whatever it is in that particular culture.
00:24:34.960 So I think that is a malleable characteristic.
00:24:37.140 But the fact that men are seeking status and chasing status, I think, is pretty hard to change.
00:24:44.480 So you're saying that the bulk of the gender pay gap is down to behavior?
00:24:48.900 Choices.
00:24:49.520 Choices.
00:24:49.820 Yeah.
00:24:50.220 Because I'm sure an element of it is down to discrimination. We've had women on
00:24:55.020 who've told us a personal story about being discriminated against because they're women, and we
00:24:59.240 have no reason not to believe them. So I'm sure that's part of it. But the majority of it, you
00:25:03.180 think, is down to choices? I think the majority of it is down to choices. And, yeah, I do think
00:25:10.540 that women do have stories about discrimination, but I also think that that is kind of how it's
00:25:15.480 framed. And so nobody wants to hear a story from a woman who said, no, I've never been
00:25:21.560 discriminated against, and I've never had any problems. You also see this in other domains,
00:25:26.560 like one of my colleagues did a study where he asked people who are minorities in America,
00:25:31.900 do you think that you're regularly discriminated against? And East Asians said, like 80% of them
00:25:38.260 said rarely or never. And something like 68% of black Americans said rarely or never do they feel
00:25:44.440 discriminated against. And it's kind of a squeaky-wheel thing, right? The stories that people
00:25:49.000 hear are definitely those stories in which discrimination played a very important role,
00:25:53.700 in the same way that somebody might really overestimate the murder rate because you hear
00:25:58.240 much more about murders than other kinds of crime. You also hear a lot more about discrimination
00:26:02.260 than what is representative, I think, in the general population. Well, since you touched on
00:26:07.780 that, I mean, we've done something controversial, men and women. Let's, uh, lighten up a bit and talk
00:26:11.840 about racism. I mean, I can't remember whether it was Bret Weinstein or another evolutionary biologist,
00:26:19.700 or maybe it was Jordan Peterson in conversation with Bret Weinstein. They were talking about
00:26:23.720 disgust being an element of, uh, racism in human beings. And the idea was that if a group of people
00:26:31.380 encountered another group of people, they were quite likely to get diseases from them. Yeah. And
00:26:36.620 you talk about this a lot, disease and disgust being very closely interlinked. Is that...
00:26:41.520 And then, like, why is there racism?
00:26:43.160 Why are people racist?
00:26:44.540 Why are people xenophobic?
00:26:46.200 Why did we evolve in this way?
00:26:48.500 Wow, really, the softballs just keep coming.
00:26:53.500 Wow.
00:26:54.700 So there is a disease story that people have told.
00:26:59.000 There's an anthropologist, he's an evolutionary biologist,
00:27:02.900 and he talks a lot about how if you were in a group
00:27:06.180 and then there were these outgroup people,
00:27:08.060 that they would have carried novel diseases
00:27:09.780 for which you would not have necessarily had any defense against.
00:27:13.120 I don't think that that really makes sense.
00:27:16.080 And it was a really popular explanation for a long time.
00:27:19.840 So xenophobia is related to something called pathogen burden.
00:27:24.720 So the more diseases a given country has,
00:27:26.940 the more negative attitudes its people have against outgroup people.
00:27:31.280 And these things were related to one another, they thought.
00:27:35.340 And also conservatism, political conservatism is associated
00:27:38.120 with how much disease there is in a given population.
00:27:41.300 I don't really know if one thing causes the other.
00:27:44.380 I just think that if you have more exposure to various different people,
00:27:49.180 then you don't necessarily consider them your out group.
00:27:51.060 But there certainly has been selection pressure over evolutionary history
00:27:55.280 such that people who look different than you didn't necessarily follow your social norms
00:27:59.300 and you would not expect them to behave ethically towards you.
00:28:02.820 So if you encountered somebody from an out group, which was a group that was different from you,
00:28:07.300 They probably didn't look that different from you, for example.
00:28:09.900 So if you think about the Hutus and the Tutsis, that civil war that happened in Rwanda,
00:28:18.080 those people, I mean, to white people, they don't look that different.
00:28:21.020 And sometimes there were just very, very subtle physiological differences between them that
00:28:26.780 caused people to get discriminated against.
00:28:29.260 So in the current society, in which we have people from all corners of the globe and all
00:28:33.860 kinds of mixtures of all different kinds of ethnic groups together, that is really a very novel kind
00:28:39.420 of situation, where, you know, previously, out-group members would have looked almost exactly like
00:28:44.940 you, except they would have had slightly curlier hair or slightly lighter eyes, or been slightly
00:28:49.540 taller or thinner or whatever. And now you have people who look really wildly different than you.
00:28:54.080 So there is a certain... I think during your development, you're exposed to these people
00:29:00.540 and then you have repeated interactions with them
00:29:03.200 and then you're less likely to become prejudiced.
00:29:05.140 But I think that interacting with people who look very different from you
00:29:08.040 is always going to have some kind of fraught evolutionary cognition behind it
00:29:13.920 because throughout our evolutionary history,
00:29:15.960 you could not expect those people who look very different from you
00:29:19.260 to treat you with the same kind of moral accord that they treated their in-group.
00:29:24.640 So, you know, if, for example, I did a study like a long time ago,
00:29:28.780 probably like nearly 10 years ago,
00:29:30.540 which looked at ovulating women. And so one question we asked ourselves was about ovulating
00:29:37.180 women. Ovulation is the time at which it's most dangerous to be sexually assaulted, because it's
00:29:42.200 the time when you can get pregnant, right? And so would ovulating women prefer outgroup men? So
00:29:47.280 would they find them more attractive? Because there's this thing called hybrid vigor: obviously, with
00:29:51.740 mixtures of two different ethnic groups, those children tend to be more vigorous than children
00:29:55.220 of a single ethnic group. Or would they be more afraid of such men? And we found that ovulating
00:30:01.480 women showed, you know, increased aversion to men who were black or Latino or mixed race
00:30:07.860 compared to white men. And those are the kinds of interactions, I mean, that have a deep
00:30:13.740 evolutionary impact. If I was, you know, throughout most of our evolutionary history,
00:30:18.700 I would have not encountered anybody who looked different than me, unless it was probably a very
00:30:24.300 aggressive interaction. Although there was some trade and things like that, I don't think women
00:30:28.740 generally were involved in that. And so it's unclear really how much of a footprint that has
00:30:33.360 on our minds. But it is clear that it's very easy, for example, to leverage disgust or fear
00:30:39.140 when talking about out-group people. And, you know, people talk about this example all the time, about
00:30:44.640 how during World War II, East Asians were compared to insects and cockroaches, and how Jews were
00:30:50.980 compared to rats and vermin, about how outgroup members are considered to, like, smell different
00:30:57.320 or have these various other qualities that are animal-like. And certainly there is a continuum,
00:31:04.440 like a human-animal continuum. I'm just thinking about whether I should go into one more quite
00:31:09.680 complicated study. I guess I will. So you know what extinction is, right? Extinction is when you
00:31:15.500 have, let's say, a fear reaction: like, every time I show you a snake, you have a
00:31:20.280 reaction, and then over time you get desensitized to it. So this colleague of mine did a study where
00:31:24.980 he showed white people snakes and spiders, and then he gave them a very mild electric shock.
00:31:30.340 Or he showed them white faces, or he showed them black faces and gave them a very
00:31:33.620 mild electric shock. And it's very easy to pair physiological arousal, that is heart rate increase
00:31:39.880 and skin conductance increase, with black faces if you're shocked, and white faces too. So that
00:31:46.540 happened. But then as he extinguished that response, as he started showing them black faces and no
00:31:51.280 longer shocking them, it took much longer for that response to extinguish than it did for their
00:31:56.360 fear response to in-group faces, and it was similar to the response to spiders and
00:32:02.800 snakes. And that's where I think this kind of comparison with animals sometimes comes from:
00:32:07.980 you're trying to leverage that fear response that you have towards non-human animals
00:32:12.660 towards out-group members. Yeah, I mean, that's slightly mind-blowing. I was just sitting there, and I
00:32:19.880 was just trying to get my head around it. But one point that I wanted to make to you is that
00:32:23.780 human beings have always found a way to discriminate against other
00:32:29.300 human beings, even if they're in the same society. For instance, um, in the 1950s in the UK, my
00:32:36.060 father grew up in Lancashire, in the north of England, and you were discriminated against if
00:32:41.340 you were a Catholic. And even on the surface, if they didn't know you were Catholic, one way they
00:32:46.700 could discriminate against you is they'd ask for your CV, they'd check what school you went to: oh,
00:32:50.960 Saint Mary Magdalene School, and, uh, you're not going to get this job. Why do we do that, even
00:32:56.760 when a person looks absolutely identical, has the same accent, there's almost no difference? You can
00:33:03.760 still find a way to say, no, you're not part of my tribe. I guess I would just speculate and say
00:33:09.840 that there is this kind of motivation to make an out-group and an in-group all the time
00:33:15.960 and to think about these kinds of things.
00:33:17.920 I think sport taps into that in a more healthy way than other kinds of in-group, out-group kinds of thought.
00:33:26.520 But, I mean, I couldn't explain that.
00:33:28.780 The whole Catholic, I mean, I come from America where that's not really a thing.
00:33:34.020 So I find that really difficult to kind of wrap my head around.
00:33:36.480 But I think that perhaps there's just an appetite for that.
00:33:38.980 And certainly, if you have a group of people that you're tight with, like a family or a kin group,
00:33:45.400 and then you say there's this out-group and they're actually trying to exploit us,
00:33:49.100 you'll become much more tightly knit with that in-group. And so maybe there's a motivation to do that,
00:33:53.640 because it actually helps you get in better with the people that you're close to. That's just pure
00:33:58.540 speculation, though. That makes absolute sense. Yeah. That's very interesting. It's all very
00:34:04.100 interesting so far. Um, well, we talked about altruism before we started. Yeah, sorry, I can try to
00:34:10.660 take care of that. Because there's a great American comedian that Francis and I both like,
00:34:16.020 Dave Attell, yeah, who, whenever he comes on stage, he picks out a member of the crowd and goes,
00:34:20.060 do you like comedy? And the guy's like, yeah. I can change that. So, um, yeah, altruism. We talk about
00:34:28.100 effective altruism. I mentioned I listened to a lecture of yours about morality. Tell us what
00:34:32.000 effective altruism is? Okay, effective altruism is the idea that you should actually, I mean,
00:34:39.300 so not all effective altruists are utilitarian, but I'm a utilitarian. And the way that I think
00:34:44.320 about it is that you should try to engage in actions, for example, or donate to charities
00:34:49.620 that actually can create the most good. And Peter Singer has got an example that he uses all the
00:34:55.400 time. For example, if you watch television, you'll sometimes see advertisements for seeing
00:35:00.940 eye dogs. And to train a seeing eye dog puppy, it takes about 40,000 pounds. It's very, very
00:35:06.880 expensive to train a seeing eye dog puppy to become a seeing eye dog for a blind person,
00:35:12.040 a blind person's companion. Whereas for the same amount of money, you could probably save hundreds
00:35:16.460 of people from becoming blind at all in Africa from river blindness. So he kind of makes this
00:35:22.940 comparison. Is it better to have one blind person's life become better because they have a
00:35:27.120 seeing eye dog? Or is it better to prevent possibly dozens or hundreds of people from becoming blind
00:35:32.500 in the first place? And effective altruism considers the relative risks and benefits of
00:35:39.340 various kinds of actions on human flourishing, but also all sentient life flourishing. So there's
00:35:45.680 kind of three main strands in effective altruism right now. One of them is non-human animals,
00:35:51.720 which is something that I'm very focused on. So something like a trillion fish and
00:35:55.800 billions and billions of animals are killed in pretty horrible ways. And that's actually
00:36:00.520 a pretty low-hanging fruit. You could reduce a whole lot of suffering if, for example,
00:36:05.520 you could make clean meat, which is now the big push, which is this meat that you make without
00:36:11.900 actually slaughtering any animals or raising any animals for food. I'll go into that more in a
00:36:16.140 minute. And then there's decreasing existential risk. So, you know, by all accounts, human
00:36:22.520 civilization has led to many, many happy lives. There are many more happy lives on this planet
00:36:27.180 than there ever have been previously. And if we could stretch our resources even further,
00:36:31.900 perhaps we could have many, many more billions of happy lives in the future, colonize the galaxy,
00:36:36.300 and just create many happy, flourishing human or other sentient lives. But if there was something
00:36:45.500 like a meteor strike or political unrest or a huge pandemic, it could either kill many,
00:36:51.540 many people who would go on to lead happy lives or it could basically take civilization back to
00:36:56.800 the dark ages. We wouldn't be able to make these strides in technology that are going to enable
00:37:01.900 many, many more happy future lives. And then there's also eradicating global poverty. So
00:37:07.620 that's another big strand of effective altruism, which is helping people in the developing world
00:37:11.980 because they're much cheaper to help than people in the developed world. So you can, for example,
00:37:16.880 for, I don't know, $3,000, give or take, buy malaria nets and buy a year of healthy life
00:37:25.680 or possibly save a life from malaria.
00:37:28.540 So what they do, there's a bunch of different charity evaluators.
00:37:31.640 There's GiveWell, for example, and they evaluate charities as to how much
00:37:37.320 flourishing human life a dollar buys.
00:37:41.580 And so that's kind of the whole of effective altruism.
00:37:45.060 Most of the landscape right now is looking at those three main cause priorities.
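The cost-effectiveness comparison described above (Singer's seeing-eye dog example, and GiveWell-style evaluation) can be sketched as a few lines of arithmetic. The £40,000 guide-dog training figure is the one quoted in the conversation; the £400 per-case cost of preventing river blindness is a hypothetical placeholder chosen only so the ratio matches the "hundreds of people" claim, not a real charity's estimate.

```python
# Sketch of the GiveWell-style comparison discussed above.
# GUIDE_DOG_COST is the figure quoted in the episode; PREVENTION_COST
# is a hypothetical placeholder, not a real charity's number.
GUIDE_DOG_COST = 40_000   # pounds to train one seeing-eye dog
PREVENTION_COST = 400     # hypothetical pounds to prevent one case of river blindness

def people_helped(budget: int, cost_per_person: int) -> int:
    """Number of people a fixed budget can help at a given cost per person."""
    return budget // cost_per_person

budget = 40_000  # the price of training a single guide dog
dogs = people_helped(budget, GUIDE_DOG_COST)         # 1 blind person assisted
prevented = people_helped(budget, PREVENTION_COST)   # 100 cases of blindness prevented

print(dogs, prevented)  # prints: 1 100
```

On these illustrative numbers, the same budget prevents a hundred cases of blindness rather than assisting one blind person, which is the shape of the argument being made.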
00:37:50.480 And I focus a lot on non-human animals, so I've been vegan for a long time,
00:37:55.500 and I am on the board for something called the Sentience Institute.
00:38:00.760 So what they are interested in looking at is things like how animals are being killed in crop production
00:38:05.760 and what are the best ways, for example, to reduce the amount of sentient suffering on the planet.
00:38:13.560 And so one thing that I generally plug for people is that you probably don't want to go vegan.
00:38:18.640 A lot of people don't want to go vegan because it's difficult.
00:38:21.640 And so most of the animals that you eat in terms of deaths per calorie are small animals.
00:38:29.420 So if you were to give up eating fish and chicken and eggs, you would be causing 90% fewer deaths per year than if you carried on eating those things.
00:38:38.700 because it takes about 200 chickens to make the same amount of meat
00:38:42.880 as there is in a cow, for example.
00:38:45.300 And this is something Ezra Klein says too.
00:38:48.440 I disagree with him politically to a great extent,
00:38:50.720 but he starts off all his podcasts now saying: give up chicken.
00:38:53.560 If you give up chicken, then you're actually doing a huge amount
00:38:56.040 to reduce the amount of suffering you cause yearly
00:38:58.720 because every chicken meal you eat or every two chicken meals kills a chicken,
00:39:02.880 whereas it would take you about two years of eating beef probably to kill a cow.
00:39:06.720 It makes a huge difference.
00:39:07.620 And I don't think that cows are 200 times more sentient, or more capable of suffering, than
00:39:12.460 chickens are. Well, there goes our sponsorship with that. We guys want to have KFC or whatever to
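The chicken-versus-beef arithmetic above can be made concrete. The meal counts below are the speaker's own rough figures (about two meals per chicken, about two years of beef meals per cow) and are illustrative, not precise.

```python
# Rough sketch of the deaths-per-meal argument: small animals yield far
# fewer meals each, so a diet of them kills far more animals per year.
# Figures are the speaker's rough estimates, used purely for illustration.
MEALS_PER_CHICKEN = 2        # "every two chicken meals kills a chicken"
MEALS_PER_COW = 2 * 365      # "about two years of eating beef ... to kill a cow"

def animals_killed_per_year(meals_per_year: int, meals_per_animal: int) -> float:
    """Animals killed per year by someone eating that many meals of this meat."""
    return meals_per_year / meals_per_animal

meals = 365  # one such meal a day
chickens = animals_killed_per_year(meals, MEALS_PER_CHICKEN)  # 182.5 chickens
cows = animals_killed_per_year(meals, MEALS_PER_COW)          # 0.5 cows

print(chickens / cows)  # prints: 365.0
```

On these rough numbers, a daily chicken eater kills hundreds of times more animals than a daily beef eater, the same order of magnitude as the 200-chickens-per-cow meat comparison in the conversation.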
00:39:17.540 celebrate at the end? Give up chicken, eat whales. Yeah, whales also. Yeah, I mean, people
00:39:23.640 really... I don't know why people like whales so much. Like, they have, like, hair for teeth,
00:39:29.360 and their eyes are all fucking low, and their singing is terrible. I don't know why people
00:39:35.180 like whales so much. I have to say, you have a fantastic sense of humor. So, yeah, you should eat
00:39:41.680 whales. We should eat whales. Okay, yeah. And what else? Prawns aren't sentient, are they? I would
00:39:47.840 put prawns... I'm not sure where prawns are. Yeah, they're like the cockroaches of the sea.
00:39:52.540 However, yeah, two things about prawns that are very bad. One thing is that they're fed other
00:39:57.740 fish. Yeah, so if you eat a prawn, it's like eating fish that have, you know... vertebrate fish, and that's
00:40:03.120 not good. And the other thing is that there's a lot of slavery in the supply chain for prawns.
00:40:07.440 About a third of... I thought you were going to say prawns are slaves also. No. So what happens is, uh, you know,
00:40:15.640 people from Burma or wherever, uh, they go somewhere like Thailand to get work, and they end up on a
00:40:20.680 ship, and they sort out what's called trash fish. So if there's a big trawling net, they take
00:40:25.700 out the fish, and those fish are ground up to go into aquaculture, which is for prawns. But what they
00:40:31.780 do is... they want to get paid, they get on the ship, but then they're out at sea, and they're sometimes
00:40:35.640 just held at gunpoint for years at a time and never paid and never able to talk to their families.
00:40:39.820 And so it was a few years ago, and I don't actually know what kind of strides have been made in this
00:40:44.040 regard, but The Guardian said about a third of all prawns have slavery in their supply chain.
00:40:49.400 So don't eat prawns. Okay, so no chicken. All right, just kill cows and whales. Let's go.
00:40:55.560 Cows and whales.
00:40:57.420 Mallards, you can eat mallards.
00:40:58.680 Yeah.
00:40:59.020 Because they're rapists.
00:41:00.640 Yeah, they're rapists.
00:41:01.500 So there was a male chasing a female across the park the other day,
00:41:05.480 and then I tried to sort of fling a bag of groceries at the male
00:41:09.040 to keep him from chasing the female.
00:41:11.260 But, yeah, there was the chase that carried on.
00:41:14.540 Really? He didn't care?
00:41:15.880 Yeah.
00:41:16.680 This is a horrific story.
00:41:18.640 I'm traumatized.
00:41:19.860 I'm going to go home and cry.
00:41:23.160 So we're joking around.
00:41:24.960 Tell us a little bit about why we do what we do.
00:41:28.080 We're both comedians.
00:41:28.960 What is the evolutionary...
00:41:30.800 It's getting laid, isn't it?
00:41:32.460 what you're hearing is the sound
00:41:36.020 of Dr. Fleischman about
00:41:37.880 to smash us
00:41:38.900 I'm really looking forward to having my character
00:41:41.800 eviscerated on camera
00:41:43.260 Well, actually, you're in a long-term
00:41:45.820 relationship as well. I feel extra depressed,
00:41:47.540 because if you're a young male comedian
00:41:49.180 and your rationale for doing comedy
00:41:51.740 is to get laid,
00:41:52.720 yeah, you have some... right, that actually makes sense. Whereas for you and I, it probably decreases our
00:41:58.440 chances of getting laid with our partners, because we're not even at home to do it. Yeah, exactly.
00:42:02.420 We're not even home to do it; we're out in the evenings begging strangers for love.
00:42:10.460 Um, so why do we have humor? What is the evolutionary basis for
00:42:17.400 humor? Okay. So I'm going to start from the beginning. There's this thing called sexual
00:42:22.940 selection. So natural selection is the evolutionary force that
00:42:27.980 creates adaptations that cause you to be able to survive, but sexual selection creates the adaptations
00:42:34.540 that you use to have sex. And there's a famous kind of analogy, which is with the peacock's tail.
00:42:42.320 So a peacock produces a tail, and the females find it attractive.
00:42:45.880 But producing a tail that's symmetrical, that's colorful, and that he can jiggle at her is actually very costly.
00:42:54.200 It makes him more prone to predation.
00:42:56.040 If he has any parasites, he can't grow that tail.
00:42:58.540 And so humor is considered like a costly signal in the same way.
00:43:02.380 It's hard to be funny, as you both know.
00:43:05.940 And so it's very difficult to be funny.
00:43:09.280 And so one idea, which actually Geoffrey Miller, who is my partner, didn't come up with but wrote a book about, The Mating Mind, is that human intelligence is a costly signal.
00:43:20.980 It actually takes a lot of energy.
00:43:22.620 You have to have development that has very little disease in it in order to be smart enough to do things like make art or make jokes.
00:43:31.500 And so it's a costly signal.
00:43:33.000 You're saying, look at how brainy I am.
00:43:35.200 Look how intelligent I am.
00:43:36.300 I can make you laugh.
00:43:37.400 Because it's very difficult.
00:43:38.600 And so they've done studies, for example, where they had undergraduates caption various different cartoons, and people who are funnier also tended to be smarter, right?
00:43:50.300 I agree with that.
00:43:51.000 And so the idea is that humor is a way of showing off how healthy and vigorous you are, because you're actually wasting your intelligence a bit.
00:44:01.760 it's like a costly signal, just like a diamond is a way of wasting your money. Humor is a way
00:44:06.700 of showing, look, I'm so smart that I can afford to show off in this particular way. And in order
00:44:13.380 to make somebody laugh, you have to also have pretty good kind of theory of mind. Your brain
00:44:17.080 has to work really well. So I have to have some idea of the culture that you come from. I have
00:44:20.860 to have some idea of the kind of popular culture that you might know. I have to be able to not be
00:44:26.100 too obvious. I have to have the kind of correct timing. I have to be quick, but I have to also
00:44:30.920 speak clearly. It's a whole thing. And it's knowing your audience. Yeah, it's also knowing your audience,
00:44:35.840 absolutely, knowing your audience. Yeah, steering away from the vegan jokes. It's difficult
00:44:39.880 to be funny. So men tend to show off in this particular way more than women. I'd
00:44:48.880 like to think I'm a funny woman, but I don't actually think it really gets me anywhere. I mean,
00:44:53.440 it gets me somewhere with other women. Like, I date women, and women certainly like that I'm funny.
00:44:56.920 But I don't think men care. I mean, the men that I date tend to care that I'm funny, but I don't
00:45:01.180 think I would have any problems if I was totally a wet blanket and not funny at all.
00:45:05.560 And, you know, when I used to hang around with these people who were in the military, they
00:45:10.560 were on this military base in Germany, and I found that all the men tried to learn German, got
00:45:15.360 conversational in German, so they could try and pick up German women. And none of the women who
00:45:18.940 lived on this military base, even those who were unpartnered, learned any German, because they didn't
00:45:23.160 have to, because German men tried to speak English with them. And then the American men who were on
00:45:28.760 this German military base, uh, were learning German so they could seek out women. And so men have to
00:45:34.540 try a lot harder, and I think that's where humor kind of comes in. So you're saying that's
00:45:39.840 essentially all humor is, it's a tool to try and get laid? And do you think... I did put a lot of...
00:45:45.800 but yes, that is the core of it, that's the core argument. Also, as well, do you think,
00:45:51.820 because there's a stereotype that the better looking you are, the less funny, the less
00:45:56.940 interesting you have to be. Is that true, or is that just a wild stereotype? I had this friend:
00:46:02.480 she was really obese, and then she, uh, had gastric bypass surgery, and then she was really
00:46:07.400 thin. And she's really funny, and people always said, oh, now I know why you were funny:
00:46:12.440 you used to be... because you have to show off, right? So I do think that people don't
00:46:21.480 work harder than they have to work, in general. Right. Yeah, I mean, I think... I thought I
00:46:27.140 was really, really ugly until I was, you know, well into my late teens. Yeah. And I think that I
00:46:33.400 would be pretty boring if I had thought I was attractive earlier. Right. And I think that's...
00:46:39.000 yeah, it's the same way: people just work as hard as they have to.
00:46:42.240 And people tend to play to their strengths.
00:46:45.560 And if you find out that you're funny, then you'll really work on that
00:46:48.680 because everybody has a niche and you'll try and play up that
00:46:53.900 and become as good at that as you possibly can.
00:46:55.860 And if you're attractive, that's just a way easier way of signaling your quality
00:47:00.580 than being funny, right?
00:47:03.160 And people generally don't tend to cultivate being funny unless they really have to.
00:47:09.000 A damning indictment.
00:47:12.380 There you go, mate.
00:47:12.980 That's your life.
00:47:14.080 That's my life right there.
00:47:15.520 That's all we're doing.
00:47:16.340 I love the moment
00:47:17.080 where you went,
00:47:17.700 basically,
00:47:18.600 you know,
00:47:19.200 being funny
00:47:20.900 is a way of wasting
00:47:21.900 your intelligence.
00:47:22.580 I'm like,
00:47:22.980 am I talking to my mother,
00:47:24.100 my mother
00:47:26.080 in another form?
00:47:27.220 Conspicuous consumption
00:47:29.880 of brain energy.
00:47:30.820 I'm not that kind
00:47:31.980 of psychologist.
00:47:32.540 You couldn't have done it
00:47:36.040 without a Freud joke,
00:47:36.920 right?
00:47:37.280 Was there anything else that you particularly wanted to ask about humour?
00:47:44.880 I feel suddenly under pressure now.
00:47:47.680 Why do you think... so it comes to a point where women find humour particularly attractive.
00:47:55.100 And do you think... what do you think is more important for a woman, evolutionarily?
00:48:00.820 Is it being healthy of body or healthy of mind?
00:48:04.900 Which one would be more desirable?
00:48:06.140 I mean, those things tend to correlate. So people have all kinds of stereotypes about, for example, people think if they think about a scientist or they think about somebody highly intelligent, they don't also tend to think about somebody who's very attractive and very healthy and very athletic, right?
00:48:21.340 Even though good characteristics tend to cluster together.
00:48:24.940 So people who are really good with other people also tend to be smarter.
00:48:28.460 But when we think about people who are highly intelligent, we tend to think about people who are highly intelligent and then have a deficit in other characteristics, like how well they can read other people, for example.
00:48:38.580 Right.
00:48:38.960 And those things, that's a stereotype that we kind of have about scientists and funny people.
00:48:43.600 So in general, if you're smart enough to make jokes, you're also going to have pretty healthy
00:48:50.060 other factors, right? It's pretty hard to be smart enough to make jokes if you're, I don't know, dying
00:48:56.400 of consumption, right? Not mentally healthy, though: we know too many comedians to suggest any correlation.
00:49:02.500 And just to kind of swing back to the dark triad: narcissism, Machiavellianism, psychopathy.
00:49:07.020 I think that there's a certain degree of showing how good you are at exploiting other people
00:49:11.960 in humor, because you're figuring something out. So there are a lot of jokes that are tapping into
00:49:18.320 things that people don't want to admit to themselves that they believe, or taboo ideas
00:49:22.280 that everybody holds, or weird things that people do that they think nobody else knows about. And so
00:49:26.920 that's another reason why humor is attractive: it shows that you have this secret
00:49:31.200 knowledge of other people, and that is one of the necessary ingredients for being able to exploit
00:49:36.140 other people. Brilliant. And if you don't mind me asking, where are you politically? Because you've
00:49:40.760 hinted at different things. Yeah, where am I politically? Uh, I was kind
00:49:47.040 of a socialist in my early 20s, believe it or not. Everybody is. Yeah. And, uh, I read Ayn Rand, and
00:49:52.880 then that swung me a bit, and then I thought Ayn Rand is a terrible person, and that swung me a little
00:49:56.860 bit the other way. And so this has been back and forth. There's not really any libertarians
00:50:00.700 here in the, uh, in the UK, but I guess I would define myself as somebody who is libertarian
00:50:08.140 leaning but also thinks that there are some services that the free market can't really
00:50:13.400 provide very well. And I have joked before, I think I made a joke on Twitter that said,
00:50:17.340 if you want to keep your male partner away from women, encourage him to get into libertarianism,
00:50:22.700 because libertarians are, yes, like 75 percent male; very, very few women are libertarians.
00:50:30.120 Is that right? Yeah. Because I would say this podcast is fairly libertarian. I'm certainly
00:50:35.900 libertarian. I don't know where... I just swim back and forth. I don't know what I am anymore. But one
00:50:41.540 thing I was actually going to ask is this. Do you think... you touched on it very early on at the
00:50:46.980 beginning of the podcast. Do you think, I mean, we'd all love to live in a socialist
00:50:50.600 paradise where we're all equal, everybody's happy, we all listen to each other, yeah. But do you think,
00:50:55.640 essentially, those are fundamentally incompatible with the basics of human nature?
00:51:01.540 Listening to each other.
00:51:02.640 Yeah.
00:51:03.220 What?
00:51:05.000 But socialism, in many ways... you know, socialism, communism, the far-left ideas of how we should live.
00:51:12.760 I think there's a quote attributed to E.O. Wilson about socialism: a great idea, but the wrong species.
00:51:18.160 Yeah.
00:51:18.600 You know, because ants do that really well.
00:51:20.700 Yeah.
00:51:21.040 Yeah.
00:51:21.280 And naked mole rats.
00:51:23.320 But ants are all related, aren't they?
00:51:24.900 Yeah, that's right.
00:51:25.620 And so are naked mole rats, right?
00:51:27.040 I just think it's very difficult, especially if you think about something like Twitter or soundbites, when people are trying to talk to one another.
00:51:34.980 The tweets that are most popular are tweets that are saying, like, well, no one else is brave enough to advocate this moral position, but I'm really strongly advocating this moral position in a totally un-nuanced and direct way.
00:51:46.480 And so if you want to have a tweet go viral or you want to have a lot of people listen to you, then you have to give a fairly un-nuanced position.
00:51:55.860 And nuance is really not encouraged in really any culture,
00:52:01.220 and it's a little bit alarming to think that it's probably even less encouraged
00:52:04.760 in our current culture than it has been at other points in history.
00:52:09.100 And what you have to do is you have to play to the whole IQ distribution, right?
00:52:13.220 You have to play to people who would not understand nuance
00:52:15.660 if it bit them in the face
00:52:16.620 and then you have to play to people who really only speak in very nuanced terms
00:52:20.520 and have very complex ideas about things.
00:52:22.860 And so that's always going to be the problem with populism:
00:52:25.840 people can have nuanced positions,
00:52:29.000 but they have to present them in this other way.
00:52:31.820 And then I think that something that happens with leftist politics
00:52:34.000 is that they want to throw out people along with their ideas.
00:52:38.800 So there's this kind of hygiene, epistemic hygiene,
00:52:43.100 where, let's say you and I were talking
00:52:46.960 and you expressed some kind of opinion that I thought was bad,
00:52:50.040 like, you think everybody should eat as many chickens as possible.
00:52:53.560 Like, you said something like that.
00:52:55.840 then I might say, now I think that all of your ideas are bad
00:52:59.340 and I'm going to completely reject you.
00:53:01.180 And even ideas that you have that I might otherwise agree with
00:53:03.720 because I can't think about you as a group of ideas.
00:53:08.680 I have to reject you as a person.
00:53:10.520 And that's something I think is really difficult
00:53:12.340 in the kind of current climate
00:53:13.700 is this desire that people have to reject, you know.
00:53:17.480 So Jordan Peterson, I disagree with him about a lot of stuff.
00:53:20.080 I think he's actually very traditionalist in a lot of ways
00:53:22.120 that I don't think are very helpful.
00:53:23.580 And there's some things I disagree with,
00:53:26.200 I don't know, like Sam Harris, about.
00:53:27.320 There's some things I disagree with Geoffrey Miller about.
00:53:29.240 There's a lot of people that I admire,
00:53:30.680 and I have fundamental disagreements with them,
00:53:32.840 but what happens now is that
00:53:35.340 more and more different attitudes are moralized.
00:53:38.480 For example,
00:53:39.400 if I said I don't think there should be any welfare state,
00:53:42.000 and I don't know if I could totally advocate that position,
00:53:44.080 but that's the kind of thing that I could say
00:53:46.120 that would make somebody say,
00:53:47.300 well, now I'm also going to reject veganism,
00:53:49.960 I'm also going to reject evolutionary psychology,
00:53:52.080 I'm also going to reject all these other ideas that I think are clustered together with that.
00:53:57.140 And I have this philosopher friend named Amanda Askell, and she said that one really good way of getting around that is to communicate to people that you're talking to that you share values.
00:54:07.700 And so if you're talking to somebody about, I don't know, welfare or socialized medicine or whatever, you'll say, I don't want to see people die in the street.
00:54:14.100 I also care that people live and have happy lives.
00:54:16.640 And then you start on that kind of same page, whereas I think what happens a lot of times is people will think, well, you're not advocating the same position that I am, so you are a moral monster who also doesn't care if people die in the street and who doesn't care about human flourishing.
00:54:31.120 And if you kind of start off saying, no, we have similar values, then you can then talk about how you're going to arrive at those values in different ways.
00:54:39.060 But unfortunately, the ways that we arrive at those things, like, for example, if I said I believe in sex differences and I believe they're pervasive and I believe that there's a lot of things that we can't really do much about in terms of sex differences, then people might also ascribe to me the belief that I think that women should be kept at home or they should be told what to do when I don't have that value at all.
00:55:00.040 I just have a different idea about how we might arrive at gender equality, and I have a
00:55:07.200 different view of what that means than other people would, for example. But isn't it also
00:55:10.700 intellectual laziness? So, for instance, if you say one thing, then instead of me actually taking the
00:55:16.660 time to find out everything about you, your points of view, how on certain instances you're on the
00:55:21.120 right, certain instances you're on the left, and certain instances you're a centrist, it's far easier
00:55:24.980 to go, oh, you're this, this, and this, therefore I want nothing to do with you. Yeah.
00:55:29.260 Yeah, absolutely. And, you know, the same way that you were saying, like, people exclude Catholics, and there's been these ideas about making in-groups and out-groups. It is to some degree intellectual laziness, but it's also much easier to kind of push somebody away who has a bunch of nuanced positions and then say, now we stand strong, me and the other people who are in my ideological in-group.
00:55:51.520 And if there's this threat from people with these other kinds of positions, then you can become much tighter knit.
00:55:57.880 And you see that in all kinds of different groups and on Facebook.
00:56:02.340 I haven't been on Facebook in a year and a half, actually partly because I had a really horrible argument with one of my best friends on Facebook publicly.
00:56:09.660 And so I actually prefer Twitter because even though people are – actually, I've cultivated a really reasonable following.
00:56:17.880 Even if I say, this person said something stupid, very rarely do my followers go after them in a way that I'm embarrassed about.
00:56:24.320 So maybe I'll undo that.
00:56:26.520 Maybe you're biased with thresholds too much.
00:56:28.920 But I don't really care.
00:56:30.160 People say crazy things to people on Twitter.
00:56:32.680 But on the other hand, those responses have to be short.
00:56:38.240 If somebody responds to you 20 times, then they and everybody else who's watching them know that they're probably going on too much,
00:56:44.940 whereas on Facebook, I just didn't want to read novel-length responses from people anymore about
00:56:49.380 stuff that I had posted. So I haven't really been on there. Also, my mom:
00:56:54.820 I don't want to have political discussions with my mother if at all possible. Yeah, that's never a
00:57:00.240 good idea. Well, listen, we'll wrap up now. Tell me about this current climate that you
00:57:05.000 refer to. Do you think, you know, we talk about free speech a hell of a lot, and comedians
00:57:08.820 are naturally drawn to that issue anyway. We hear from America that people like Bret Weinstein,
00:57:14.500 Heather Heying, et cetera. There are issues, apparently, at American universities where
00:57:20.260 certain things are becoming impossible to research or to do. Do you find that that's
00:57:24.220 the case? Do you think that's, A, true? And do you think it's the case here in the UK?
00:57:27.400 Is it affecting science, this climate that we have now?
00:57:29.840 I mean, where I teach, I can say pretty much whatever I want. And I've even, I was on some
00:57:35.280 podcast, was it? Oh no, I was on some late night, now it's on late night reruns. I told
00:57:40.200 an incredibly dirty joke on television, and nobody really gives me any flack about anything
00:57:44.720 at all at my university. I also teach psychology of human sexuality, and I teach about some quite
00:57:49.860 edgy things, and I've never had any pushback about anything whatsoever. Some of my students have said,
00:57:55.600 Diana teaches us things from a certain perspective, and I don't always agree with that perspective,
00:58:00.140 but of course I'm going to teach things from a certain ideological perspective,
00:58:03.280 and if somebody writes in their exam paper something different from my perspective,
00:58:06.460 if they back it up with evidence, I would still really appreciate that, right?
00:58:11.320 I don't get that impression over here.
00:58:13.940 And in the United States, I know a lot of people, you know, Jonathan Haidt, Geoffrey Miller, Sam Harris, Christina Hoff Sommers, people like that who've had some kind of pushback.
00:58:25.360 But I also think they wouldn't be nearly as famous as they are now if they hadn't had protests and pushback.
00:58:31.560 And one thing that I really like that Jordan Peterson said is that if you scheduled a talk for him at 8 a.m. or 9 a.m., there would be no protests.
00:58:40.140 Because it's very hard for these people to, I mean, I'm not going to disparage them too much, but it's pretty hard for them to congregate very early in the morning.
00:58:49.100 They're not that well organized, really.
00:58:50.440 That's a lovely way of saying lazy.
00:58:52.660 Yeah.
00:58:54.260 So there's, I mean, I'm still pretty agnostic about that.
00:58:58.220 I haven't lived in the States in some time, and I also am in a sort of privileged position.
00:59:05.060 I'm a woman.
00:59:05.880 I can say kind of what I want.
00:59:08.140 Sorry.
00:59:09.580 Wow.
00:59:10.180 So I think that, yeah, I just feel like I can kind of say what I like,
00:59:15.000 and very few people have ever given me pushback about those things.
00:59:19.420 Did you think if you were a man, you wouldn't be able to say some of the things that you say?
00:59:23.120 Yeah.
00:59:24.500 I don't think so.
00:59:25.380 I don't think I could have talked about sex differences in the kind of fast and loose way that I have today, you know, on this podcast, if I was a man, because people would feel like I was ideologically biased, you know, in favor of my own sex.
00:59:40.620 Whereas as a woman, it would really behoove me to say women are very oppressed and I should be given kind of more advantages or concessions because of that.
00:59:50.940 That would actually be the best position for me to take in terms of my own self-interest.
00:59:56.960 And so if I'm arguing kind of against my self-interest, people are more likely to trust that.
01:00:01.160 And then also, you know, it's, yeah, I think that, sort of petering out here.
01:00:10.800 Yeah, I'm sort of, I'm Latina as well.
01:00:13.260 So I think I also have a bit of leeway about things in that respect.
01:00:18.080 But, you know, I think that what happened to Bret Weinstein is terrible,
01:00:22.840 but I also think that he was at a very weird university.
01:00:25.820 Yeah.
01:00:26.120 And I went to a really super lefty liberal arts university.
01:00:30.140 I was not politically involved in anything at that time, really,
01:00:34.700 and so I didn't really see much of that.
01:00:36.200 But I'm pretty sure that if I had started inviting libertarians
01:00:39.160 or Christina Hoff Sommers or anybody like that to come and speak,
01:00:42.020 I would have also experienced that kind of pushback.
01:00:45.660 But in the current climate, and this is something that the libertarians and the free speech people are not going to appreciate me saying, I think that they're actually getting a whole lot more benefit out of being protested than they are experiencing drawbacks or detriments.
01:01:02.160 I would agree with that. But I think the point that they would probably make is that they're not after fame. They're actually trying to achieve something in terms of making sure that people can speak freely. Do you know what I'm saying?
01:01:12.760 So they personally benefit, absolutely.
01:01:15.380 But I think a lot of them do so almost without intending to.
01:01:19.900 Well, if they're trying to get to a bigger audience,
01:01:22.920 certainly they're getting to a bigger audience that way.
01:01:26.200 And I think that it obviously behooves them to say,
01:01:30.540 oh, I'm really just trying to speak truth
01:01:32.100 and these people are trying to prevent me from speaking.
01:01:35.140 I think what happened to Christina Hoff Sommers,
01:01:36.600 I watched some of that,
01:01:37.760 or people unplugging the microphones and things.
01:01:40.600 Unfortunately, if I was involved in leftist politics or I was involved in Antifa,
01:01:44.820 I would be embarrassed about how childish people in my in-group were coming across in those kinds of interactions.
01:01:50.020 And so it's really making people who have sometimes very edgy political positions look really good by comparison.
01:01:57.180 And I wouldn't really pull too much on that thread because it's working out very well, I think, thus far.
01:02:05.580 But there are certain things, I think, that are very difficult to research.
01:02:09.700 And I think that that's going to kind of unwind at some point.
01:02:14.640 We're going to have to, you know, behavioral genetics is really making leaps and bounds.
01:02:18.960 Previously, you know, just looking at somebody's genes, you really wouldn't know that much about their personality.
01:02:23.560 And now we're going to, you know, I think in 10 or 15 years, be able to give a pretty tight confidence interval about how agreeable, how intelligent, how promiscuous, like all kinds of things about people based on their genes.
01:02:37.120 And we're going to know how much things are genetically determined and how much things are influenced by environment much more so than we do now.
01:02:44.980 And there just won't be any way around that.
01:02:47.800 So I think people who are ideologically motivated against that position can hold it now, because there's not a whole lot of evidence that, you know, genetics is so deterministic in all these different respects.
01:02:59.620 But I think as the evidence mounts, you're going to see that change.
01:03:04.040 So I think a lot of these problems will kind of sort themselves out.
01:03:08.700 Very good.
01:03:09.340 Thank you so much for coming on.
01:03:10.480 Listen, the last question we always ask our guests is,
01:03:12.980 is there one thing that you think no one's talking about that we ought to be talking about?
01:03:17.900 Or if you can't think of something like that,
01:03:19.620 is there one thing we haven't talked about today that we should have talked about?
01:03:22.960 Okay.
01:03:23.920 Yeah, I think that kind of the last point that I was making
01:03:26.220 about the kinds of strides that behavioral genetics are making.
01:03:29.580 And then also, you know, in China, they're trying to figure out ways of giving people sort of a social reputation score.
01:03:38.260 Social credit, yeah.
01:03:38.960 Social credit, things like that.
01:03:40.240 And they're trying to run their society in a sort of tighter way to see, I mean, they're basically trying to promote social order in a way that's totalitarian.
01:03:48.900 And I think that one thing that people aren't really talking about enough is that when all nations have access to that kind of technology, when you can, with a DNA sample or a picture of somebody's face, figure out kind of the likelihood that they're going to have certain kinds of characteristics, how are people going to use that kind of information?
01:04:07.660 And how can we promote a free society when that kind of technology is really available to everybody?
01:04:11.920 And I'm not sure how soon that's all going to happen.
01:04:13.620 and I don't know a ton
01:04:14.720 about it right now
01:04:15.540 but I do think
01:04:16.860 that it's super important
01:04:17.700 and people aren't
01:04:18.180 talking about it enough
01:04:18.920 oh it's already
01:04:19.560 yeah it's already happening
01:04:20.500 Pippa Malmgren
01:04:21.300 who I mentioned
01:04:21.800 that we had on the show
01:04:22.620 a few weeks ago
01:04:23.800 she's an expert in technology
01:04:25.500 and she was talking
01:04:26.200 about the fact
01:04:26.820 that it's already happening
01:04:27.760 yeah
01:04:28.200 it's already
01:04:28.760 they're already doing it
01:04:29.820 yeah
01:04:30.040 it's scary stuff
01:04:31.060 yeah
01:04:31.380 scary stuff
01:04:32.060 well that's a nice
01:04:32.740 positive note
01:04:33.280 all right
01:04:35.260 Diana Fleischman
01:04:36.300 thank you so much
01:04:36.960 for coming in
01:04:37.320 it's been a pleasure
01:04:38.120 tell us
01:04:39.520 what is your Twitter handle
01:04:40.700 it's sentientist
01:04:42.260 sentientist
01:04:43.500 Yeah, so somebody who prioritizes sentience as a basis for morality.
01:04:48.240 And, yeah, dianafleischman.com is where you can see my publications and other things.
01:04:52.840 And you're writing a book?
01:04:54.420 I am writing a book proposal.
01:04:56.480 Ah, a book proposal.
01:04:57.520 Yeah, I'll be blogging and stuff probably over the summer.
01:04:59.560 Well, come back and talk to us when the book is getting ready.
01:05:01.620 I'd love to, yeah.
01:05:02.260 Yeah, we'll talk about genetics more.
01:05:03.720 That would be fascinating, I think.
01:05:05.200 All right.
01:05:05.780 Just before we go, if you enjoyed it, could you please rate the podcast?
01:05:10.860 I'm enjoying how Francis is looking at the wrong camera.
01:05:12.820 Yeah, I know.
01:05:13.500 I'm nailing this, yeah.
01:05:15.060 Could you please rate the podcast, five stars,
01:05:18.420 leave a review, or even better, tell a friend
01:05:20.400 that you really enjoyed it.
01:05:21.620 And if someone wants to follow you.
01:05:23.320 At Failing Human.
01:05:24.660 And I'm at Konstantin Kisin.
01:05:26.400 Follow us at TriggerPod on Twitter.
01:05:28.580 Subscribe to our YouTube channel.
01:05:29.840 We're building up our audience.
01:05:31.420 And we'll be back next week with another episode.
01:05:33.180 Thanks very much.