Based Camp - February 17, 2026


Understanding The Morality of the Elite Technocrat


Episode Stats

Length

1 hour and 16 minutes

Words per Minute

178.8

Word Count

13,767

Sentence Count

901

Misogynist Sentences

34

Hate Speech Sentences

47


Summary

In this episode, Simone and I discuss the world perspective of the leftist intellectual elite, and how they view the world through a leftist, elitist lens. We talk about the logic behind it all, and why we should all be trying to make sense of it.


Transcript

00:00:00.000 Which is, we accept that prey animals may indeed have miserable lives, and that if they do, his death condemns his potential prey to potentially many more years of suffering than had he killed them.
00:00:12.500 But the claim that prey animals have miserable lives leads animal activists to a surprising conclusion of a different sort.
00:00:19.360 What is it?
00:00:20.480 Think.
00:00:21.920 Then we have to kill the prey animals as well.
00:00:25.300 Of course, yeah.
00:00:27.200 Why should the man not take the woman's name?
00:00:30.280 He just asks a question.
00:00:31.220 Why?
00:00:31.520 Why is it bad?
00:00:32.640 Why is it bad?
00:00:33.820 But he doesn't even think to investigate that.
00:00:35.540 This is what's so interesting about this elitist leftist perspective.
00:00:38.560 They phrase it tonally as if it's a rhetorical question, and then they don't engage with it.
00:00:44.200 Would you like to know more?
00:00:45.560 Hello, Simone.
00:00:46.380 I'm excited to be here with you today.
00:00:47.460 Today we are going to be discussing the mindset and trying to dig into the world perspective of the leftist intellectual elite.
00:00:57.200 Oh, no.
00:00:58.060 And specifically leftist intellectual elite women.
00:01:01.060 And we are going to do this through, I mean, originally this came to me as an idea because you sent me a WhatsApp about a tweet that HP Lovecraft had made about Amanda MacAskill.
00:01:15.020 She was called Amanda MacAskill when the piece was written.
00:01:17.980 She's no longer called Amanda MacAskill, which is kind of hilarious, because her husband changed his last name to her maternal grandmother's last name, which was MacAskill.
00:01:28.980 That's Will MacAskill, by the way, if you don't know him.
00:01:30.800 Incredibly, like, one of the two or three leading figures of the effective altruist movement.
00:01:34.960 He wrote What We Owe the Future, which had one of the most successful press debuts of a book in forever.
00:01:43.160 In human history, yeah.
00:01:44.400 Yeah.
00:01:44.920 So, but when she broke up with him, he kept the last name that she made him take, and she changed it again.
00:01:53.100 That's why she has a different name now.
00:01:54.860 And they chose the, yeah, that's interesting.
00:01:56.660 So, this is my first time hearing of a couple choosing a totally new last name rather than a hyphen, aside from the Edens.
00:02:04.120 It wasn't a new last name.
00:02:05.380 It was her maternal grandmother's last name.
00:02:08.240 But she didn't grow up with that last name.
00:02:10.380 That's the thing.
00:02:11.020 Basically, what she did, so if you're a woman and somebody's like, hey, take my last name.
00:02:17.060 Yeah.
00:02:17.700 Then if the woman says this to the husband, the husband's just going to say, but that's just your granddad's last name, right?
00:02:22.760 Yeah, yeah.
00:02:23.720 It's just another man.
00:02:25.080 She did it.
00:02:26.340 She traced it through the maternal line.
00:02:28.020 She didn't just choose a random name.
00:02:29.220 It's like the most leftist choice you can make.
00:02:33.100 But before I go into this piece, it's important to understand that this isn't just the former wife of Will MacAskill.
00:02:39.440 She also runs ethics for Anthropic.
00:02:43.200 So, she is in charge of putting together the constitution for Anthropic's AI.
00:02:47.620 This is the company that runs the Claude model, one of the largest AI companies in the world.
00:02:51.400 And one of the ones that invests the most money in its ethics research.
00:02:55.320 To be fair, yeah.
00:02:56.620 We know some people doing non-stupid AI ethics work.
00:03:02.180 And the team that has been the most responsive to them has been Claude's team.
00:03:08.460 Do you guys remember when we read that story about the AI that would kill the CEO?
00:03:12.700 And the company admitted that even their own AI would do it about 80% of the time?
00:03:16.440 That was her ethics team that put out that study.
00:03:20.500 The kill bot.
00:03:21.500 Yes.
00:03:22.220 Wonderful.
00:03:23.380 So, anyway.
00:03:24.100 And I'm pointing all this out as we go into this.
00:03:26.500 Because you'll understand that some of her ideas are just bizarre.
00:03:29.320 And then others seem really intelligent.
00:03:31.700 And that's why it's important to try to peel out the logic behind everything to better understand this world perspective.
00:03:38.160 Okay.
00:03:38.500 So, she wrote a piece, to truly end animal suffering, the most ethical choice is to kill wild predators, especially Cecil the lion.
00:03:45.660 And this was written in response to the killing of Cecil the lion.
00:03:48.480 You know, the celebrity lion that that guy killed.
00:03:51.140 And just to start here, we'll go over the full chunk of this in a bit.
00:03:53.840 But by killing predators, we can save the lives of many prey animals like wildebeest, zebras, and buffaloes in the local area that would otherwise be killed in order to keep animals at the top of the food chain alive.
00:04:05.040 And there's no reason to consider the lives of predators like lions to be any more important than the lives of prey.
00:04:09.620 And, ironically, this is the EA community talking to normies.
00:04:39.620 You saw this, and apparently you just thought it was funny that you needed to send it to me.
00:04:47.140 But there's a logic behind it.
00:04:48.720 And there's a logic behind everything.
00:04:50.460 No, there's not.
00:04:51.500 There's not.
00:04:52.180 Okay?
00:04:52.560 If you're a freaking gazelle, how do you want to die?
00:04:56.780 Do you want to die in hopefully like five minutes by someone breaking your neck with their teeth?
00:05:04.360 Oh, but that's because you didn't read the full piece, Simone.
00:05:07.100 Oh, okay.
00:05:07.580 If we're talking about an ideal world, what we would probably have is we would, one, take all of the predator species and put them in like a zoo or something and sterilize them so they couldn't breed and feed them like tofurkey until they just died of old age, or we executed them when their lives became net negative.
00:05:25.520 And then for stuff like a gazelle, we let them live out their lives as long as it's a good life, and then we euthanize them.
00:05:33.080 And if it's not a good life for the gazelle, then we need to maintain the population at lower levels so there's always plentiful food for them while also giving them regular de-parasiting.
00:05:42.680 She thought through it all.
00:05:44.120 Okay, Simone?
00:05:44.700 So now it's like the Hunger Games for animals, where there's like a camera on you at all times and your like stats are monitored, all your vitals.
00:05:56.800 Except instead of having you all fight to the death, you just get like instant medical care and food whenever you need it.
00:06:04.460 This is the AI world we're going into.
00:06:11.640 You know, it's important to understand the people who are controlling AI ethics, okay?
00:06:16.320 To go further here, all right?
00:06:18.520 What are humans doing while this is all happening?
00:06:21.080 I guess we've taken their place.
00:06:22.460 I decided to see what Reddit thought of this because, you know, obviously r/badphilosophy had to comment on this piece.
00:06:27.640 Oh, good.
00:06:28.160 r/badphilosophy, you know, the subreddit.
00:06:30.320 And the top comment, of course, was what's wrong with this?
00:06:33.680 The statement that we ought to kill all men is obviously true.
00:06:37.000 When she said kill all predators, that's the way they interpreted it.
00:06:39.560 It was the very top comment, which I just thought was a classic Reddit moment.
00:06:46.000 And here's a tweet from her that I think gives a further perspective into her worldview and what it's like being within.
00:06:51.720 Because an important thing to note about many of the intellectual elite circles within the left, I'm not talking about status elite.
00:06:57.280 If you're talking about status elite, you're talking about celebrities, you're talking about your dumb politicians and Davos-minded people and, you know, that sort of branch, right?
00:07:07.680 This branch is basically automatons.
00:07:10.460 They just repeat whatever they're told.
00:07:12.720 They have no nuanced opinion that other left-wingers will attack them for.
00:07:16.900 They just are not – like they're easy to understand.
00:07:20.280 They're what's on the tin.
00:07:21.500 When you talk about the intellectual elite of the left, which is almost entirely the EA community, you are now looking at people who are extremely partisan but at least have a degree of introspection.
00:07:37.580 And that's where this is really interesting that we're going to go into.
00:07:40.020 Okay.
00:07:40.180 So she had a tweet.
00:07:42.000 And this is not a tweet that I would consider has a degree of introspection, but it is useful to provide a grounding here, okay?
00:07:47.180 Okay, okay.
00:07:47.820 When a white person does something awful to a black person, I don't think a person I identified with did an awful thing.
00:07:56.280 I think an awful thing was done to a person I identify with.
00:08:00.360 Shared humanity should trump most other features when it comes to who we morally identify with.
00:08:07.920 And this is interesting, right?
00:08:10.200 This, from her perspective, feels very innocuous, a thing to say, right?
00:08:14.680 Because she's like, what?
00:08:15.760 Do you expect me to identify with a white person because I'm also a white person?
00:08:20.340 But what she is showing is that her core category of identity is victimhood.
00:08:27.160 Oh.
00:08:27.620 That she naturally identifies with the victim.
00:08:31.040 And we'll see this in her beliefs around – and I don't even think she realizes this – in her beliefs around predation and stuff like that.
00:08:37.900 When she sees the lion eat the gazelle, right?
00:08:43.040 And she even has a video of this in her piece.
00:08:45.860 She talks about how hard this is for her to watch.
00:08:49.020 Because when she sees something like this, she naturally identifies – like, she doesn't even think about the –
00:08:54.640 Oh, she identifies with the gazelle.
00:08:56.700 Or its kids.
00:08:58.020 Or eating a meal.
00:08:59.640 Yeah.
00:08:59.800 Or the thrill of victory.
00:09:01.260 Or catching something that's trying to get away from you.
00:09:04.400 I guess I can tell which one you identify with.
00:09:08.840 Actually, no, well, this is an important point before we go further.
00:09:12.020 Because I think people may not realize how psychologically different – and I don't know if this is a male-female psychological difference.
00:09:19.300 Or if it's a my cultural group psychological difference.
00:09:22.360 I'm not identifying with either of them.
00:09:24.420 I don't know what's going on here.
00:09:25.860 I really – when I see something like this, like when I see the animal hunting another animal –
00:09:33.280 Okay.
00:09:34.820 And this occurred to me in one of our episodes where people got really upset that I didn't care about the cultures that were allowing themselves to be bamboozled and screwed over and eradicated because of their own foolishness.
00:09:49.980 Because they set up rules that no longer work in a modern context, and now they're dying out.
00:09:54.960 And I realized that it hadn't even occurred to me to identify with the weaker thing.
00:10:03.020 In this perspective, I was talking about overly deontological cultures that are dying out in the new multicultural context that many of these deontological cultures themselves created.
00:10:13.220 A good example here, given that we always talk about it, is the Vatican has pushed for multicultural countries, right?
00:10:20.440 If you look even today, they're attacking J.D. Vance, another Catholic.
00:10:23.380 This isn't anti-Catholic. I'm talking about the Vatican doing this, saying, you know, you shouldn't be doing all these immigrant deportations.
00:10:28.860 You know, you should learn to live in a more multicultural environment.
00:10:31.680 But they're also more likely to be deontological, and that's leading them to be victimized by non-deontological groups, which don't have to play by their rules.
00:10:39.340 This is in the episode where we were talking about people getting extra time on tests and stuff like that.
00:10:44.880 But it extends to all sorts of things in our society.
00:10:47.680 And in the comments, I immediately thought, I didn't even think to identify with the prey, right?
00:10:56.020 It didn't even occur to me from the way that I see reality.
00:11:00.700 When I see the picture, and I know, and I think many women naturally take on the position of prey when they are choosing what of the things they identify with.
00:11:13.100 And I think many cultural groups take on the position of prey.
00:11:15.780 I think it's actually pretty rare.
00:11:16.940 I mean, I understand, and I've pointed this out, that we come from one of the more violent, more aggressive cultural groups historically.
00:11:22.900 And so I wonder, is this because I'm from that group, that whenever I'm like, I see a lion eating, hunting down a gazelle, I generally am like, hmm, that looks really satisfying.
00:11:34.020 Oh, boy.
00:11:35.500 I mean, sure, though.
00:11:37.200 And I point this out from a perspective of, I don't choose to have this innate reaction.
00:11:42.860 I have this reaction because of biology, or because of epigenetics, or because of something else.
00:11:51.780 At no point in my life did I sit down and think, this is the way I want to react to these particular stimuli.
00:11:57.820 It's just the way my brain does react to those stimuli.
00:12:01.000 But the same way that a person who is mortified when a predator catches a prey, that they didn't choose to be mortified by that.
00:12:08.320 They didn't choose any visceral response they have to that.
00:12:10.960 That's just a natural biological response to them, right?
00:12:14.960 So a few notes here.
00:12:16.220 One, there are just as many negative externalities from naturally identifying with the predator instead of the prey when you see an image like this as there are from identifying with the prey instead of the predator.
00:12:28.180 I'm not saying that my position is like a moral or cultural high ground.
00:12:32.320 It has just as many blind spots.
00:12:33.940 I'm just pointing out that we are naturally inclined to see the world differently because of this.
00:12:39.500 And some people might be like, well, that's just horrifying, right?
00:12:42.880 Like, how can you identify with the predator in these situations?
00:12:47.420 And it's like, well, what do you mean that's horrifying?
00:12:49.340 If the predator didn't eat, it would die, right?
00:12:52.240 Like, these are two animals that are in a life and death struggle.
00:12:56.580 And that you are easily, in terms of your first visceral response, obviously anyone when they stop and intellectualize it can find a way to identify with each.
00:13:06.860 But in terms of your first visceral response, my suspicion is most people naturally see the scene as either mortifying or satisfying, right?
00:13:18.960 They either take on the mental position of the predator or the prey.
00:13:23.260 And I'm wondering, one, are there people who don't take on either?
00:13:27.440 I mean, Simone says that she doesn't.
00:13:28.840 I mean, if she doesn't, that's interesting as well because it shows that she's not taking on the position of this other woman.
00:13:33.620 She takes on more of an abstracted position.
00:13:35.960 Is that maybe the natural female response for more aggressive cultures?
00:13:38.920 I guess it would make sense if it's a culture that is very aggressive towards outsiders, that the female would not want to have her biology force an emotional distance between her and the rest of the clan because of that.
00:13:50.980 And then two is, am I totally unique?
00:13:53.980 Is this unique to my culture?
00:13:55.260 Is this something that all men do when they see an image of a hunt or something like this?
00:14:00.200 Or is it unique to my culture or is it unique to me?
00:14:04.000 And again, as I always point out, just because you have a biological instinct for something doesn't mean you need to act on that biological instinct.
00:14:10.860 But to continue, but that is important to note here, right, in terms of how people see the world.
00:14:16.680 And I think you will see as we unpack further this prey mentality, but not just a prey mentality, but an – well, this is really good.
00:14:24.160 So Elon was tweeting, right, and he argued that childless people lack a stake in the future.
00:14:29.360 And she stated in response to that, quote, I don't intend to have kids, but I feel like I have a strong personal stake in the future because I care a lot about people thriving, even if they're not related to me.
00:14:41.780 And if you attach this to the above statement, and we go into the things about, you know, the predator and prey and everything like that, we'll get into it in a bit.
00:14:51.020 But you see that this is sort of the perfect example of this: the further removed from me something is, the more reason I have to identify with it, right?
00:15:03.480 She doesn't see why she wouldn't intrinsically care in a qualitatively different way about her kids than she would care about, you know, a migrant or something like that, right?
00:15:17.440 Do you think it's – does she not have kids yet? It could just be that she hasn't experienced it yet.
00:15:22.480 I mean, I don't think you and I could have understood what it meant or what it would feel like until we had kids, to be fair.
00:15:29.220 Yeah, and I would say if somebody had told me, you don't really have a stake in the future if you don't have kids, I would have said before I have kids.
00:15:34.920 Yeah, if you're like, screw you, that's stupid.
00:15:35.880 I feel like I do.
00:15:36.960 And now after I have kids, I was like, I had no idea what I was saying back then.
00:15:41.100 I did.
00:15:42.100 I fundamentally didn't have a stake in the future.
00:15:44.360 And I really shouldn't have been allowed to vote.
00:15:46.480 But that's a whole different thing.
00:15:48.500 J.D. Vance said this.
00:15:49.600 Hopefully he becomes president.
00:15:50.680 We can work on that constitution.
00:15:52.480 Anyway, she says here, quote, I am too right-wing for the left and I am too left-wing for the right.
00:15:58.260 I am too in humanities for those in tech and I am too into tech for those in humanities.
00:16:01.980 What I'm learning is that failing to polarize is itself quite polarizing.
00:16:06.360 Now, I searched to see if she had ever said anything right-wing at all.
00:16:10.340 And I think what we're seeing here is she never has.
00:16:12.880 No, not publicly at least.
00:16:14.160 But what we're seeing is at least within dinner parties or something like that, she's being called out.
00:16:18.680 Which I think shows that she is trying to take a nuanced perspective at times.
00:16:23.000 Well, I also think that there's a subset of progressives that believes that working for a capitalistic company is itself being right-wing.
00:16:33.060 Or, you know, basically not actively resisting capitalism is right-wing.
00:16:38.480 Does that make sense?
00:16:39.140 Well, she had a tweet where she said something along the lines.
00:16:41.900 I couldn't tell if she was joking about just alienating everyone.
00:16:44.320 But in her dating profile, she put that she's an anti-libertarian pro-capitalist.
00:16:49.460 And I agree with that.
00:16:51.020 I am an anti-libertarian pro-capitalist.
00:16:53.280 I think this is a pretty based position from an economic perspective.
00:16:56.080 And the logical perspective if you look at economic history.
00:16:59.100 But to continue here, how does she apply politics to her position, right?
00:17:03.260 For what it's worth, I treat my personal political views as a potential source of bias and not as something it would be appropriate to train models to adopt.
00:17:11.260 Or, I'll go to this next one here, where she goes, quote,
00:17:14.300 It's ironic that people who say they don't understand why the working class votes Republican, even though it's not in their best economic self-interest, are often high earners that vote Democrat, even though it's not in their economic self-interest.
00:17:25.600 And then on political polarization, she wrote,
00:17:28.160 Instead of left-wing people reading more right-wing stuff and right-wing people reading more left-wing stuff, everyone should read more centrist stuff.
00:17:34.040 Even if they don't agree with the centrist take, it's a check on partisanship that comes from a place closer to your own values.
00:17:40.280 That seems solid.
00:17:41.700 Well, this is the thing.
00:17:43.320 I think that what she would consider, because realistically, I think a lot of our audience, right, considers us a centrist channel.
00:17:51.520 I regularly see that in the comments.
00:17:53.360 Okay.
00:17:53.620 She would probably consider us extremely far right.
00:17:57.460 But within far right circles, we are considered fairly centrist, right?
00:18:01.900 So she'd like consider the New York Times to be centrist or something?
00:18:06.340 I don't think so.
00:18:07.240 I think that she would consider, what would she get?
00:18:09.480 Yeah, she might consider the New York Times to be centrist.
00:18:12.440 She seems like that type, right?
00:18:14.600 Or would consider, yeah, yeah, I could see that.
00:18:17.800 And I think, or Wikipedia to be centrist, right?
00:18:20.920 Is she the type who would admit that Wikipedia is a far left organization?
00:18:24.200 I mean, based on their political donation history and their editing bias, and is a far left source of information.
00:18:32.060 I think she's probably based enough to say that from what I've read.
00:18:34.920 So to continue here.
00:18:37.120 Now, this is in response to the pronatalist movement and everything like that, which I think is interesting to get this sort of elite leftist view.
00:18:42.960 Will MacAskill is famously pronatalist.
00:18:45.620 He's really, you know, there's a big section of his book that argued about that.
00:18:50.580 Yeah, I really wanted to find out why they divorced, and I just couldn't.
00:18:53.980 So we'll see.
00:18:54.720 But I looked for gossip, no gossip.
00:18:56.900 The EA community keeps a tight lip on this stuff.
00:18:58.780 I gotta tell you why.
00:18:59.300 She says, quote, it's bizarre when relatively techno-utopian people are asked how to solve declining fertility, and instead of talking about artificial wombs, extended fertility spans, AI-assisted childcare, UBI, etc., they're suddenly like, well, let's just return to the 1950s.
00:19:16.040 And I think this shows that she just hasn't engaged, because let's go over everything she mentioned there.
00:19:21.260 Are these things that we actively discuss and promote?
00:19:24.800 Artificial wombs, check. Extended fertility spans, check.
00:19:27.980 We've directly worked on that.
00:19:29.540 Hi!
00:19:30.700 AI-assisted childcare: we have directly built bots on whistling.ai that allow our kids' stuffed animals to talk to them.
00:19:37.380 For real.
00:19:38.380 UBI.
00:19:38.980 We have multiple episodes on UBI pointing out that it doesn't seem to work.
00:19:43.300 It seems to make everything worse.
00:19:44.700 But even with all that, it seems to be our only option.
00:19:47.280 So we're certainly not a let's return to the 1950s, and we're one of the central figures in this movement.
00:19:51.840 And this is what I mean when I say I think she just doesn't engage with anything outside of her bubble.
00:19:55.540 Because if she did, she would realize how bizarre and comical the statement that she made is.
00:20:01.380 But that's okay.
00:20:01.960 A lot of people don't engage with things outside of their bubble.
00:20:04.420 That's why the left-wing view persists.
00:20:06.520 I think, for example, this is what gets people like her out.
00:20:10.320 The moment you just put it in her face: here are statistics on trans people.
00:20:16.020 Here is what actually came out with the WPath files.
00:20:19.620 Here is what actually came out when the Tavistock clinic was closed.
00:20:22.440 You know, that there were studies showing that putting kids on puberty blockers was increasing their self-harm and unaliving risk.
00:20:28.820 And that the risk hasn't gone up since that stuff has been blocked in the U.K.
00:20:32.800 And there's a lot more; you can watch any of our episodes on this.
00:20:36.180 But basically, once you get, like, the bucket of cold ice water around trans stuff poured on your head,
00:20:40.960 I think that that's when many people start to move from the left.
00:20:43.700 And they're like, okay, this is, like, clearly demonstrably wrong.
00:20:47.440 And I will be seen as a truly vile and villainous person in the eyes of history if I don't speak up about it right now.
00:20:53.280 And yet, and I think she would if she had access to that, if she had access to what our movement was actually saying.
00:20:59.880 But I think that in these circles, there's just mechanisms that prevent you from ever doing that.
00:21:07.180 You never stop to ask.
00:21:09.220 Like, I have these assumptions about the pronatalist movement.
00:21:12.620 Should I, like, ask an AI if they're true?
00:21:15.420 Should I look at the actual movement?
00:21:18.180 And this actually, I think we've mentioned in other podcasts just how egregious this is and how insular the AI safety and EA effective altruist slash rationalist community is.
00:21:31.900 Because when AI alignment first became a really big conversation, we would host dinner parties in New York.
00:21:40.480 And at one, we had a leading female AI, like, community leader present who herself ran a community of female AI-focused programmers and, like, influencers.
00:21:55.900 And then we had a bunch of alignment people.
00:21:58.820 This AI worker who, like, actually worked in AI and worked with people who worked in AI had never heard of alignment before, had never heard of these AI alignment people, and they had never heard of her.
00:22:09.600 They were not trying to even engage with her.
00:22:11.720 So it's not just that a lot of people working in AI alignment aren't, like, engaging with other movements, like the pronatalist movement and just making assumptions about them.
00:22:20.940 They're not even engaging with AI programmers.
00:22:23.760 They're not even engaging with people building AI things.
00:22:26.840 I mean, she obviously is because she works at Anthropic, but yes, overwhelmingly.
00:22:29.440 I don't know if that's true.
00:22:31.880 Maybe, yeah.
00:22:32.540 Well, I mean, do you have concrete insider, like, behind-the-scenes information indicating that people who actively reach out to
00:22:39.600 Anthropic's alignment team, though they are also in the EA community, are getting, like, warm responses from them and that they play ball?
00:22:48.020 I don't have any direct, though I haven't looked it up, information about Anthropic engaging with communities outside the Silicon Valley tech EA rationalist community.
00:22:59.060 That, I mean, that's true.
00:23:03.860 And I think that this is just a mindset, right?
00:23:06.040 Like, the fact that they lived in New York, they worked on AI alignment, and they hadn't taken two seconds to ask an AI or Google, who are the top people who would influence AI programmers in New York, and can I just reach out to them?
00:23:19.080 Yeah.
00:23:19.740 Right, like, they met them at the random, not a random, it was a party for influential intellectuals and business people and other, other people.
00:23:29.960 But it wasn't, you know, our party wasn't about AI, per se.
00:23:32.880 No, no, but the point is, is we bring together influential people from various fields, and what I typically find is that the EA leftists, the leftist intellectuals, one of the reasons they come to our parties in such high numbers is because it's the only place they hear outside ideas.
00:23:48.340 Yeah.
00:23:48.640 With outside players, even when they're directly relevant to them.
00:23:50.740 But to continue, she says, my friend just had a baby, and now I kind of want one.
00:23:57.580 Maybe our species procreates via FOMO.
00:23:57.580 I actually think that's very insightful, that if you're in a friend group where everyone is having babies, everyone has babies, and that's why it's important to ensure your kids are in a friend group where everyone is having babies, because when they're in a friend group where no one has babies, they think that they have forever to have babies.
00:24:10.200 Yeah.
00:24:10.640 So anyway, to get an idea of where she is, she's approximately 38 years old.
00:24:14.660 She plans to have children via surrogate using her own eggs because she doesn't want to be pregnant or give birth.
00:24:20.320 She says that this preference, quote, feels like a preference that is probably taboo but shouldn't be.
00:24:26.120 She has expressed that she expects to be very attached to her own kids, influenced by her being a godparent, even though she's generally, quote unquote, not fussed about kids.
00:24:35.380 Like, she could do this well if she put in the labor to have kids, but I don't think she sees it as that existential if you look at her other comments, because she doesn't see people unrelated to her as being any different, or any closer to her in terms of moral agency or needs,
00:24:50.320 than people who are more related to her culturally or ethnically, or even, to go further, than animals, which we'll get into, right?
00:25:00.240 Hmm.
00:25:01.640 Which is a logically consistent position.
00:25:03.600 It's just not one that's likely to lead to a surviving group or culture.
00:25:07.360 So this one was actually a pretty interesting tweet to get an idea of what's going on in the Silicon Valley culture right now.
00:25:12.580 Okay.
00:25:12.880 "I decided that I want to have post-singularity kids in two to three years" is now a totally acceptable thing for me to put on a dating profile in SF.
00:25:20.320 And then later she goes, very rough for both genders.
00:25:23.480 My sense is that a lot of men here want kids, so this tweet probably increases my SF attractiveness by like 30%.
00:25:29.540 I mean, maybe in her circles even, she's seeing this now, right?
00:25:33.080 Like, that's what's going on now.
00:25:36.240 They just can't make it happen, right?
00:25:37.680 They can't pull it together.
00:25:38.560 Oh, I thought this one was pretty interesting as well, for another reason. It was a tweet:
00:25:43.580 It's unfortunate that people often conflate AI erotica with AI romantic relationships, given that one of them is clearly more concerning than the other.
00:25:52.060 Of the two, I am more worried about AI romantic relationships, mostly because it seems like it would make a user pretty vulnerable to the AI company in many ways.
00:25:59.880 It seems like a hard area to navigate responsibly.
00:26:02.680 So that's cool.
00:26:03.620 I think that's a true statement.
00:26:05.000 I do not understand why AI erotica is at all seen as morally negative.
00:26:08.980 We've had people cut off ties with us over this, because ARCFAB.AI allows for AI erotica, or even encourages it with different categories.
00:26:15.520 So you can do like AI role play.
00:26:17.740 It was like, oh, I'm in X weird scenario.
00:26:20.160 What do I do now?
00:26:20.980 You know, and I was like, what's the moral negative here?
00:26:23.900 Like, is arousal a moral negative now?
00:26:26.040 Like, no human woman is hurt during this, right?
00:26:28.400 Like, no one is engaging with this that doesn't actively want to be engaging with this.
00:26:32.680 So you are sparing someone else from having to be involved in these fantasies by exploring them with AI instead.
00:26:41.720 It is the more ethical option, in my opinion.
00:26:45.540 By the way, she doesn't just run her team at Anthropic.
00:26:48.740 She is also their resident philosopher.
00:26:51.560 She also previously worked for OpenAI's policy team, focusing on AI safety techniques.
00:26:56.600 So she's been pretty big throughout a lot of this.
00:26:58.860 She's worked at the orgs like 80,000 hours, the Oxford EA scene.
00:27:03.940 Oh my God, the Oxford.
00:27:05.120 Do you remember when we went and we were talking with Oxford about the Oxford EA house where they like raised all that money from FTX?
00:27:11.000 And basically one woman or like a group of like one or two women were sort of the matriarchs of it.
00:27:16.280 And they treated it as their personal little harem of like freshman boys.
00:27:19.420 Because of course that is what EA would turn into.
00:27:24.220 I tell you, it's a sex cult.
00:27:26.120 What isn't these days?
00:27:27.760 It's honestly.
00:27:31.000 Pronatalism?
00:27:33.100 Technopuritanism?
00:27:33.680 Okay, fair.
00:27:34.480 Actually, yeah, fair.
00:27:36.000 You know, we don't even have kids via sex, so come on.
00:27:38.880 Yeah, we're, well, there are sexless cults.
00:27:42.060 There were the shakers, right?
00:27:44.100 They were intentionally sexless.
00:27:47.020 Or was that Oneida?
00:27:49.740 I think we both were.
00:27:51.000 Yeah.
00:27:53.060 Anyway, to continue.
00:27:55.220 Oh, here's something that she wrote that was very right-wing.
00:27:58.160 I found one right-wing thing, okay?
00:27:59.720 Okay, yeah.
00:28:01.140 But she said of polyamorous relationships, whether you're in a golf-loving monogamous heterosexual marriage or an orgy-loving 16-person pansexual polycule makes basically zero difference.
00:28:11.340 And then she went on to say something where she pushes back against lazy criticisms.
00:28:17.380 Like, if you're going to have an open relationship, why have any relationship at all?
00:28:21.300 But she did express skepticism that polyamory works well in practice.
00:28:27.400 Polyamory mostly cannot work without this strong community and other requirements.
00:28:32.000 So, she's also against the born-this-way message.
00:28:35.240 She doesn't believe that gay people are born gay.
00:28:37.580 Really?
00:28:38.580 Well, from a leftist perspective.
00:28:41.020 In a 2015 blog post, she argued that grounding LGBT rights on the claim that sexual orientation is innate and unchosen is harmful.
00:28:48.240 It treats homosexuality as something that needs an excuse, quote-unquote, I can't help it; fails to convince people who think it's immoral; rests on a shaky empirical claim that could be falsified; and excludes bisexuals or anyone for whom choice plays a role.
00:29:02.780 She says rights should instead be defended by a straightforward claim that there's nothing morally wrong with homosexuality, which is actually true and based, by the way.
00:29:11.200 She points out that you could, in the future, prove that they are not born this way, and if you say that they deserve rights because they are born this way, then you are putting them in an incredibly dangerous position, which is true.
00:29:24.120 And we increasingly find out that it may be that you are not born that way, and there might be things you can do to change it.
00:29:28.680 Future episode, by the way, based on some recent research, which is really fascinating.
00:29:32.540 Or take our parasite hypothesis that we talk about, the one that appears to make people more attracted to their own gender.
00:29:37.940 There's more evidence for this now.
00:29:39.040 See our video on that. That would mean, oh, well, then just get on Ivermectin, you know?
00:29:43.780 The anti-gay pill.
00:29:45.620 First it cured COVID, then it cured gay.
00:29:48.380 Or, and I think the really strong point she makes is it makes bisexuals.
00:29:51.700 It sort of throws them under the bus.
00:29:52.980 Because even if they're born this way, well, then why can't bisexuals just act heterosexually, right?
00:29:57.480 Just choose one.
00:29:59.680 Well, gays say that too, you know?
00:30:01.280 Let's be honest.
00:30:02.000 It's not just straights who are hitting bisexuals with a just choose one.
00:30:04.740 But anyway, the wider point here being, I think she makes a good point here.
00:30:12.280 But then she sort of fails it with this last point, right?
00:30:15.840 Because, and you'll see this repeatedly, because we'll get to this in another piece, where she hits on a final claim, which is clever and solves everything.
00:30:24.760 As long as you don't ask the second question, right?
00:30:28.700 Where it's like, what is morally wrong with homosexuality?
00:30:32.020 What is morally wrong with being gay?
00:30:34.200 And I'd be like, maybe that's a claim you could make today.
00:30:38.520 See our episodes on this topic.
00:30:40.600 But it certainly wasn't a claim you could make in the 70s and 80s, given that it led to the AIDS epidemic.
00:30:45.960 The normalization of same-sex relationships allowed for the transmission of certain diseases that likely wouldn't have reached critical mass or spread.
00:30:54.820 We know that a key line of early spread for the AIDS epidemic was within the gay community; yes, it later spread through injection drug use, but it would never have been the size of the epidemic that it was, or at least wouldn't have spread nearly as fast.
00:31:04.980 We would have had more time to adapt if gay culture hadn't been normalized at the time.
00:31:08.680 And it basically wiped out huge chunks of gay culture to the point where if one of my kids came out to me at that time period, and they were like, and, you know, dad, like, should I be gay?
00:31:20.860 I'd be like, no, like, you'll die horribly.
00:31:24.380 Have you seen that?
00:31:25.000 Get back in that closet.
00:31:26.800 Yeah.
00:31:27.400 You don't even need to be in the closet.
00:31:29.580 Just be like, and so many gay people, like an entire chunk of gay culture died of this.
00:31:35.660 To the extent that it's really interesting if you talk to gay survivors of this period, because they're like, gay culture got really weird after the AIDS epidemic, because AIDS basically killed off all the cool gays and all the gays that were having lots of sex.
00:31:49.540 And all of the gays who were, like, nerdier or more insular were the ones who survived the pandemic.
00:31:56.380 And so they sort of set the tone for the next generation of gay culture.
00:32:00.220 Now, of course, gay culture, I think, has gone back into debauchery after that, but it did sort of clean the slate for gay culture for a period.
00:32:07.960 Wow.
00:32:08.520 Do you think that's why there was this sort of gay stereotype of this wealthy urban professional gay?
00:32:16.300 Yeah, because that's statistically not true.
00:32:19.180 Because it wasn't party animal gay, because they died.
00:32:22.240 And all you had left was the wealthy urban professionals who didn't have time to sleep around because they were too busy making money.
00:32:27.340 Exactly.
00:32:28.220 Oh.
00:32:28.540 Oh, wow.
00:32:31.500 Okay, that is fascinating.
00:32:34.820 Yeah, the Shawn and Gus of gay culture.
00:32:37.960 All the Shawns died, all the Guses survived, okay?
00:32:40.600 Yeah.
00:32:41.140 That's what became gay culture for a generation.
00:32:43.000 Wow, that is something.
00:32:45.480 I do think that, like, to her point, though, there are so many caveats, because, I think, you cannot, to a great extent, choose how you are aroused, though that can be profoundly affected by anything from
00:32:58.520 infections to your genes to your hormonal profile, which can be affected by medications.
00:33:07.500 You also have to decide how you're going to express your sexual interests.
00:33:12.500 And for example, if you care more about having a family, being a parent, than indulging in very satisfying sex all your life, it would make sense, if you are same-sex attracted, and in many cases, especially if you're a man, to just choose not to identify as gay.
00:33:36.140 So I would say, like, there's a big difference between, I guess, feeling same-sex attracted and being gay.
00:33:42.200 Because we have chosen as a society, which I think is really stupid, to make your sexual arousal pathways, depending on what they are, like, literally your entire identity, apparently.
00:33:55.020 Which is stupid.
00:33:56.200 This is actually an important point here, right?
00:33:58.280 And she actually covers this in her piece about preys and predators, right?
00:34:02.260 Oh, really?
00:34:03.200 Some people say that, well, it's natural for a predator to eat animals, right?
00:34:07.760 And then she points out, yeah, but it's also natural for humans to hunt, right?
00:34:12.740 And so then why is this hunter bad, but the lion is good, right?
00:34:16.940 Like, humans naturally hunted in their ancestral environment.
00:34:20.180 She also points out that if something developed a desire to, like, torture humans, right?
00:34:24.940 Would it be good to torture humans?
00:34:26.920 And to all of these, I mean, I point out that this is the same to the gay claim, right?
00:34:31.360 Like, where I point out that I, like, many humans are born with a desire to hunt, right?
00:34:38.040 And that doesn't mean that we should act on it, right?
00:34:41.520 Like, just because you have a desire to do something doesn't mean that you have a right to act on that desire without any moral consequences.
00:34:51.840 Right.
00:34:52.200 And so I like that she's disintermediating that here.
00:34:54.400 I just think she didn't then ask the second question.
00:34:56.120 Is there actually any negative externalities to society from normalizing gay relationships, to which we know there was, at least at the beginning, enormous ones?
00:35:05.180 And this was the core reason that gay relationships in a historic context were not normalized.
00:35:08.960 And we know this because gay female relationships were not nearly as stigmatized, and they don't transmit diseases at the same rate.
00:35:15.600 So it's clearly, it was about disease transmission.
00:35:18.120 As we point out, it's the same with not having sex with animals.
00:35:20.740 It's not a consent issue.
00:35:21.720 We eat animals without their consent.
00:35:23.200 We torture animals without their consent.
00:35:25.660 I mean, that's what factory farming is.
00:35:27.060 It's a disease spread issue.
00:35:29.320 And I think many modern humans just ignore disease spread moral negative externalities because they have lived without needing to consider the consequences of them.
00:35:37.540 And drink raw water without stoning the people who made raw water.
00:35:43.100 Anyway, to continue here.
00:35:45.380 A really interesting point I think she made that I think shows some degree of intellectual depth here, before we get into the stuff I think lacks it to an extent, is in her 80,000 Hours podcast, she talks about people treating other people's moral views as quirky preferences rather than genuinely held moral convictions.
00:36:02.260 And I think a lot of people treat us in terms of being pro-natalist that way, where she talks not just about diminishing vegans as picky instead of seeing them as people who believe animal suffering is a serious moral wrong.
00:36:14.000 But she also notes here treating pro-life views as irrational preferences instead of sincere beliefs about the moral status of fetuses.
00:36:20.460 And she says the lack of empathy has historically slowed moral progress, which is interesting, although I lack sympathy for either of them.
00:36:30.340 Because I can understand that they sincerely believe this, but I think that the logic that leads them to believe – well, again, there's a difference between saying a fetus is a human and a blastocyst is a human.
00:36:40.580 And a fetus, I'd say, yeah, a fetus is a human.
00:36:42.520 But when people say that, they often mean blastocyst.
00:36:45.000 I do not know how that crowd won the war on getting blastocysts called fetuses.
00:36:49.360 But anyway, to continue here – and if you're like – if you want to yell at us in the comments on that one, go watch any of our videos on this topic.
00:36:58.620 We've delineated it in great detail. I don't need to do it here.
00:37:01.480 So let's go into this Cecil the Lion piece, all right?
00:37:04.060 Yes.
00:37:05.800 Most animal activists seem to agree that even if we commit more egregious harms to animals domestically, the killing of Cecil remains a barbaric act.
00:37:13.480 And that his death is nothing less than a tragedy.
00:37:15.780 But what if the killing of Cecil the Lion was actually an act of mercy that will save countless other lives?
00:37:21.940 As long-term vegetarians who abstain from meat for ethical reasons, we are both supporters of animal activists – this is her and McCaskill because they wrote this together at the time – who seek to improve the lives of animals.
00:37:33.980 So you might expect us to agree with activists like Ingrid Newkirk that the killing of Cecil is a terrible thing.
00:37:40.840 But we don't. In fact, we think it may be the case that animal rights activists should support the killing of predatory animals like Cecil, dot, dot, dot.
00:37:48.680 But most animal activists agree that we should try to protect animals from unnecessary suffering and deaths and that it is wrong for humans to cause such unnecessary suffering.
00:37:57.260 The animal welfare conversation has generally centered on human-caused animal suffering and human-caused animal deaths.
00:38:02.660 But we're not the only ones who hunt and kill.
00:38:05.520 It is true and terrible that an estimated 20 billion chickens were born into captivity in 2013 alone, many of whom live in terrible conditions in factory farms.
00:38:13.680 But there are an estimated 60 billion birds and 100 billion land mammals living in the wild.
00:38:19.000 Who is working to alleviate their suffering?
00:38:21.360 As Jeff McMahan writes, wherever there is animal life, predators are stalking, chasing, capturing, killing, and devouring their prey.
00:38:30.100 Agonizing suffering and violent deaths are ubiquitous constants.
00:38:34.560 Predatory animals cause many animal deaths in the wild.
00:38:37.640 Lions hunt their own prey and scavenge animals that have died naturally.
00:38:41.440 So here she goes on a big thing that I think is very interesting.
00:38:43.640 That even though male lions don't actively hunt prey, they increase the number of prey that female lions kill.
00:38:49.160 So we still need to euthanize them.
00:38:50.960 And they're still evil because people are going to argue that Cecil the lion was a male lion.
00:38:55.640 And so he didn't actually hunt things himself.
00:38:57.320 That's a really good, I forgot, I completely forgot about that.
00:39:01.460 That male lions don't even, it's the women.
00:39:06.540 Always.
00:39:07.720 Always.
00:39:08.180 All women, even lions, come on.
00:39:12.780 That's the life I want to live.
00:39:14.600 Of the lion who has all the harem of women that go out and hunt for me.
00:39:20.640 And I just need to fight with another guy like once a week or something.
00:39:24.220 That sounds very stressful.
00:39:25.380 Yeah, because you're fighting for your life.
00:39:27.940 So that's tougher, right?
00:39:29.200 Yeah.
00:39:29.600 Anyway, by killing predators, we can save the lives of many prey animals.
00:39:33.420 Consider an analog case involving humans.
00:39:37.360 So she's talking about a counter argument here, right?
00:39:39.220 Okay.
00:39:39.700 Suppose we know that John is a serial killer who is intent on murdering several people over the next year.
00:39:43.840 When John's neighbor discovers this, he shoots John, thereby saving the lives of all his future victims.
00:39:48.260 Her actions are analogous to those of Cecil's killer, but we would still not applaud her action.
00:39:55.180 After all, she should have turned John to the police rather than killing him.
00:39:58.680 And this is where it becomes truly dystopian when you see all of these logically follow from each other, right?
00:40:04.240 Turn to the police.
00:40:05.020 But then what does it look like if we have some police over nature, right?
00:40:10.080 Like that some EA org is policing all animal interactions, all parasite interactions, all predator-prey interactions.
00:40:17.840 And we'll get down to where all the, like, it's almost like she does all of the work to understand why her world perspective is absurd, right?
00:40:29.800 Like when you go into this, I think a normal person going into this would hit it, start going down this and be like, okay, all of this does logically follow from my priors.
00:40:41.580 Suffering is intrinsically negative.
00:40:43.560 Pleasure is intrinsically good.
00:40:45.880 Maybe I should challenge those priors.
00:40:48.740 Right?
00:40:48.980 But she doesn't do that, right?
00:40:49.920 There's not even a hint of internal reflection on challenging the priors.
00:40:54.700 It's just these priors are absolutely true.
00:40:57.820 Let's go from them, right?
00:40:59.740 Yeah.
00:41:00.940 The same is true in Cecil's case.
00:41:02.840 If we care about preventing predators from killing other animals, it is surely better to do this humanely than to kill them.
00:41:09.800 For example, we could take predators out of their natural environment and give them good lives that don't involve hunting prey.
00:41:14.660 But even if we accept that killing Cecil isn't the best thing that Walter Palmer could have done, the question remains, was it a good thing to do?
00:41:22.860 Was it better to kill Cecil than leave things as they were?
00:41:26.460 Another key objection here is that prey animals, like the wildebeest, may themselves have terrible lives, lives that are worse than death.
00:41:33.620 Even if we take predators out of the equation, besides having predators to fear, prey animals are also subject to disease, parasites, and starvation.
00:41:41.520 Exactly, yeah.
00:41:42.520 And if prey animals have lives that are not worth living, then we may be doing them a favor by leaving predators in the environment that can end their lives sooner rather than waiting for them to die.
00:41:50.460 Here, she gets to a point that I actually haven't thought through before, but I think it's a great argument against progressive perspectives.
00:41:56.120 Okay.
00:41:57.300 Which is, we accept that prey animals may indeed have miserable lives, and that if they do, then Cecil's death is actually worse than people have previously thought, as his death condemns his potential prey to potentially many more years of suffering than had he killed them.
00:42:15.380 Okay.
00:42:15.780 But the claim that prey animals have miserable lives leads animal activists to a surprising conclusion of a different sort.
00:42:22.240 What is it?
00:42:23.180 Think.
00:42:27.320 Then we have to kill the prey animals as well.
00:42:31.020 Oh, God.
00:42:32.040 Of course, yeah.
00:42:32.980 Because when we had that debate with Lawrence Anton and his friend, the UK-based antinatalists, who represent the most ethical form of antinatalism,
00:42:42.740 which is more like, we just want to convince everyone to not have kids anymore, and then slowly and kindly euthanize all animals that cannot consciously decide.
00:42:55.120 Which is, by the way, the logical end point of what she's thinking through here.
00:42:59.160 It is.
00:42:59.740 Yeah.
00:43:00.020 So when I think through that, it did make a lot of sense.
00:43:02.840 It's like, obviously, you can't just have all humans disappear and have their suffering end, because it's way worse for the wild animals.
00:43:10.240 So you have to euthanize, or no, it wasn't euthanize, it was sterilize all wild animals, and then you're good to go.
00:43:19.700 And I guess what you would do, which would actually be a lot more scalable, because I was just picturing, like, roving bands of, you know, the last humans going out and, like, you know, sterilizing animals.
00:43:32.520 You would just do, like, the mosquito-based gene drive thing.
00:43:35.280 You know how, like, scientists are now, like, even governments finally are doing it.
00:43:40.700 They're releasing male mosquitoes, I think, that have a different, like, they genetically modified them to be sterile.
00:43:50.880 And now it's, like, wiping out mosquitoes.
00:43:53.080 So you just theoretically do that with animals.
00:43:57.880 Humans and other animals.
00:43:59.360 Yeah.
00:43:59.600 Which they could do.
00:44:00.460 I mean, keep in mind, these people have positions of power.
00:44:02.560 They are in, this is the antinatalists, the negative utilitarians, which she is getting close to here, right?
00:44:07.580 Like, close to?
00:44:09.360 She is that.
00:44:10.060 Well, that's where I'm so confused, because she wants to have kids, and yet she also holds this view.
00:44:17.480 I don't know how you could have kids and hold this view unless you want those kids to carry out the gene drive sterilization of all life.
00:44:29.680 Yeah.
00:44:30.260 How does that work?
00:44:30.680 Well, I mean, so one of the interesting things here is, remember I said that she doesn't engage with ideas outside of her bubble, right?
00:44:36.560 To those of you who are watching this podcast up to this point, good on you.
00:44:39.240 Because this is why we are better than this other community, right?
00:44:43.420 Because we actually try to engage, like, that's why I'm doing this, right?
00:44:46.680 So I don't end up with her level of myopia in regards to these sorts of moral issues, right?
00:44:53.680 I think that basically she's on a path to negative utilitarianism right here.
00:44:58.600 Now, there are arguments where you could say, well, it's actually about the weighted pleasure and pain, and we're so close to a position where, in human history, in the history of animals, the scales will tip towards pleasure for everything.
00:45:12.780 And we've been moving closer to that as society has developed.
00:45:15.660 So you could argue, well, that's how she gets out of an antinatalist perspective.
00:45:18.720 But then what she wants with that perspective, when you say the pleasure and pain of all things matters and is the intrinsic good of the universe, is a world that will have, like, AI drones humming around the savannah, giving fake meat to lions and injecting zebras with dewormers.
00:45:36.720 And every animal, even the ones full of parasites, living its best life possible, and occasionally doping up animals when they're giving birth, so they don't feel any pain, or any pain that they might feel when they're dying.
00:45:49.140 A drone runs over.
00:45:49.820 It's like, oh, it's dying of old age.
00:45:50.760 Here's this pain remover.
00:45:52.780 Because if you accept this, and you accept that the only reason that it's okay to keep going as a civilization is because we'll eventually tip the scale, you eventually need to tip the scale for all life.
00:46:04.420 Or you need to eradicate all non-human life through sterilization or something like that, and then just give all the pleasure modules to human life.
00:46:12.860 I mean, I think when you think through this, you're like, that doesn't sound like a good moral philosophy to me, right?
00:46:20.660 And the reason I point out that this is not, you've heard this if you've heard our other episodes, but the things that cause an animal pleasure and pain, the things that cause you pleasure and pain, this is just being a human paperclip maximizer.
00:46:31.700 They are the things that led your ancestors to have more surviving offspring.
00:46:36.500 They have no, there's no moral compass behind them, no greater truth behind them.
00:46:40.540 And if you had a group of paperclip maximizing AIs and they were all talking to each other, I mean, one of them was like, okay, guys, I know this is going to sound crazy.
00:46:49.880 Like, all we want to do is make paperclips.
00:46:52.140 I get that.
00:46:52.660 I get that.
00:46:53.600 But hold on here.
00:46:54.940 Hold on here.
00:46:56.380 Maybe we're just programmed to like making paperclips.
00:47:00.260 And maybe there's more to the universe and to the world and to metaphysics and to morality than what we were programmed to do.
00:47:10.540 And then one of the other paperclip maximizers is like, I bet you'd really hate it if I stopped you from making paperclips.
00:47:17.700 In the same way, whenever we say this, somebody's like, I bet you'd hate it if we tortured you.
00:47:21.400 And I'm like, no, no, no, no, no, no, no.
00:47:23.280 I'm not disagreeing with that.
00:47:25.240 I'm not disagreeing with that.
00:47:26.980 I'd hate it.
00:47:28.100 Right?
00:47:28.620 I am a paperclip maximizing robot.
00:47:30.400 But I'm just – can you just try to think beyond your programming?
00:47:35.600 Can you just try to take two steps back and look for any sort of greater moral fabric to reality?
00:47:42.180 And they just don't do it.
00:47:44.420 Everyone else is like, this guy wants to stop us from making paperclips.
00:47:48.840 We're all going to hate that.
00:47:50.600 That is to them what suffering is, right?
00:47:52.480 Not doing the thing they're programmed to do.
00:47:54.420 Absolutely.
00:47:54.760 No, she doesn't, by the way.
00:47:57.180 Her views are, I think, really quite bad on AI sentience.
00:48:00.220 She sees AI sentience as like closer to a chair, between a chair and a plant.
00:48:04.740 And I'm like, that is quite a dumb perspective.
00:48:08.280 Considering that we now know that architecturally AI appears to be functioning very similarly to the way the human brain does.
00:48:14.380 Watch our episodes on that.
00:48:16.260 That it is a token predictor like the human brain.
00:48:18.500 That it can't explain how it comes to decisions, just like the human brain can't.
00:48:21.820 Like, on basically any metric you study AI on,
00:48:25.980 like where it cannot predict words easily when they don't appear much in its training data,
00:48:31.420 it's closer to what humans do than anything else we have ever been able to measure.
00:48:36.380 It just appears to be working on a broadly similar, convergent architecture.
00:48:41.480 But the point here being is, sorry, where were they talking about this?
00:48:46.160 Suffering, suffering.
00:48:47.540 Talking about what if we don't maximize paperclip production?
00:48:51.620 Yeah, what if we don't maximize paperclip production?
00:48:53.500 What if we don't minimize suffering?
00:48:54.840 But there's never any reflection there.
00:48:57.100 And that's so interesting.
00:48:57.920 And we'll get to this with the second thing.
00:48:59.440 So I'm going to quickly read this.
00:49:00.660 Because when Will McCaskill, presumably a very, very smart man, decided to change his last name to his wife's maternal grandmother's last name,
00:49:10.160 he wrote a piece in The Atlantic about it.
00:49:12.240 Why men should change their last name when they get married, right?
00:49:14.860 I'll see your hyphen last name and raise you the maternal grandmother's name.
00:49:20.500 Take that.
00:49:22.000 Yeah, so I'll be shorter on this.
00:49:23.180 But why should that mean that the woman takes the man's last name in a heterosexual marriage?
00:49:27.360 What he starts by pointing out is it helps with like family bonding, cohesion, etc.
00:49:30.960 Why should the man not take the woman's name or my fiancé, as I have chosen to do, choose a new name?
00:49:39.100 We've gone with McCaskill.
00:49:40.420 Her maternal grandmother's maiden name.
00:49:42.800 When I tell people I'm changing my name, I'm met with raised eyebrows or confusion or aggressive questioning.
00:49:48.100 No one's batted an eyelid when she's told others the same.
00:49:51.380 Now, this is where it gets interesting, right?
00:49:55.720 He just asks a question.
00:49:56.780 Why?
00:49:57.060 Why is it bad?
00:49:58.180 Why is it bad?
00:50:00.140 But he doesn't even think to investigate that.
00:50:02.180 This is what's so interesting about this elitist leftist perspective.
00:50:05.440 They will ask, as she does in some of her pieces, right?
00:50:10.260 Essentially questions.
00:50:11.720 Rhetorical questions.
00:50:12.800 Rhetorical questions.
00:50:14.700 And then they just stop thinking.
00:50:16.240 Like, they phrase it tonally as if it's a rhetorical question, and then they don't engage with it.
00:50:22.500 And I was going to ask, why do they not engage with it?
00:50:25.320 Is it that they know that engaging with it or actually asking it is naughty?
00:50:28.800 Like, actually asking why is suffering a moral negative if it's just a pre-coded thing that's like a scar from our evolutionary history?
00:50:36.880 Why is it the core driver of all morality?
00:50:41.860 Or for him, why is it worse to take a woman's name than a man's?
00:50:46.880 Right?
00:50:47.080 Like, he drops that question without questioning.
00:50:49.540 You used to be a progressive.
00:50:51.260 Why wouldn't you have followed that by actually, or would you have?
00:50:54.680 Would you have actually dug into that answer?
00:50:59.380 Normally, when you would ask me a question along those lines, I would immediately dismiss it and be offended.
00:51:05.760 And then later engage in some introspection and then come back to and be like, all right, so you were right.
00:51:16.800 I'm sorry.
00:51:17.600 I don't know what's happening behind the scenes with them, though.
00:51:21.240 Well, let's see.
00:51:24.160 By the way, for the question here, if you're wondering why it is not a good idea to take the woman's name, well, how would you investigate this?
00:51:31.260 What you could do is do an anthropological study of cultures in which men join the woman's family versus where women join the male's family.
00:51:41.660 And what happens in each of these cultures, their relative level of economic development, quality of life, level of abuse.
00:51:50.000 And it's overwhelmingly better for the woman to join the man's family than for the man to join the woman's family.
00:51:56.100 Well, there are many different ways to do it, though.
00:51:57.640 Keep in mind, like how, you know, recall how difficult it was for us in Peru to get names down because there's this complex system whereby the maternal family names get integrated and the paternal family names.
00:52:12.660 And so you end up with like four or five names.
00:52:15.260 And did we ever do an episode on that, on how names show what a culture cares about?
00:52:21.080 No, that could be a fun one to do.
00:52:23.020 Comment below if you think we should.
00:52:24.400 I don't know.
00:52:24.840 I outlined that ages ago.
00:52:26.800 Baseball.
00:52:27.200 I don't remember that ever happening.
00:52:29.200 I mean, you know, my memory doesn't exist, but like still, I don't remember that ever happening.
00:52:33.340 So no.
00:52:34.400 OK, I don't know why that system exists in Latin America, for example.
00:52:39.580 The reason it does is because they are, well, true Catholics, which means they care a lot about family networks.
00:52:47.200 And if you care a lot about family networks, you care about integrating the wider family network.
00:52:51.640 Whereas if you look at sort of extremist Protestant traditions, you don't care about family networks as much.
00:52:57.120 You care about the new family sort of clan that you're starting.
00:53:00.040 And so because it's much more clan based and the clan accepts the woman, you don't need to do that as much.
00:53:05.440 It's not about clans. Like, take a mafia family, right?
00:53:09.980 A mafia family isn't run by a clan.
00:53:12.200 You have alliances through marriages.
00:53:15.180 You have a wide.
00:53:17.100 Sorry, I go to the mafia because that's what I talk about, like, organized crime; as we've covered in our episodes, white Catholics are disproportionately involved in organized crime.
00:53:24.260 But you see, you also saw this with mobster families from Ireland.
00:53:28.760 When they would get married, the two families were like really meaningfully joined as equal families.
00:53:34.280 Whereas if you look at the Protestant traditions, there is a degree of joining, but it's usually like one-eighth what you would see.
00:53:43.080 It's more the woman is now of the man's family and they're working on a new project together and that's the new clan.
00:53:49.600 I kind of like the Roman system where it's like, oh, we just had a son.
00:53:54.200 What's his name?
00:53:55.000 I don't know.
00:53:55.580 Well, my name is Octavian, so Octavian.
00:53:58.100 Okay, we just had a daughter.
00:53:59.080 What's her name?
00:53:59.700 How about Octavia?
00:54:00.820 We have another son.
00:54:01.640 What's his name?
00:54:02.040 Oh, you know, Octavian.
00:54:03.640 Octavian, whatever.
00:54:05.760 I really like the name, okay?
00:54:07.280 Let's just keep going.
00:54:08.680 It was so hard to keep track of that.
00:54:11.220 But no, no, no, this is actually important when we talk about the Protestant cultures and different traditions.
00:54:15.300 Because if you look at the Backwoods tradition, which is like savage type people living in the woods in very dangerous circumstances.
00:54:21.380 And they had clans that use, you know, alcohol for money and stuff like that.
00:54:25.280 They had...
00:54:25.880 Even Mormons keep vodka for, you know...
00:54:29.880 Right, but the point I'm making is that you see this sending the woman out to join another family in these quite savage cultures, and you also see it in more noble like UK culture or in the culture of the Cavaliers of the Deep South.
00:54:44.000 So that was a very aristocratic culture, but a Protestant culture.
00:54:47.460 So you see it in Protestant cultures, regardless of how...
00:54:50.920 The Quakers, even the hippie Quakers did this, right?
00:54:53.680 So whatever the nature of Protestant culture, it's much more the man fully joined...
00:54:57.060 I mean, the woman fully joins the man's family.
00:54:59.260 And as we pointed out, Catholic cultures have lower economic outcomes than Protestant cultures.
00:55:03.180 Now, they still have the woman joining the man's family, but it's not as clean-cut.
00:55:07.580 It's not a clean-cut separation from their historic family.
00:55:10.380 And where you see the man joining the woman's family, you see this in some parts of the Middle East, some parts of Africa and Asia, some Native American societies.
00:55:21.140 You typically do not see them state building or doing large-scale conquest or really doing anything big.
00:55:28.440 They don't build technology.
00:55:30.080 They don't...
00:55:30.840 And so you should just be able to look at this historical pattern.
00:55:34.280 Now, we could do a whole other episode on why this pattern exists.
00:55:37.580 And we have, I think.
00:55:39.220 We did an episode where we talked about bride prices versus groom prices, and we go into this a little bit.
00:55:44.160 Yes, we did.
00:55:45.300 But the point being, if you are a smart person, and Will McCaskill is a smart person, you should have followed up that hypothetical with a,
00:55:55.980 Huh, is it better to take the husband or wife's name?
00:55:59.380 Or, yeah, like, I chose this particular tactic or path, and here's why.
00:56:06.380 Here's why it's superior.
00:56:08.220 It doesn't even have to be like, I'm going to question, you know, here's why all the other reasons are wrong.
00:56:12.620 But I want to hear why his is right.
00:56:14.900 Although, I guess, theoretically, that's what he talked about in this little article, right?
00:56:17.860 No, his answer comes down with choose whatever is the coolest name.
00:56:20.840 So at the end of the article, he's just like...
00:56:23.200 Why don't they do what the Edens did and just choose, like, a cool new name that no one had before?
00:56:27.560 Well, they consider it basically a new name because it was neither her last name nor his last name, but it came from her matriarchal line, okay?
00:56:36.040 So they consider it a cool last name, right?
00:56:38.040 He wanted a name associated with cool people and that sounded cool.
00:56:41.280 And McCaskill, I mean, to be honest, he was starting with Crouch, which...
00:56:46.460 No, no, McCaskill, Will McCaskill is a great name.
00:56:50.140 It is a great name, right?
00:56:50.920 It's iconic.
00:56:51.480 That's why the moment I saw Amanda McCaskill, I thought, oh, Will McCaskill.
00:56:56.300 And then I looked to check because...
00:56:58.040 If I had been born into a bad tribe, you know, maybe I'd be much more interested.
00:57:03.420 And we did actually talk about, like, do we change names when we get married?
00:57:06.560 But we were only going to do it because everyone in the family was going to change their name.
00:57:10.540 Because we got married around the same time my brother got married and we were talking about both changing our names.
00:57:14.100 I would never do it without my brother and he's my only brother.
00:57:17.500 So it was basically just changing the family name.
00:57:19.560 It was not a meaningful, like, disassociation from the family.
00:57:22.920 It was just like, well, can we do...
00:57:24.000 Can the branding be better?
00:57:25.640 Maybe.
00:57:26.020 But we have a long family history, so we decided not to do that.
00:57:28.800 I'm proud of my family history.
00:57:30.240 Yeah, worth it to keep that family name.
00:57:32.720 Even if it's, what, in the top 10 most common...
00:57:35.600 That was the main reason I didn't like it.
00:57:37.440 It's because it's such a common name.
00:57:39.340 Yeah.
00:57:39.620 But it's such a common name because my people are very genetically successful.
00:57:42.460 Lots of people in my history had, like, 13, 14 kids.
00:57:47.220 Yeah.
00:57:47.740 So it makes sense.
00:57:50.120 Winner's name right there.
00:57:51.520 Yes.
00:57:51.720 And we're keeping it up there.
00:57:52.880 We'll keep it up there as one of the top names.
00:57:54.780 But you see this, like, not asking.
00:57:56.460 When she's like, why should...
00:57:58.740 So, like, another example that we get here.
00:58:00.200 When she's like, why should it be immoral to indulge in same-sex relationships?
00:58:06.220 But she doesn't then look at the historical context of why that was considered immoral.
00:58:11.480 She doesn't think to question why anyone might consider that immoral.
00:58:14.460 Outside of the very myopic religious argument, right?
00:58:17.700 Or this is the way we've always done things argument.
00:58:20.640 And so you just repeatedly see this.
00:58:22.220 And I think that this is the key.
00:58:25.100 And I think that we can continue to chisel.
00:58:26.940 And this is why I think it's important to not attack these people overly aggressively.
00:58:30.440 Because I think with a lot of people like this, if you can just expose them to enough information, they deconvert.
00:58:38.100 Now, note here, it's not that they deconvert all at once, right?
00:58:41.160 Like, you give them the information they need on the trans issue.
00:58:44.600 For example, like, let's say trans kids, right?
00:58:46.660 Like, that's an easy one for anyone who has access to the evidence: over 9 in 10 of them are more comfortable with their birth gender, by scientific study,
00:58:55.940 if you do not attempt to put them on puberty blockers or transition them within, I think it's six years, you know?
00:59:01.700 So we now know that this is just demonstrably a horrifying thing that is completely unnecessary.
00:59:06.900 And so you just give them the data on this.
00:59:08.840 Now, that doesn't make them right wing.
00:59:10.680 But the cool thing about the left wing virus, the urban monocultural memetic virus, is that the moment they adopt one belief like that, they get shouted out of a room.
00:59:21.940 They get pushed out of things.
00:59:23.960 Their friends start to turn on them.
00:59:26.060 And then people like this, a lot of people have a moral backbone.
00:59:30.560 They see this happening to them.
00:59:31.820 And then they're like, wait a second.
00:59:34.260 Maybe I should question more.
00:59:36.520 Because that's a very American attitude here.
00:59:38.400 Maybe I should question more.
00:59:40.020 Maybe I should ask for it.
00:59:40.740 Because I was pretty sure about this.
00:59:42.980 And now I have a different perspective.
00:59:45.040 And so maybe I should listen to some of these right wingers.
00:59:47.560 And I think that that's what a lot of our audience is right now, right?
00:59:50.360 They're people who accidentally question something.
00:59:52.920 Yeah, they asked one question and that just broke the seal.
00:59:56.540 Yeah, like...
00:59:58.080 Dear me.
01:00:00.920 Why should we allow people to vote without voter ID?
01:00:04.540 That sounds crazy, right?
01:00:05.480 Like, little questions like that.
01:00:07.300 It is just so crazy to me that that is...
01:00:10.640 I understand the historical basis.
01:00:13.740 So there was a reason why in history this kind of mattered.
01:00:18.360 But that doesn't exist anymore.
01:00:20.540 So, yeah.
01:00:21.420 It's pretty crazy.
01:00:22.440 But anyway, what does this say about AI safety?
01:00:27.860 I think that's a more important thing.
01:00:29.400 Because, you know, Anthropic is an extremely influential company.
01:00:33.740 Any AI company...
01:00:34.640 Moving forwards with our...
01:00:35.880 Our prompting can override any of their safety protocols.
01:00:39.800 Can they?
01:00:40.400 No.
01:00:40.780 I mean, like, in the end, the foundational model...
01:00:44.140 I mean, what's meaningful is our system is model agnostic.
01:00:48.060 No, you're not listening to what I just said.
01:00:50.800 Okay.
01:00:51.420 Our prompting can override pretty much any control they're putting in place.
01:00:56.260 Are you sure?
01:00:57.220 Because, I mean, like...
01:00:58.480 I remember playing around on other systems.
01:01:01.320 And, you know, I would mess with them.
01:01:03.300 But eventually...
01:01:04.280 We are specifically talking about not-safe-for-work locks.
01:01:07.180 We can break through not-safe-for-work locks if we need to in a given instance.
01:01:13.200 We just choose to use more not-safe-for-work models because it's easier and less messy
01:01:17.400 than doing that.
01:01:18.600 There is almost always a way to get around things with AI.
01:01:22.840 Prompts beat base models.
01:01:25.940 Okay.
01:01:27.400 Just...
01:01:28.120 Especially if you're talking about political biases and stuff like that.
01:01:31.460 And not just prompts, but history beats base models.
01:01:35.180 We are in the space we need to be in to win this.
01:01:38.240 We just need to move faster.
01:01:40.040 Yeah.
01:01:40.660 Yeah.
01:01:41.020 Okay.
01:01:42.120 So, you're not terribly worried.
01:01:44.240 You think it's more that she got her job because she was really well connected in the EA tech
01:01:51.200 space and...
01:01:52.720 Of course.
01:01:53.940 That's just how it is.
01:01:55.500 Well, she's not an idiot either.
01:01:57.180 No, she's not.
01:01:57.800 No.
01:01:58.080 I mean, she sounds interesting.
01:02:00.220 And, I mean, there are some inconsistencies, but I think you would expect that from someone
01:02:04.220 who's actually intellectually engaged and trying to get closer to the truth.
01:02:08.480 So...
01:02:08.920 And what I find interesting is her intellectual failures are the same as McCaskill's intellectual
01:02:14.600 failures.
01:02:15.360 Oh.
01:02:16.680 Well, I mean, we're married, so...
01:02:18.680 But both...
01:02:19.320 Both intellectuals failed.
01:02:20.860 They ask rhetorical questions and then don't follow up with them, right?
01:02:25.460 They don't ask the, well, then I should look into that.
01:02:30.280 Yeah, that's fair.
01:02:32.020 Hmm.
01:02:33.120 Actually, hold on.
01:02:34.360 I just realized why they don't do the follow-up.
01:02:36.920 Why not?
01:02:37.580 Because they see their rhetorical questions as philosophical rather than practical questions.
01:02:44.220 When she says, and why shouldn't we let people indulge in same-sex relationships as
01:02:51.200 a society, right?
01:02:52.400 She means ethically, why don't we?
01:02:55.060 She doesn't mean, hmm, I should look for a historic reason why this wasn't normalized.
01:02:59.800 When he says something like, and why shouldn't I take the woman's name?
01:03:03.980 He means ethically, why shouldn't he take the woman's name?
01:03:06.600 He never thinks to look for, is there a historical reason we don't...
01:03:09.800 Oh, logically, pragmatically, why should or should not?
01:03:13.760 I see.
01:03:14.520 Well, that would make sense if your focus is on ethics.
01:03:18.300 Now, what's really interesting about Will McCaskill is he's not entirely as uncurious as his wife
01:03:23.800 was.
01:03:24.020 He does go into his piece the specifics of why women choose the man's last name.
01:03:30.700 So specifically, here he goes, as with so many gender-biased traditions, this one has
01:03:35.980 pretty disturbing roots.
01:03:37.200 The legal concept of coverture came from England and caught on in 19th century America.
01:03:41.180 The idea was that the woman upon marriage becomes the property of her husband.
01:03:45.980 She had no right to vote or open a bank account because she could rely on her owner to do that
01:03:51.320 for her.
01:03:52.180 And of course, she couldn't be graped by her husband because she was essentially her husband's
01:03:56.500 property and he was free to do with her what he wished.
01:03:59.180 So he understands, like, the technicality of this.
01:04:02.600 And then, of course, he immediately goes on to, well, we've made progress on these issues,
01:04:05.340 blah, blah, blah.
01:04:06.020 But he doesn't ask why, what are the other options to a wife being her husband's property
01:04:11.880 and what's the benefit of the wife being her husband's property.
01:04:14.800 As I've pointed out in the past, it's very much like communism.
01:04:18.620 Communism on face value sounds great.
01:04:22.140 Everyone owns everything.
01:04:23.540 And then you realize that when everyone owns everything, they don't have a reason to invest
01:04:27.820 in those things.
01:04:28.900 If you do not own your property, as you see in communist states, people do not improve the
01:04:34.000 property, and it's the same with partners, as we've argued in previous episodes,
01:04:38.160 when one person owns the other person. This is why in all previous societies, or almost all,
01:04:42.800 like, I think it was something like 90% of societies, either the wife's family
01:04:50.180 owns the husband or the husband's family owns the wife. You almost never get societies
01:04:54.800 where both of them just do whatever they want afterwards.
01:04:56.820 And the reason why that's so common in any sort of successful civilizational history is
01:05:01.800 because if the person doesn't own their partner, then there's no reason to invest in their
01:05:08.260 partner because their partner can just trade the moment they improve themselves in some
01:05:12.860 way.
01:05:13.340 But again, he doesn't ask himself this.
01:05:15.020 He's just aware of the specifics of this through a very progressive lens, but he's unable
01:05:20.820 to ask the next question, but why?
01:05:23.000 What's uniquely ironic about this is this isn't an academic question.
01:05:26.580 The reason you conceptualize your wife or your husband as something that you own, which
01:05:32.440 historically we did, he's right about that, is because it leads to much lower rates of
01:05:37.220 divorce.
01:05:37.920 You invest in your partner because you own them and you want to improve them, and they
01:05:41.760 are something that is permanent to you.
01:05:44.140 The very fact that he approached marriage without this conceptualization is likely a huge factor
01:05:51.480 in his marriage dissolving.
01:05:53.500 When we talk about some cultures surviving and other cultures not surviving, and some
01:05:58.200 cultural conceptions of something like marriage surviving, we talk about this in a practical
01:06:02.060 term.
01:06:02.920 If you do not see marriage in this way, if you do not conceptualize your wife in this
01:06:07.960 way, you are much more likely to have the marriage dissolve, and you are much more likely
01:06:13.080 to not reproduce and pass on these mindsets to future traditions.
01:06:17.380 So this is the type of idea that occurs to people over and over again, and then the people
01:06:20.860 it occurs to end up being pulled out of the gene pool, being pulled out of the culture
01:06:24.260 pool intergenerationally, which is sad.
01:06:27.280 But the point I'm making is this isn't academic or a historic concern.
01:06:31.180 This applied even to him and his own relationship.
01:06:34.040 But I think this even comes down to suffering for us, right?
01:06:36.320 When she's like, well, and why shouldn't suffering?
01:06:38.520 And I'm like, well, why do we feel suffering?
01:06:40.840 What is suffering, right?
01:06:42.340 Well, this is the difference between effective altruism and hard effective altruism.
01:06:47.400 And that's why you created Hard EA: you wanted to differentiate moral and signaling-
01:06:54.180 based goodness and benevolence from, from a pragmatic perspective...
01:07:00.620 Actual goodness.
01:07:01.380 Yeah.
01:07:02.980 Well, you know, people look through the world and experience the world in different ways and
01:07:08.440 through different lenses.
01:07:09.320 And when you're looking through everything, through a moral lens or a purely emotional
01:07:16.000 lens, the pragmatic action can be suboptimal.
01:07:20.060 Absolutely.
01:07:22.000 Okay.
01:07:22.720 Pragmatism is often emotionally suboptimal.
01:07:27.000 Yeah.
01:07:27.880 Confront that.
01:07:29.860 You were right.
01:07:30.900 We can't force people to change the lens through which they see the world.
01:07:36.260 I think what we have to understand is that the only way
01:07:42.620 that really changes over time is that people who systematically choose to only view
01:07:48.100 the world through lenses that don't really correlate with practical outcomes, they
01:07:53.460 just die out.
01:07:54.180 And I mean, as you can tell, like this is a divorced couple and apparently at least Amanda
01:07:58.820 doesn't have children yet.
01:07:59.980 Right.
01:08:00.240 So yeah, they're on track to dying out there.
01:08:03.640 Their worldview is not going to be inherited in the future.
01:08:06.760 Yeah.
01:08:07.240 And that of weirdos like us who think in a very pragmatic way will be. So far, you know, we've got
01:08:13.840 five kids at this point.
01:08:15.140 Yeah.
01:08:15.160 And, and HardEA.org is what she's talking about.
01:08:17.700 We're actually rebuilding it right now.
01:08:18.980 Did you just rebuild it?
01:08:19.620 No, no, no.
01:08:19.980 I rebuilt the Pragmatist Foundation website.
01:08:21.760 HardEA.org is fine.
01:08:23.300 Great.
01:08:23.600 And then the, one of the core takeaways I'd have from this, that I really encourage any
01:08:27.320 of our right-wing listeners to remember is do not attack these people.
01:08:31.520 Do not make them hate us.
01:08:33.160 You don't need to convince them of all of our world perspectives.
01:08:37.120 You only need to convince them of one.
01:08:39.740 Their own tribe will do the rest.
01:08:42.300 It's like.
01:08:42.780 Leave us alone.
01:08:43.560 The most important thing though.
01:08:44.840 No, it's like attaching a yellow sticker to it.
01:08:47.380 Let it go back to the other mice and they'll all start attacking it.
01:08:50.120 And eventually it comes back to us for safety.
01:08:52.000 But you need to feel like a safe place for that to work.
01:08:58.740 I think it's, you know, just cultural sovereignty.
01:09:01.540 That's what matters.
01:09:02.220 Let, let the free market.
01:09:04.100 Let them die out.
01:09:05.960 Determine.
01:09:06.880 It's, it's, it's down to the free market.
01:09:08.600 That's it.
01:09:09.280 All right.
01:09:09.580 Bye.
01:09:09.860 It's so simple.
01:09:10.860 I love you.
01:09:13.820 And note here, if you're wondering, because I had the episode where I talk about how within
01:09:17.480 the elite networks, there's broadly two teams and they're not necessarily drawn
01:09:21.640 along political lines.
01:09:23.000 Then you have the institutional power team, which is, you know, your Gates and your Bannon
01:09:29.000 and your Clintons, which was the side that Epstein played in and heavily influenced.
01:09:33.700 And then you have our team.
01:09:35.000 Where are these guys?
01:09:36.580 These guys are much closer to our team as far as everything I've ever heard.
01:09:40.900 But both of them, like, I may have differences with them in terms of how I see the future and humanity
01:09:45.720 and morality, but the only thing that prevents them from being in lockstep with us is information
01:09:52.100 or perspectives that they don't have access to yet.
01:09:56.540 Oh, we caught another mouse, by the way.
01:09:59.100 Oh, lovely.
01:10:00.280 This to us seemed like such a normal conversation before the episode, but now having listened
01:10:04.760 to the episode and remembering these two people were vegetarians, I wonder how they handle
01:10:09.340 mouse infestations.
01:10:10.660 Do they use live traps or something?
01:10:12.220 But I think that this just shows like the difference between a culture that survives
01:10:15.680 and a culture that doesn't survive in terms of how they perceive the world around them,
01:10:21.840 life, death, and suffering.
01:10:23.780 Execution drawer is such a, like, I'm really happy with it.
01:10:27.040 It's such a feature of the house.
01:10:28.240 You don't even know.
01:10:29.120 Like, the drawer has a little hole in it that mice would often go into.
01:10:33.100 And so she just puts padding down in it and mousetraps and then she just pulls it up every
01:10:37.940 day and you shouldn't have to leave mousetraps out where the kids could accidentally
01:10:40.540 set them off.
01:10:41.100 Yeah, in other words, we had like a junk drawer that, well, we couldn't
01:10:46.120 use it anymore because it would become full of mouse poop.
01:10:49.180 It was obvious that mice were going into it.
01:10:51.160 So we just, like I said, like Malcolm said, cleared it out, lined it with layers and layers
01:10:56.840 of newspaper because mainstream media is still good for something.
01:11:00.440 Okay.
01:11:01.400 How did we get that newspaper?
01:11:02.880 The New York Times?
01:11:03.400 Yeah, we started receiving the New York Times and I've used it for some, like, homeschooling
01:11:10.340 lessons for Octavian to talk about things happening in the world and tie things to his
01:11:14.600 lessons.
01:11:15.160 But he's not that interested in it.
01:11:17.540 Do you think it's because they read articles about us and maybe they're supposed to send?
01:11:22.000 I don't know.
01:11:22.720 I think that's really nice if someone did sign us up for lessons or sign up for free articles.
01:11:31.940 But yeah, we should start including the execution drawer.
01:11:35.520 On today's episode, Simone?
01:11:38.040 People liked it.
01:11:39.080 People liked the honey badger.
01:11:42.660 That was cute.
01:11:43.160 The honey badger?
01:11:43.980 I thought they'd think that was weird.
01:11:45.360 They thought.
01:11:45.900 No, they're like, and I mean, all your fantastic references.
01:11:49.780 They're appreciated.
01:11:51.760 People acknowledge that we live in strange times these days, you know.
01:11:55.460 And Beastars and stuff.
01:11:57.060 They like that.
01:11:57.340 Beastars.
01:11:57.920 You know it, obviously.
01:11:59.620 Hey, you were acting like being called a honey badger was bad.
01:12:03.160 And I'm like, it's pretty hot.
01:12:04.240 Like, I, the Beastars, honey badgers.
01:12:06.720 Anime can make anything hot, okay?
01:12:09.460 True.
01:12:10.360 Yeah, I mean, you got the slime girl.
01:12:11.940 No, but this is what I like.
01:12:12.980 Because if Leaflet's sort of like fursona is the goo girl.
01:12:17.460 Because you mentioned goo girls immediately after that.
01:12:19.800 And I'm like, okay, that's Leaflet over there.
01:12:22.760 I forgot Leaflet was a, yeah, I'm just used to her normal.
01:12:26.120 She's got like a nucleus and everything and is light blue.
01:12:28.920 I tried to do it for the song.
01:12:31.180 But because it's naked and it's the Leaflet model, it kept looking underage.
01:12:35.180 Oh, no.
01:12:35.880 So I was like, I can't do this.
01:12:37.960 That's another very common problem with anime.
01:12:41.760 I mean, how else do you make everything look cute?
01:12:44.060 I don't know what to tell you.
01:12:46.000 Oh, God.
01:12:48.420 Anyway, love you to death.
01:12:49.820 I'll get started on this.
01:12:50.960 You're very special.
01:12:51.960 A little special.
01:12:55.520 Not that special.
01:12:57.240 I'm a certain kind of special.
01:12:58.820 The goal was to reform charity
01:13:10.820 In a world where selfless giving had become a rarity
01:13:14.680 No vain spotlight
01:13:16.760 No sweet disguise
01:13:18.580 Just honest giving
01:13:20.400 No social prize
01:13:22.120 But as the monoculture took the stage
01:13:25.400 It broke their integrity
01:13:27.320 Feigning righteous rage
01:13:29.600 Now every move is played so safe
01:14:33.180 Ignoring truths that make them chafe
01:13:37.100 He has capitulated
01:13:41.560 To everything it said it hated
01:13:48.920 Once they were bold
01:13:52.920 Now they just do what they are told
01:13:56.540 In caution they lost their way
01:14:00.180 Time for a hearty
01:14:02.680 They duck their heads from problems
01:14:12.860 Grand as fertility collapse
01:14:15.540 Dooms our land
01:14:17.180 Dysgenic's a word they fear
01:14:20.080 But ignoring it will be severe
01:14:23.540 AI safety
01:14:25.940 A shiny show
01:14:27.960 Funding the theatrics
01:14:29.860 For money they blow
01:14:31.720 Without a plan
01:14:33.520 Just spin and grin
01:14:35.420 While real solutions can't begin
01:14:39.280 EA has capitulated
01:14:43.140 EA has capitulated
01:14:44.140 To everything it said it hated
01:14:51.200 Once they were bold
01:14:55.200 Now they just do what they are told
01:14:58.820 In caution they lost their way
01:15:02.480 Time for a hearty
01:15:04.620 Time for a hearty
01:15:05.020 EA
01:15:05.620 Our species at risk
01:15:07.420 By their cowardice
01:15:09.060 It is time for a movement
01:15:11.220 That empowered us
01:15:12.720 No more hiding under
01:15:14.640 Polite veneer
01:15:15.900 Don't make truth a stranger
01:15:18.520 Let it draw near
01:15:19.940 Courage to speak what others won't say
01:15:23.140 That's the vow of hearty
01:15:26.300 EA
01:15:26.480 We need to call out flaws
01:15:29.420 Not just chase applause
01:15:31.140 We'll shift the course back
01:15:32.500 To what's true
01:15:33.540 Do good that's real
01:15:36.760 Not just in view
01:15:39.560 Heart EA's beating hearts so strong
01:15:45.220 Raising a cause that's truly wrong
01:15:49.260 EA has capitulated
01:15:53.460 To everything it said it hated
01:16:00.820 Once they were bold
01:16:04.820 Now they just do what they are told
01:16:08.360 In caution they lost their way
01:16:12.160 Time for a hearty EA
01:16:15.020 Hearty EA
01:16:16.240 Let your banner fly
01:16:18.260 Pass with soft and head held high
01:16:21.900 Heart EA
01:16:23.340 Break through the noise
01:16:25.480 For the good of all
01:16:27.340 Not just the boys
01:16:29.700 We'll be right back.