Based Camp - December 16, 2025


Cyberfeminism, Xenofeminism, & The Cyborg Manifesto


Episode Stats

Length

1 hour and 8 minutes

Words per Minute

176.8

Word Count

12,100

Sentence Count

827

Misogynist Sentences

78

Hate Speech Sentences

38


Summary

In this episode, we're talking about cyberfeminism, black cyberfeminism, and xenofeminism, which are terms I had never heard before. They all apparently stem from a foundational text, one so foundational that people have forgotten to mention it in academic contexts, and so everyone thinks that everyone knows it, but they don't know it.


Transcript

00:00:00.000 Hello, Malcolm. I'm excited to be with you today because I just discovered cyberfeminism and Black
00:00:07.980 cyberfeminism and xenofeminism. And these are terms I had never heard before. And they all
00:00:15.780 apparently stem from this foundational text, which is so foundational that people have forgotten to
00:00:23.360 mention it in academic contexts. And so everyone thinks that everyone knows it, but they don't know
00:00:28.260 it. And so now there's just, it's like almost becoming this like forgotten Rosetta Stone of
00:00:33.460 like modern feminist ideology, because it is lost and people have just forgotten to mention it and
00:00:40.520 explain what it is. And this is why you read this? Well, it all started with a post on X by Grimes,
00:00:49.300 as so many things do, right? In this post, she refers to a cyborg manifesto, this foundational
00:00:57.700 text as, quote, one of the greatest things ever written. She added: what's crazy about Cyborg
00:01:04.160 Manifesto is, even if you pretend it has nothing to do with feminism, it is still a masterpiece of
00:01:08.740 general philosophy and is filled with banger poetry. And so I checked it out, because she's also the
00:01:13.720 person who turned me on to Iain Banks' Culture series, and it changed the way I view AI and the future of
00:01:19.260 humanity. So like, I don't know, she has a good track record of introducing like really good ideas.
00:01:25.400 And I actually, this is really funny, because she has like a lot of like Marxist and socialist and
00:01:30.120 commie followers who are like, hardline leftists who just really like, obviously her music and her
00:01:36.340 musical style, but then also her philosophy in general is like really inspiring. But then they
00:01:40.220 get really mad because she had a kid with the bad man. And they
00:01:46.220 really struggle with it, because you just can't stay away from her, even if you want to hate
00:01:51.080 her, because she has so many great ideas, although we never did hate her. And we love her, because she's
00:01:54.580 amazing. Friend of the show. Anyway, though, it turns out that A Cyborg Manifesto, which was
00:02:00.140 originally published in 1985, like I said, is so well known in academic circles, it's almost never
00:02:06.280 discussed, as it's assumed to be tacit knowledge. As one user, "a little internet," puts it on X in the thread that
00:02:13.880 Grimes posted in: it's seen as a kind of cliche reference in academic contexts or lectures, because it's
00:02:20.700 assumed everyone has already read it, which is probably why you don't encounter it. I think in
00:02:25.720 general, it's a shame with this when this happens to important works, because young people, etc,
00:02:30.600 might not know it. And I think that's the case because, I don't know, I took philosophy
00:02:35.320 classes, I took a lot of uni classes that were more on the humanities or
00:02:41.660 social studies end of the spectrum. And this is actually a foundational required reading in a
00:02:46.800 lot of academic contexts. Like it is actually considered to be one of the most
00:02:50.280 influential essays in feminist theory, and science and technology studies, and post humanities. And I
00:02:56.580 have a master's in technology policy, like I just, how have I never heard of this. And it's also just
00:03:03.540 one of the most cited essays in the humanities and social sciences worldwide. But have you heard of
00:03:09.280 this before?
00:03:09.300 I have never heard of this. No, but I have a feeling as like the urban monoculture emerged,
00:03:15.180 there have been a number of pivotal works to it. Like I've talked about the ones that Mamdani's dad
00:03:21.840 wrote, right?
00:03:22.640 Oh, yeah, yeah.
00:03:23.240 On like how the world can be divided into colonizers and victims.
00:03:27.520 Yes.
00:03:28.220 And the victims need to take over. And it was going through these works that helped me better understand the way they
00:03:33.240 actually see the world.
00:03:34.940 Yeah. And you're like, wait.
00:03:36.200 I was like, oh, now this makes sense. And this makes sense. Like now colonial narrative, like where did that come
00:03:41.780 from? And we should have thought like when people are talking about colonizers and decolonization, that there was some
00:03:46.640 kind of foundational text, but we didn't think to like go into it. Yeah. I think when you get into these
00:03:52.760 foundational texts, you can be like, oh, this is where all this. And I only hope that with the next social
00:03:58.140 movement that comes along, Based Camp is seen as one of the foundational texts. So there are essays and
00:04:05.060 stuff.
00:04:05.540 Oh, no, like the tracts, the Techno-Puritan tracts.
00:04:08.620 The tracts.
00:04:09.780 The tracts.
00:04:10.940 Anyway.
00:04:11.920 I have like three more written that we haven't done yet.
00:04:14.280 I know. Well, maybe we can, we can work on those over, over this, this sort of winter break that
00:04:19.200 we have with the kids. Yeah. But so back to the Cyborg Manifesto, this particular foundational
00:04:24.380 text, basically like the gist, if you don't want to listen to this, argues that the cyborg, a hybrid
00:04:30.860 of machine and organism is a powerful metaphor for breaking down rigid boundaries, like human versus
00:04:36.960 animal, or organism versus machine, or physical versus non-physical, male versus female, nature versus culture.
00:04:42.780 And it rejects essentialist identity politics and traditional socialist feminism in favor of affinity
00:04:49.660 politics, like coalitions based around shared interests rather than fixed identities. And it also embraces
00:04:56.560 irony and partiality and blasphemy against origin stories, both religious and secular, which is really
00:05:03.080 interesting. And then some key quotes that are endlessly repeated from this are, I would rather be a cyborg than a
00:05:09.460 goddess. And: the cyborg is a creature in a post-gender world; it has no truck with bisexuality, pre-oedipal
00:05:16.580 symbiosis, unalienated labor, or other seductions to organic wholeness. Also: we are all chimeras, theorized and
00:05:24.940 fabricated hybrids of machine and organism.
00:05:27.280 This sounds like it was actually written in, like, a fever dream. Honestly, I'm going to go into the origin stories of this, and everything's going to fall into place.
00:05:38.320 It sounds like one of our female listeners wrote this. Is this, like, not that dissimilar?
00:05:42.480 Because our female listeners are freaking awesome. I love this. But no, trust me, when I get into the origins,
00:05:48.400 this is, it will all be explained. But this, this manifesto gave birth to cyber feminism, which gave birth to xenofeminism and black cyber feminism. And when I heard about that in the thread where Grimes brought this up in the first place, I was like, okay, this is so intriguing. We have to dig in.
00:06:09.440 So we're going to start with the Cyborg Manifesto and then, you know, explain to you, so that you are an informed citizen of the world, what this manifesto, this foundational text is about, because people have forgotten to explain this Rosetta Stone of culture and philosophy. And also, cyber feminism, xenofeminism, and black feminism.
00:06:28.280 Xenofeminism and black cyber feminism.
00:06:29.720 Black cyber feminism, yeah.
00:06:31.020 So, hold on, but you liked xenofeminism, which I think our audience is going to have a problem with, because those xenos need to be burned in their name.
00:06:39.440 Guys, what if we're the bad guy?
00:06:56.640 No, I think, no, you'll see, xenofeminism is my feminism. I've decided.
00:07:01.720 No, hold on, actually, I'm going to, I'm going to make an argument. Actually, we should do a separate episode on this, because I think it's a great idea.
00:07:06.840 The concept of globalist nationalism, or what's a better way to say it? I call it Terran nationalism.
00:07:16.240 Terran nationalism.
00:07:17.480 So, Terran nationalism is what you see in something like Starship Troopers, where you have this idea of: we as humans are great, I love humanity, I love the species, and the other is aliens or machines or something like that, right?
00:07:37.360 And in our Fab, I have one of my favorites.
00:07:41.760 Oh, yeah, Terran Empire is one of the contexts.
00:07:44.040 Terran Empire is one of the presets, or the Sons of Man, where, you know, it's humanity and all the things that have come from our species, whether it's AI.
00:07:50.400 And for context, Reality Fabricator, the AI narrative engine and chatbot platform that Malcolm has built with Bruno, friend of the pod, has, if you don't want to create your own prompts for stories or characters, these amazing menus that Malcolm has created, where you can choose the settings and the tropes.
00:08:11.980 Like, do you want Isekai? Do you want vampires? Do you want, like, this kind of dynamic, like a power dynamic or whatever?
00:08:17.520 And like, yeah, one of the settings is Terran Empire.
00:08:20.380 And some other weird settings you have in there. Like, my favorite game to play is Terran Empire versus gay space communism.
00:08:29.600 Yeah, oh yeah, gay space communism. Yeah, yeah, yeah, yeah, yeah.
00:08:32.020 So, like, it's, it's, it's fun.
00:08:33.580 But anyway, yeah, because I think this idea of, like, Terran nationalism, anyway, I'll, I'll get to it, but continue.
00:08:38.380 Yeah, let me, let me cook.
00:08:39.920 Okay, okay. So, Cyborg Manifesto, because the people must be informed.
00:08:44.760 So, the full title of it is A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century, by Donna Haraway.
00:08:54.860 So, who is this woman? Who is Donna?
00:08:57.340 What is this fever dream of this poetic banger poetry masterpiece that Grimes loves and that, that so many people love?
00:09:05.100 All right, well, let's, let's start with it.
00:09:06.440 And now I think you're going to see why and how this all makes sense.
00:09:08.880 And if not, I will explain to you because a little bit of a cultural and locational context is needed.
00:09:13.300 So, Donna Haraway is a UC Santa Cruz professor, which, if you know Santa Cruz, explains everything instantly.
00:09:24.980 She was on drugs.
00:09:26.440 No, it's, it's more than that.
00:09:28.700 So, to me personally, and I think anyone who knows Santa Cruz will probably agree with me, Santa Cruz epitomizes a culture that is unmoored from history and origin stories, which is kind of a core point of her manifesto here, this thing about origin stories.
00:09:44.820 It's like, get rid of them.
00:09:46.260 Like, who even are we, anyway?
00:09:48.140 And, and here's the thing is, is in, in Santa Cruz, especially in California in general, you've got modern and historical transplants of people who repeatedly rejected and importantly forgot the cultures of their homeland.
00:10:05.100 So consider my family's history by way of California, like twice over, I mean, different, you know, branches, but like one branch moved from like Norway to Germany, to Ireland, to New York, to Chicago, to California, and then multiple locations in California.
00:10:24.160 Every time they're getting up and moving, they're kind of reinventing themselves and kind of intentionally unmooring themselves from their historical inherited cultures and roots and past.
00:10:33.800 They're like, they're like, they're not trying to bring it with them.
00:10:36.520 Like, I think that the Collins family is different.
00:10:38.820 Like your family seems to have really clung to a very strong sense.
00:10:42.120 Yeah, we lived in the same area for seven generations.
00:10:44.600 So my family, like, very intentionally laundered itself and its identity in California.
00:10:55.620 People need to understand, like, you know, we've talked about the role of these foundational cultures, like the Scots-Irish versus the Quakers versus the Cavaliers.
00:11:05.380 What we haven't really talked about is the way that certain behavioral bottlenecks have played in informing some types of Californian culture.
00:11:16.760 And Bay Area culture, and that is to say San Francisco, Silicon Valley, Bay Area culture, is uniquely one of people who have really intentionally reinvented themselves so many times that they've forgotten where they've come from.
00:11:31.200 And that makes them a very interesting kind of cultural blank slate that is exactly the kind of cultural blank slate that would produce this manifesto.
00:11:41.080 But what makes Santa Cruz unique, even within this ecosystem of people who have warped and completely cleaned away, laundered, their culture and identity multiple times?
00:11:54.620 It's that it's also crunchy, very crunchy.
00:11:59.040 We're talking like, you know, you're wearing socks under Birkenstocks while walking through the Redwood forests and then surfing in the morning, et cetera.
00:12:05.780 Right.
00:12:06.000 Like, it's the UC Santa Cruz where this woman taught and is now an emerita professor.
00:12:12.500 Teached.
00:12:13.240 It's just this beautiful university campus.
00:12:18.780 It's just embedded within a Redwood forest with like some partial views of the ocean.
00:12:22.700 Like it's very naturey and very crunchy.
00:12:25.880 And yeah.
00:12:26.140 Okay.
00:12:26.480 Yeah.
00:12:26.760 Like, yes, there are drugs, but not in the way that people typically think of drugs.
00:12:32.920 It's more just like part of life.
00:12:36.180 It's not a thing you do.
00:12:37.760 It's, it's the air you breathe.
00:12:39.780 I don't know how else to put it.
00:12:41.000 You know, it's just like, so of course your life is going to be colored by strange days.
00:12:45.760 And even if you literally don't do any drugs, which I never did as a kid, like I still had a very trippy childhood, if that makes sense.
00:12:52.400 Yeah.
00:12:52.660 So that, this is just some context.
00:12:54.800 Okay.
00:12:55.420 So she was born in 1944 in Denver, Colorado.
00:12:57.880 And then, you know, she herself laundered her identity further.
00:13:01.180 Denver, because Denver was already the kind of place that, you know, selected for people who are really running away from their past.
00:13:07.760 You know, it's like miners and other people who are very high risk and just like, forget everything that I came from.
00:13:12.240 I'm going to rebuild.
00:13:13.120 And then she came to California.
00:13:14.700 As of 2025, like I said, she, she's not an active professor anymore.
00:13:18.160 She's a Distinguished Professor Emerita of the History of Consciousness and Feminist Studies at the University of California, Santa Cruz, UCSC.
00:13:26.560 She looks very much like a Santa Cruz lady.
00:13:29.580 I spent a lot of time in Santa Cruz as a kid because my grandparents, aunt, and cousin were there, and all my holidays were spent there.
00:13:40.280 But what's interesting about her is she trained in biology.
00:13:42.920 She has a PhD in biology from Yale, and she also studied zoology and philosophy.
00:13:46.880 So she's kind of like you.
00:13:48.920 And I think this is, you can kind of see, I think some of the most interesting thinkers philosophically have backgrounds in biology, like you, like her, because they're able to think in a cross-disciplinary fashion.
00:14:00.060 That's really, that challenges a lot of the really trite norms.
00:14:04.340 And that is genuinely novel because when people have backgrounds in, in philosophy and then they contribute to philosophy, they're really just kind of copying and pasting with slightly different words.
00:14:14.120 You know, whereas someone with biology is going to come in and be like, well, I don't know, based on the way that DNA double helixes work, I think that... What? I did not have a background in biology.
00:14:26.680 I'm sorry.
00:14:27.520 But she was also deeply influenced by Marxist feminism, science fiction, Catholic symbolism,
00:14:34.400 because she grew up Irish Catholic, and post-structuralism.
00:14:38.220 So you just got like the perfect storm of a good manifesto in there.
00:14:42.360 And so we also need to think about the context of this piece, because I think a lot of people are like, well, this is amazing, like great poetry.
00:14:49.860 Like, but no, like we have to, it just, to quote one Kamala Harris or, you know, paraphrase, it didn't just fall out of a coconut tree.
00:14:58.740 You have to consider it within the context of the world in which it exists.
00:15:02.120 So the essay originated in response to a 1983 call from the Socialist Review, which is a West Coast leftist journal, asking feminists to reflect on the future of socialist feminism amid Ronald Reagan's presidency and the rise of the new right and the decline of traditional leftist movements in the U.S.
00:15:21.920 and escalating Cold War tensions, including the Strategic Defense Initiative or Star Wars program.
00:15:26.840 And Haraway aimed to revitalize socialist feminism by addressing what she called the informatics of domination: how new technologies of communication, control, and production, which is really what anyone in Silicon Valley, which is roughly where Santa Cruz is, was experiencing, were reshaping power, labor, gender, and identities in ways that older feminist frameworks just couldn't grasp.
00:15:51.900 Basically what happened, though, just to break it down for you: a Marxist publication wanted feminists to be butthurt about conservatives.
00:15:59.460 And instead she like, went off the reservation and it was glorious.
00:16:05.060 Like instead of Reagan wants me to become a housewife and spoon feed jelly beans to my husband.
00:16:09.880 She's like, let's become cyborgs and form special groups around our artistic special interests.
00:16:14.900 Our autistic special interests.
00:16:16.320 So I'm just like, yes, go lady.
00:16:20.360 She's invited to the picnic.
00:16:22.180 I know she is, she is 100%.
00:16:23.860 Like my kind of lady.
00:16:25.280 This is why we get along. So it was Grimes?
00:16:26.860 Yeah.
00:16:27.600 Cause she's into this stuff.
00:16:28.660 She gets it.
00:16:29.340 She gets it.
00:16:30.320 I'm like the problem.
00:16:31.300 And actually I feel, oh crap.
00:16:34.240 I'm realizing now, like you're going to see the whole arc I have for this episode.
00:16:38.800 And for the Cyborg Manifesto, the conflict that it undergoes is the same conflict Grimes has undergone.
00:16:48.580 And you're going to see why.
00:16:50.720 Okay.
00:16:51.360 Go for it.
00:16:51.660 No, I mean, I'm already seeing why it feels like Grimes wrote this to be honest.
00:16:56.020 No, but also like the way it played out and was interpreted and was appropriated is also like the same struggle and unfair treatment that Grimes has undergone.
00:17:05.760 Yeah.
00:17:06.540 So, yeah.
00:17:07.300 So just so people understand, because more than half our audience is outside the U.S., I just want to make clear the context of this.
00:17:15.540 It's the eighties when this was written.
00:17:17.360 It was written one year before Malcolm was born, two years before I was born.
00:17:22.220 This is the Silicon Valley Bay area at the very beginning of this period of biotech and personal computing, just having an insane upswing.
00:17:30.320 And these are the first people who are smelling it.
00:17:32.760 These are the canaries in the coal mine who were like, Oh God, the whole world is about to change.
00:17:38.760 And there's this crisis also at the same time on the left as the 1960s and seventies, social movements are beginning to fragment.
00:17:46.060 So for example, second wave feminism, which emerged in the 1960s and peaked in the seventies, was starting to falter.
00:17:53.980 It was often called the women's liberation movement as well, if that's the name you've heard more; it grew out of experiences in civil rights and anti-war activism.
00:18:02.120 And then by the eighties, it fractured due to what people call the feminist sex wars, which were debates over pornography and sexuality and power.
00:18:10.040 And there were also critiques of white middle-class dominance, excluding women of color and lesbians.
00:18:15.300 And there were splits between liberal and radical and socialist branches.
00:18:18.560 And in her piece, in her manifesto, Haraway explicitly critiqued how taxonomies of feminism policed official women's experience, leading to endless splitting.
00:18:29.700 Like she really, really, really hated identity politics.
00:18:32.880 Unfortunately, they've won, right?
00:18:34.120 But also, another movement was sort of falling apart.
00:18:37.340 And this is one thing that the socialist magazine that prompted this essay wanted to see addressed: the New Left, which was this broad student and youth driven movement encompassing the anti-Vietnam War protests and free speech campaigns.
00:18:51.400 Like at Berkeley, like my dad went to those protests.
00:18:54.380 No, my dad went to those too.
00:18:55.680 He was, he was well-noticed.
00:18:56.680 It's something they bonded over.
00:18:58.020 It's so sweet.
00:18:58.760 My dad would go in a suit to those protests.
00:19:02.360 And everybody thought he was like the guy in a suit, the very posh guy at the anti-war protests.
00:19:06.680 My dad's like vandalizing, et cetera.
00:19:09.120 And your dad's showing up.
00:19:09.800 But it's funny that he really wanted to go to the war in the beginning.
00:19:12.540 I know.
00:19:13.180 And that's so funny that you, oh my gosh, your dad's.
00:19:15.620 For people who don't know: my dad actually got out of the war by being too eager to go into the war, which is to say he applied really early in the war process, like before they needed a draft or anything like that.
00:19:28.460 So they were actually still incredibly selective with who they were taking.
00:19:31.580 And this was during, like, ROTC and stuff, but he had a medical issue tied to his legs.
00:19:38.800 Knees.
00:19:39.340 He said they were Rice Krispie knees or something.
00:19:41.240 They snapped, crackled, and popped.
00:19:42.520 Well, he, he got them broken playing lacrosse as a kid.
00:19:45.160 Oh, how freaking posh can you be?
00:19:47.000 How posh, right?
00:19:47.720 I broke my legs in lacrosse and I go to my protests.
00:19:51.420 That's where he goes.
00:19:52.420 He tried to go super early into the war while they're still being really picky.
00:19:56.040 And apparently he got, like, a permanent ban from the war.
00:19:59.440 Permabanned from war.
00:20:00.740 Poor, poor daddy.
00:20:03.200 Oh my God.
00:20:03.860 No, no, but like truly honorable.
00:20:05.860 I mean, I also appreciate that my dad was out there, you know, vandalizing stuff too.
00:20:10.740 Very toasty.
00:20:11.800 Now I know where he gets it.
00:20:13.500 Anyway, but then like basically after there were a lot of clampdowns on these protests and like there were the Kent State shootings.
00:20:20.740 And then the Vietnam War ended in 1975.
00:20:23.860 And the whole movement kind of fragmented into a bunch of sectarian groups and identity-based politics again.
00:20:30.760 And then of course, then there was the civil rights and black liberation movement in the 50s to the 70s, which also fell apart in the 80s because it sort of transitioned in the late 1960s into black power, like the Black Panthers.
00:20:43.860 And it emphasized racial separatism and pride.
00:20:46.240 And by the 80s, there were all these internal divisions over integration versus nationalism.
00:20:51.100 And external oppression contributed to more fragmentation.
00:20:57.940 And it just sort of became like messy.
00:21:01.480 So how is this piece framed by academics today?
00:21:04.720 Well, they frame it as a canonized but contested classic, which is telling, because it's required reading in women's and gender studies, science and technology studies, media studies, literary theory, philosophy, anthropology, and art theory.
00:21:16.760 It's often paired with Judith Butler's Gender Trouble, which I've also never read, but it's considered one of the twin pillars of 1990s anti-essentialist, post-structuralist feminist work.
00:21:28.040 And it has like a lot of sort of contradictory interpretations.
00:21:33.620 There are lots of post-humanist and trans-humanist readings, and they celebrate it as this early manifesto for leaving humanity behind.
00:21:40.640 And then there's some critical race and decolonial scholars who both use and criticize it for its relative silence on race and colonialism.
00:21:52.760 But here's where worlds collide, right?
00:21:54.840 You were talking about Mamdani's dad.
00:21:57.300 And here we have the, you know, decolonial scholars being like, well, I don't like that she doesn't talk about identity politics in her anti-identity politics manifesto.
00:22:05.640 But again, it like, like I sort of going back to the premise of this, it is framed as the origin text of cyber feminism and xenofeminism, which really rose in the 90s, sort of five years after this was originally published.
00:22:20.020 It was republished again in 1991, and then it sort of picked up from there.
00:22:23.440 And it has genuinely inspired shifts and course adjustments in various fields.
00:22:28.680 Like in feminist theory, it shifted feminism away from women's experience or biological essentialism toward constructivist coalition-based affinity politics.
00:22:38.400 Like we don't need a totality in order to work together.
00:22:41.360 For post-humanism and critical animal studies, it was one of the earliest and most poetic arguments that the boundary between human and non-human is politically constructed and historically contingent.
00:22:53.900 And then when it comes to queer and trans theory, this also really played a foundational role because it prefigured the sort of concept of being non-binary and fluid in terms of your understanding of identity.
00:23:05.940 Interesting.
00:23:06.460 And to be clear, Haraway was not herself writing from an LGBT perspective in 1985, but it would be obvious why people who were gender fluid would see what she's talking about and be like, oh, this can apply to me.
00:23:21.580 And I think this is where a lot of people may kind of misunderstand Grimes' stance vis-a-vis, like, trans and everything.
00:23:32.480 It's not about identity politics. I'm not going to put words in her mouth, but I just feel like she has this manifesto in her heart more than any particular, like, I like this group,
00:23:46.080 I like this group, let's all do identity.
00:23:47.980 But anyway, let's, let's talk about the actual text, which is fascinating.
00:23:51.200 I'm not going to go into it in full because it's long and poetic and a little convoluted, but the chapters provide a peek, right?
00:23:57.900 So here are the chapters of it: the first chapter, An Ironic Dream of a Common Language for Women in the Integrated Circuit; then Fractured Identities; then The Informatics of Domination; then The Homework Economy Outside the Home; then Women in the Integrated Circuit; and then Cyborgs: A Myth of Political Identity.
00:24:18.300 And then there's the bibliography. But I have to say, it starts, oh God, it starts with this illustration that, like... part of me was like, I just opened the URL.
00:24:32.880 Okay, yeah, I'll send you a link, and I will describe it for those who are listening, audio only, so don't worry.
00:24:39.720 My, my description is going to be highly accurate. So Malcolm, actually, I'm going to have you listen.
00:24:43.280 You can tell me how accurate I am, and then, and then the audience can either feel comforted or not, but I'm just going to, I'm going to give my best here.
00:24:50.040 So, oh God, it is a picture of a woman who's sitting in space with her back to a framed grid displaying galaxies, mathematical equations, and a 3D digital map. And on her head she is wearing a white... I'm going to send it, but you get to hear my description first, because you can tell the listeners how accurate I am.
00:25:10.820 On her head, she's wearing a white, glowing tiger cub, and its arms are draped over her shoulders with its arm bones glowing through its ectoplasmic flesh, and on her chest sits a circuit breaker with green lines emanating from it, and they sort of terminate in blue nodes.
00:25:28.760 And then her fingers are lying on what look like typewriter keys, maybe like super old school computer keys, set atop a diorama of an Egyptian desert, complete with pyramids in the foreground.
00:25:42.540 There's a blue mountain range in the background. And her face, bear with me here, because it's holiday time over here right now.
00:25:48.920 Her face strikingly resembles that of Neil from the movie The Santa Clause, like the new husband. Picture Neil.
00:25:56.280 Oh yes, I remember him, yeah, nerdy.
00:25:58.220 Yeah, but, but she has the coloring and long dark hair of perhaps an indigenous American woman.
00:26:03.320 Okay, so I'll send you the link now.
00:26:04.900 What the heck? Did you have an AI describe it?
00:26:06.940 Tell me how accurate I am, tell me how accurate I am.
00:26:09.620 You didn't describe that it looks like it was drawn by a child.
00:26:13.100 I didn't say that.
00:26:14.420 No, you didn't, but I think it looks like bad.
00:26:18.560 It looks like it was drawn on a computer in the 1980s with what you had available to you at the time, okay?
00:26:24.480 Okay, I'll buy that, yeah.
00:26:26.640 I thought she was like on a spaceship, but she's in a...
00:26:28.620 No, she's, she's floating in space.
00:26:30.600 There's stars behind the framed image.
00:26:34.040 And you see the glowing tiger on her head.
00:26:36.280 Yeah, but she does have the thing on her head with the glowing bones and gelatinous body.
00:26:44.460 Gelatinous body.
00:26:45.540 I think it's ectoplasmic, but gelatinous is, is, is probably more accurate.
00:26:50.020 This is a thing.
00:26:51.180 I do love the Native American look.
00:26:53.480 Yeah.
00:26:53.560 I know, but no, her face looks like Neil from The Santa Clause.
00:26:57.120 I'm holding to that.
00:26:58.100 I'm going to, I'm going to drown that.
00:26:58.980 I think her face looks like one of those amalgams of, like, many women's faces to me.
00:27:04.140 No, her, her, her eyes are Neil's eyes.
00:27:07.360 Just, you know, Google image it at some point.
00:27:09.660 At any rate, I'm sorry.
00:27:11.160 We're done with art critique corner.
00:27:13.160 I hope you enjoyed it.
00:27:14.500 The text argues that in the late 20th century, humans have become cyborgs.
00:27:19.380 And we totally are now.
00:27:20.560 I mean, of course, in 1985 they were just getting warmed up.
00:27:24.240 We're hybrids of organism and machine and our identities and bodies and politics are shaped
00:27:28.680 by information technologies and global capitalism.
00:27:30.980 And this could not be more true, especially now.
00:27:34.340 Do you want me to read the first paragraph here for people?
00:27:36.400 I find this fun.
00:27:36.960 Oh, you think it's, yeah, no.
00:27:38.220 I mean, I, yeah, I mean, we could spend the whole, like, we could spend a month going through
00:27:41.860 just this thing.
00:27:42.660 This essay is an effort to build an ironic political myth faithful to feminism, socialism,
00:27:48.080 and materialism.
00:27:49.680 Perhaps more faithful as blasphemy is faithful than as reverent worship and identification.
00:27:56.440 Blasphemy has always seemed to require taking things very seriously.
00:28:00.000 I know no better stance to adopt from within the secular-religious, evangelical traditions
00:28:05.480 of United States politics, including the politics of socialist feminism.
00:28:11.700 Go on.
00:28:13.620 Contemporary science fiction is full of cyborgs.
00:28:16.400 Creatures simultaneously animal and machine, who populate worlds ambiguously natural and
00:28:22.420 crafted.
00:28:22.760 Modern medicine is also full of cyborgs, of couplings between organism and machine, each
00:28:28.440 conceived as coded devices, in an intimacy and with a power that was not generated in the
00:28:34.880 history of sexuality.
00:28:36.720 The whole thing, the whole thing is like that.
00:28:38.300 What I find funny about this, and I really love that we've seen this within the conservative
00:28:42.760 movement, is you can see sort of our vision of futurism.
00:28:46.500 For a long time, the futurists were really owned by the progressive movement, right?
00:28:52.360 And they build these wide worlds and stuff like, you know, Star Trek, and they really
00:28:57.640 only take time to have conservative futures when they are making fun of us, like the Mirror
00:29:01.960 Universe in Star Trek, where the women are forced into skimpy outfits.
00:29:06.200 Or the-
00:29:06.560 Come on, like they aren't already.
00:29:08.080 Come on.
00:29:08.600 You should see our Star Trek episode.
00:29:09.940 If you haven't seen it: Star Trek is like a total dystopia, and Starship Troopers is actually
00:29:14.460 a really good universe to live in, but they would only engage with us in like mocking,
00:29:19.260 right?
00:29:19.520 Like in Starship Troopers, that's how it became like this conservative futurist icon.
00:29:23.160 But now you've got stuff like the Warhammer sort of wider community, which is clearly
00:29:26.920 like very conservative coded.
00:29:28.100 Like, we're going to be in space, but like a post-Christian, theocratic society on
00:29:35.020 big, you know, cathedral-like megaships.
00:29:38.820 Um, and, and we represent that inversion of their ideology, but in terms of like cyborg
00:29:45.700 futurism, which I love.
00:29:47.040 Let's get back to the piece though.
00:29:48.060 Let's get back to the piece.
00:29:48.780 So basically the cyborg, which is the concept
00:29:53.740 around which all this pivots, is proposed as this mythic political figure that rejects
00:29:59.020 fixed essences like woman or nature.
00:30:02.120 Again, this is a rejection of identity politics.
00:30:04.260 And instead it embraces partial and fractured and coalition-based identities to build new
00:30:09.160 forms of socialist feminist politics.
00:30:11.520 So Haraway claims that advanced capitalism has shifted from an organic industrial order
00:30:17.080 to an informatics of domination, where everything is understood as information, coding, and systems.
00:30:22.100 So in this world, boundaries between body and machine are totally blurred.
00:30:26.680 They're kind of, they don't matter anymore.
00:30:28.460 Control operates through communication and data flows and women are deeply integrated and exploited
00:30:32.660 within global circuits of production and reproduction.
00:30:34.860 But honestly, I think she only talks about the way in which women experience this
00:30:39.400 because like literally the prompt was like, as a feminist, write this.
00:30:44.080 Like, so her essay wouldn't have been accepted if she didn't do this through a feminist lens.
00:30:48.560 Yeah.
00:30:49.000 It's also interesting that she seems very interested in blaspheming as like a goal, right?
00:30:53.400 Like, yeah.
00:30:54.460 Like, it's sort of, like, irreverent.
00:30:56.380 That's very Bay Area, but also like, keep in mind, she was raised a Catholic and she's playing
00:31:00.100 with this, you know, yeah, no, she like, this was one of her big influences.
00:31:05.580 Also, big influence of Grimes, right?
00:31:08.400 Wasn't Grimes raised Catholic, I think?
00:31:10.320 Yeah, she was raised Catholic too.
00:31:11.340 So anyway, yeah, I just feel like all this rhymes with her, her sort of struggle and legacy.
00:31:15.640 So communications technologies and biotechnologies are framed and seen in this piece as tools
00:31:21.280 that both enforce domination, but also open new possibilities for resistance.
00:31:26.060 And I think it very much, like, we would endorse that, like, it's, you know, this can be used
00:31:31.580 against you, absolutely, but you can 100% and must 100% use this to move forward and you
00:31:36.800 can't pretend they're not there.
00:31:38.640 And Haraway urges feminists to appropriate the cyborg figure to imagine non-natural, non-totalizing
00:31:44.560 political unities and alliances based on affinity and shared projects rather than on a supposedly
00:31:50.400 universal female nature.
00:31:51.840 And as I'm reading this, I think she's just saying, can we get over
00:31:57.820 feminism, please?
00:31:58.780 And like, just identity politics in general.
00:32:01.060 And she sees-
00:32:01.640 Yeah, it's really an anti-identity politics piece.
00:32:04.180 Yes.
00:32:04.460 Saying that the world of the cyborg is the world of-
00:32:07.120 Yeah, like, it's over now.
00:32:08.640 Like, we, we, there's, there are more options now.
00:32:11.320 Like, it's-
00:32:11.500 And this freaked out a lot of people, it's where black cyber feminism comes from.
00:32:14.800 Well, no, you'll, you'll see, yeah.
00:32:15.960 So in terms of, like, where, where I'm here, like, how, how, where I step away from this,
00:32:21.860 Haraway calls for, quote, pleasure in the confusion of boundaries and for responsibility
00:32:27.140 in their construction.
00:32:28.680 And I absolutely love that.
00:32:30.520 Like, it hints at a strong move away from identity politics.
00:32:33.140 Like, who cares if you're a man or a woman or whatever, just do your thing.
00:32:36.400 And we talk about that a lot.
00:32:37.460 And she's kind of arguing for 4chan before 4chan, like, no identities, just, just ideas
00:32:43.160 and affinity groups.
00:32:44.120 And that is capable of organizing people to do amazing things.
00:32:47.680 Like, I was just re-watching some of the Internet Historian recaps of, like, what 4chan
00:32:52.240 did with Shia LaBeouf's various projects.
00:32:54.540 That is great.
00:32:55.740 And with polls online, like Boaty McBoatface and sending, you know, Taylor Swift and all
00:33:02.840 sorts of wonderful things.
00:33:03.680 Go watch all of Internet Historian's videos if you don't know his stuff already.
00:33:07.460 Everyone knows his stuff, but who knows?
00:33:09.260 But yeah, I mean, I like this, but I'm also really amused that people invoke the name
00:33:14.140 of this manifesto in the creation of new identity groups, rather than affinity groups, which
00:33:20.140 is, like, the complete opposite of what the manifesto calls for.
00:33:24.440 On re-listening to this for editing and getting a chance to think through it, I think Simone
00:33:28.520 might actually be missing something here, which is to say that the modern feminists and progressive
00:33:33.300 and woke movement has actually moved in the direction that this piece predicted.
00:33:37.460 They just have moved so holistically towards affinity groups rather than identity groups
00:33:43.080 that they now see affinities as identity.
00:33:46.320 If you look at the debate between the tucutes and the, what were they called, the
00:33:50.100 truscum in the trans community, the trans people who thought that transness was just whatever
00:33:54.600 you felt like in the moment and you could just claim anything you wanted, and the group that
00:33:59.140 thought that you needed to be, like, officially diagnosed with something.
00:34:04.520 The officially diagnosed group lost completely and are generally seen as, like, adjacent to
00:34:10.080 TERFs these days.
00:34:11.460 So with one of the core progressive identities, and many of them, the various iterations of
00:34:20.500 queer, being largely opt-in, are they not now affinity groups?
00:34:25.520 It's just considered offensive to point that out.
00:34:28.100 And more broadly, they identify so holistically with their affinity, whether that affinity be
00:34:33.180 the things that turn them on, because what are the things that turn you on but your affinity,
00:34:37.460 or, you know, what they feel like in the moment, that they now see their affinity as the most
00:34:42.400 important part of their identity.
00:34:44.560 You see, in Simone's head when she's reading this, and when I first heard it from her, what
00:34:49.020 I heard was the word from my perspective, which is, identities don't really matter that much
00:34:55.280 anymore, therefore you should disregard them and focus on larger sort of intellectual, goal-based
00:35:04.000 affinities.
00:35:05.140 And that's how you should define yourself and your identity.
00:35:08.240 When what's actually being said here is that you should focus on more basal affinities, like
00:35:15.400 the things that arouse you, like if you're aroused by same-sex individuals, then that
00:35:19.800 is the most important thing about you.
00:35:21.760 Or if you conceptualize yourself as a woman, that is the most important thing about you.
00:35:27.500 Whereas when we do the same cyberization or sort of breaking apart of the individual within
00:35:34.220 our world framework, we say the most important thing about you is your goals for human civilization,
00:35:39.860 your life, what you find purpose in, et cetera, not basal arousal patterns or self-conceptualizations.
00:35:48.200 So then how can this give birth to new forms of feminism at all, was my first question after
00:35:54.980 just reading about the manifesto and looking at the manifesto.
00:35:58.500 And then what are cyberfeminism and xenofeminism?
00:36:01.600 So let's get into that.
00:36:03.120 Cyberfeminism is a strand of feminist thought and activism that focuses on the relationship
00:36:08.300 between gender and digital technologies, especially the internet and networked media.
00:36:12.640 It looks at how technology can both reinforce existing power structures and be used to challenge
00:36:17.840 patriarchy and create new possibilities for identity, embodiment, and political action.
00:36:22.260 So it treats cyberspace, which, we should bring back that word,
00:36:27.040 I mean, information technology, not as neutral or inherently male domains, but as contested
00:36:32.180 terrains where gender, race, and power are negotiated.
00:36:35.080 So immediately the people who built on this concept brought back an identity.
00:36:41.620 They're like, nah, like this isn't, you know, forget origin stories.
00:36:45.200 No, like we're still women.
00:36:47.080 And yeah, the internet changes everything, but now we have to own it.
00:36:49.620 We have to own the internet.
00:36:50.700 Like we are legion.
00:36:53.420 We are Tumblr.
00:36:54.760 And it just, it really bothers me.
00:36:58.300 Cyberfeminists also aim to understand and intervene in how technologies are designed and who
00:37:02.920 controls them and how they shape everyday life from like work to surveillance, to sexuality
00:37:07.440 and social connection.
00:37:08.420 And they explore how online platforms support resistance and networked activism and in alternative
00:37:14.980 forms of community that disrupt rigid gender norms.
00:37:18.720 So in practice, cyberfeminism has included digital art and hacking and code-based interventions
00:37:23.440 and online collectives and analyses of social media movements and hashtag campaigns against
00:37:28.740 harassment.
00:37:29.400 So while cyberfeminism isn't exactly, like... it is not... it is responsible for Me
00:37:37.360 Too.
00:37:38.700 No, this is, like, it's basically responsible for Me Too.
00:37:42.560 Hashtag MeToo.
00:37:43.960 Well, Me Too.
00:37:44.520 Was it that bad compared to Anita Sarkeesian?
00:37:46.600 She's also, yeah, it's also kind of behind Gamergate.
00:37:49.760 Like all, basically like the premise of cyberfeminism is what produced Gamergate.
00:37:55.400 And I'm like, oh my God, like, how can you take, how can you take the Cyborg Manifesto
00:38:00.280 and screw it up this badly?
00:38:02.360 Right?
00:38:02.920 Oh, oh.
00:38:04.080 But it gets worse.
00:38:04.920 It gets worse.
00:38:05.260 I want to hear more about the xenofeminism and black cyberfeminism.
00:38:09.160 Yeah.
00:38:09.400 So, so while, okay.
00:38:10.480 So cyberfeminism is, is linked to third wave feminism, which I had heard of before.
00:38:16.120 It just extends earlier struggles over rights and repression into digital environments.
00:38:20.820 That's basically all it became.
00:38:22.160 Like they just completely forgot what Haraway was actually arguing.
00:38:26.840 More recent currents, such as xenofeminism and black cyberfeminism, build on the critique
00:38:32.620 in very different ways.
00:38:34.300 But the focus is on centering intersectionality and queerness and global inequalities
00:38:43.100 and how technologies are built and used.
00:38:45.080 And I just, oh, like, why? But it gets worse, because then there's black cyberfeminism.
00:38:51.640 Early cyberfeminism often focused on utopian possibilities of the internet for gender
00:38:57.240 subversion.
00:38:58.100 And it was criticized for centering white Western perspectives and overlooking race.
00:39:03.860 Like, here Donna Haraway is, like, let's just forget origin stories.
00:39:07.500 And let's just like drop all boundaries and like build a better future.
00:39:10.980 And then, and then like other groups are like, that's so racist.
00:39:16.040 How can you, how can you do that?
00:39:18.760 But black cyberfeminism actually didn't arise until the 2010s.
00:39:22.340 So it's like quite a while after cyberfeminism in the 1990s.
00:39:25.720 And it addresses how racial prejudices persist online and how black women navigate, resist and
00:39:32.680 reshape digital spaces.
00:39:34.500 And it draws from black feminist traditions, like intersectionality and Afrofuturism.
00:39:40.100 And it's, it's really just described as an intersectional framework that extends cyberfeminism and black
00:39:47.400 feminist thought into digital spaces.
00:39:49.960 And I just, you know, yeah, it bothers me. But xenofeminism, I feel like
00:39:59.420 I can get on board with it.
00:40:00.440 So it also emerged at the same time that black cyberfeminism emerged in the 2010s and it emerged
00:40:06.780 around a new manifesto called Xenofeminism: A Politics for Alienation, by the collective
00:40:13.980 Laboria Cuboniks.
00:40:19.220 It presents itself as a techno-materialist feminism, in that it treats digital networks, biotech, and
00:40:25.240 other technologies as tools that can be re-engineered to undermine patriarchy and capitalism and racism.
00:40:30.440 and other entrenched hierarchies, rather than...
00:40:33.360 And now they're the entrenched hierarchy, so, by their own logic...
00:40:36.500 That's where I'm a little concerned.
00:40:38.140 Yeah.
00:40:38.400 Cause it rejects biological determinism and insists that gender roles and many so-called natural
00:40:43.780 differences are social constructs that can be reconfigured, and that gender...
00:40:48.720 Reconfigured if you reconfigured a person's like genetics.
00:40:52.320 That's what I'm saying is like, I think that in on, on one hand, xenofeminism sounds really
00:40:58.280 cool.
00:40:59.200 Like just forget about gender, screw gender, biological sex limits you, or you have a problem
00:41:03.460 with it.
00:41:03.860 Just engineer a solution, like do whatever you want to do.
00:41:06.620 Yeah.
00:41:06.760 But here's, here's where I think our form of cyberfeminism or post-cyberfeminist theology
00:41:12.480 comes from.
00:41:13.980 Yeah.
00:41:14.260 So I'll lay this out.
00:41:15.060 So we agree with them that you can use technology to change genetics, to change the psychological
00:41:23.820 structure of males and females, to change their roles in relationships, to change the
00:41:29.820 way that they interact with each other.
00:41:31.340 However, we then take the secondary position, which is to say, but it turns out that the
00:41:38.940 two roles in society of male and female create homeostatic perfection.
00:41:45.980 And by that, what I mean is, yes, you could hypothetically redesign man and woman to be
00:41:51.940 something other than man and woman, but such a society would be less stable and weaker than
00:41:57.140 a society of men and women.
00:41:59.960 Yeah.
00:42:00.260 No, no, no.
00:42:00.680 No, I, 100%, because pretty explicitly, xenofeminism argues for a reworked
00:42:06.480 intersectional universalism that can include all who are currently othered.
00:42:11.780 So across gender and race and class and species, it doesn't focus on narrow identity categories,
00:42:16.880 but it also kind of implies this like homogenization.
00:42:20.360 And I'm super against that.
00:42:22.160 And I think like extrapolating out from your male, female balance and homeostasis, that's
00:42:26.920 really valuable.
00:42:27.560 Like more broadly, we see variety as the key progenitor to flourishing and homeostasis,
00:42:34.800 but you can't get that without variety.
00:42:36.700 Like you need a diverse ecosystem to create flourishing rather than some kind of slurry.
00:42:42.840 And my concern with xenofeminism is there's this almost explicit, perhaps explicit if I dug
00:42:50.140 deeper into it, desire for homogenization into some kind of new idealized version that
00:42:57.900 doesn't have identity.
00:43:00.060 But I see two problems with it, and I still don't think it honors the spirit of the Cyborg
00:43:05.200 Manifesto, because one, it very much has origin stories, like, I'm an oppressed whatever, and
00:43:12.220 I need to fix that.
00:43:13.500 And also it, it implies homogenization when what she's really saying is like, I don't
00:43:19.660 care where you came from.
00:43:21.120 I don't care what you are, be what you want to be.
00:43:23.740 And like, let's organize around interest groups, just autistic adhocracies.
00:43:29.160 And I love that.
00:43:30.520 And I just, it really, I just, I love that Donna Haraway realized that biotech and the internet
00:43:38.580 could render gender wars and even identity politics, which are ultimately
00:43:42.860 not very productive, obsolete.
00:43:45.100 And I love that people found this inspiring.
00:43:47.640 And it makes me sad that others were like, eh, like, no, let's, let's keep identity politics.
00:43:53.340 No, they found a way to twist this into keeping identity politics.
00:43:57.060 Yeah.
00:43:57.400 She's sort of explicitly arguing that identity politics don't matter in the age of
00:44:02.580 the cyborg.
00:44:03.200 She's arguing they don't have to matter.
00:44:05.140 And also, like, the whole point of the prompt for the essay that she's responding to...
00:44:09.180 She's responding to a call
00:44:11.620 that's like, oh man, what do we do about all these really important feminist and
00:44:15.260 leftist movements falling apart because of identity politics?
00:44:18.100 And she's like, here's my solution to identity politics, ruining everything.
00:44:22.640 And people are like, well, what a cool solution.
00:44:24.760 And they're like, oh, but, but identity politics.
00:44:27.260 So it's like really clear that identity politics have won out, but that's in the short term, because
00:44:33.000 if you look at people like Grimes and like us, we're above replacement in kids and we
00:44:39.540 are super against identity politics.
00:44:41.820 And we absolutely love this sort of like build whatever it is you want for your future.
00:44:46.340 We're building our own culture.
00:44:47.520 We love it when people like, like culture craft and culture jam and all that.
00:44:51.840 And, and I think Grimes does too.
00:44:53.620 And I think also sort of like the, the way you're making me realize this follows very
00:44:58.840 much the arc of, of her career too, because I love watching people do long commentaries
00:45:04.680 on like her career is that, you know, she, she very much sympathizes with a lot of like
00:45:11.340 feminist and trans and Marxist, et cetera, ideologies, because like she sees the merit in many of
00:45:16.860 their ideas and champions them.
00:45:18.680 And then they're like, oh, you're part of my identity group.
00:45:21.500 And then like, as soon as she says something, it slightly deviates from any of them or like
00:45:25.360 associates with someone who doesn't identify with some of them or critiques them.
00:45:29.800 Yeah.
00:45:30.620 Then they're like, wait, how could she, how would she possibly like move around with the
00:45:35.620 Palladium crowd or have children with Elon Musk?
00:45:38.580 Like she's, she's bad.
00:45:39.900 And then, but then they feel so conflicted because then she comes out with something else
00:45:42.640 brilliant and they want to engage with it, but she said the
00:45:46.280 problematic thing and we must now reject her.
00:45:48.260 She's an out-of-the-box thinker, like Aella, you know?
00:45:50.420 Well, yeah, no, that's the thing: she, like Donna Haraway, totally rejects
00:45:55.280 identity politics and even origin stories.
00:45:57.520 Like she is, she is her.
00:45:59.820 And, and the funny thing is, again, when I like watch these long YouTube things on her
00:46:04.080 and like, oh yeah, you're right.
00:46:04.980 Like when I read commentary on Aella too, they all want
00:46:10.100 to shoehorn these people into origin stories.
00:46:13.340 Like Aella grew up in a strict household that abused her.
00:46:16.740 And that's why she became who she was, you know?
00:46:19.720 And it's not like, oh my gosh, like, no, even you, like, you know, people want to shoehorn
00:46:25.100 you in.
00:46:25.720 Yeah.
00:46:25.740 Like Stefan Molyneux in the debate.
00:46:27.460 You're, you're, you're traumatized by your childhood.
00:46:30.580 You, you have deep scars, you know?
00:46:32.460 And like, yeah, like first shut up about the origin story.
00:46:35.180 And all three of you really have engineered your own selves.
00:46:38.840 My deep scars crafted me into a tool of military precision.
00:46:43.080 Well, then I, I think I very much epitomize Haraway's Santa Cruz, California culture, which
00:46:48.180 is like, I just don't remember my origin story.
00:46:50.700 I'm, it's very Peter Pan-esque.
00:46:52.520 Like.
00:46:53.040 You talk about your family's background all the time, Simone.
00:46:55.780 Like.
00:46:55.880 We talk about their background, but like.
00:46:58.040 Sort of only in the context of us trying to create a false myth.
00:47:01.520 I mean, not false myth, but like, you know.
00:47:03.100 Then just build our own new origin story.
00:47:05.440 It's a good one.
00:47:06.060 I know.
00:47:06.400 It is a good one.
00:47:07.160 Come on.
00:47:07.660 You know, I jumped onto your family as soon as they welcomed me.
00:47:10.900 But yeah.
00:47:11.240 Anyway, my hope is that this whole post-origin-story, post-identity thing wins
00:47:17.640 out in the long term.
00:47:18.520 But unfortunately, for the time being, this foundational piece, which is...
00:47:24.360 Wait, did you get to explain what black cyber feminism was?
00:47:26.860 Yeah.
00:47:27.220 It was basically like, no, I reject this concept.
00:47:32.100 But my identity matters more.
00:47:35.020 Woe is me.
00:47:36.720 They're like, I'm privileged.
00:47:38.360 How dare you take away my privilege?
00:47:40.640 Well, no, I mean, they just want to point to,
00:47:42.820 I mean, like, classic examples of how being a black woman online sucks.
00:47:47.860 OkCupid stats.
00:47:49.060 Right.
00:47:49.380 Like there are plenty of examples.
00:47:51.220 There's plenty of examples of where you are genuinely disadvantaged.
00:47:54.280 I think the problem is they're missing
00:47:56.280 the point, which is like, you don't have to be a black woman online.
00:47:59.920 And also like some of my favorite creators are black women, like on YouTube.
00:48:03.720 Like, I don't see them as necessarily doing really poorly.
00:48:07.000 So I don't know.
00:48:09.340 Like, yeah, I just think they're a really classic example of this larger dynamic
00:48:15.820 of people really struggling to let go of identity politics, even though it's their downfall.
00:48:21.860 And it's, like, the last thing that's going to help.
00:48:24.080 Well, I think then we need to coin post-cyber feminism.
00:48:29.320 And post-cyber feminism is that, yes, well, gender roles and gender aren't
00:48:35.460 entirely a social construct.
00:48:36.960 They're a social contract based on a biological reality.
00:48:39.520 But that biological reality is alterable in the future.
00:48:44.180 The question is: are the factions of humanity that attempt to dramatically alter that
00:48:49.840 going to be able to stay above replacement rate and remain technologically and economically productive?
00:48:53.940 And I think if we look at the existing groups that are experimenting with that right now,
00:48:59.220 the answer is no.
00:49:00.860 Should it be post-cyber feminism or hard cyber feminism?
00:49:05.720 Post-cyber feminism.
00:49:06.420 See, the problem is, I don't like that it's called cyberfeminism
00:49:10.360 at all, because attaching feminism to it makes it an identity group.
00:49:17.480 And again, like I said, Donna Haraway, yes, she responded to the prompt as
00:49:23.060 a feminist and put a feminist framing into the piece, but that's because it wouldn't
00:49:26.800 have been published otherwise.
00:49:27.880 She clearly rejects the concept of identity groups in general.
00:49:34.740 And she's basically saying, like, I think you're missing the point.
00:49:38.060 So the point of cyber feminism is the cyber part modifies the feminism part by saying we
00:49:45.560 can promote the interests of women by tearing down gender roles.
00:49:49.980 Yeah, I mean, in short, if you love it, let it go, though.
00:49:52.900 Like, if you actually want women to thrive, allow them to become something beyond women.
00:49:58.140 Right, that's what she's saying.
00:50:00.220 And we're saying we agree, but we are post that because we have recognized that tearing
00:50:06.560 down the gender roles is self-destructive.
00:50:09.640 That society ends up disintegrating, I'm willing to guess, in most groups that do tear
00:50:16.360 apart gender roles.
00:50:17.520 Which is to say, even if women are no longer the ones birthing children and men are no longer,
00:50:24.480 you know, performing their traditional role.
00:50:26.280 I still think it's useful in a society to have one gender that is optimized around, you know, basically
00:50:33.760 being either disposable or super valuable,
00:50:35.840 so you get longer-tail distributions: being slightly more ambitious and aggressive and
00:50:42.300 being more open to militarism; and to have another group that is more spiritual.
00:50:48.080 Historically, women have always been more spiritual, and in conservative communities,
00:50:51.060 women stay at higher rates than men do.
00:50:53.420 Surprising fact that a lot of people don't know.
00:50:55.020 So a lot of people think of the groups that women are discriminated against in, like, say,
00:50:59.020 the conservative Jews.
00:51:00.320 To me, it feels kind of wrong.
00:51:02.680 Or the Mormons.
00:51:02.680 Just how, like, and I know you're going to be like, well, hormonally, that's not
00:51:07.460 true, but, like, women were often seen as the more sexually voracious group than men.
00:51:12.220 It's true, but even back then, they were seen as a more spiritual group.
00:51:16.080 Generally speaking, women have been seen as a more spiritual group.
00:51:17.560 I don't know, like, look at Jewish groups, like, men are the ones who get to do the studying
00:51:21.000 and women are the ones who...
00:51:22.000 True.
00:51:22.520 I didn't say religious.
00:51:23.820 I said spiritual.
00:51:25.600 Like, they're just more prone to believe this.
00:51:27.800 I would still say that Jews would be like, well, men, they would...
00:51:30.880 And we should save this for its own episode before we actually go into it.
00:51:32.880 This is why when a large group of people began to deconvert from religions, the men became
00:51:38.020 atheists and agnostics and the women all became, like, Wiccans and pagans.
00:51:40.800 Oh, okay.
00:51:41.860 That's a fair, that's a fair point.
00:51:43.280 But yeah, we should do an episode on that.
00:51:44.640 At any rate, I really enjoyed this conversation with you.
00:51:47.580 Oh, no, but hold on.
00:51:48.480 Hold on.
00:51:48.840 I'm not done here.
00:51:49.640 And the women are more caregiver-y.
00:51:52.200 They maybe specialize in child-rearing more, in larger bureaucratic structures more.
00:51:56.940 And you could have a society that's optimized around that.
00:51:59.940 I know that some people want to, in our audience, want to try to create societies without women,
00:52:04.920 which is entirely possible.
00:52:06.560 But we'll see.
00:52:07.620 Maybe the ones whose women survive, maybe they don't.
00:52:10.520 I can tell you what, the all-women civilizations are definitely not going to survive.
00:52:13.300 So we don't even need to discuss them.
00:52:16.460 As history shows really quickly.
00:52:17.760 Love you to death, Simone.
00:52:18.720 You're a great wife.
00:52:20.280 And thank you for putting together this very informative episode.
00:52:23.020 So fun.
00:52:23.680 No, I mean, thanks to Grimes for always, always having something great.
00:52:29.660 Okay.
00:52:30.140 Which one do you want to do next?
00:52:32.280 Of course.
00:52:33.260 Tell me about the, oh, by the way, one of the episodes we're going to be doing
00:52:36.480 this weekend is on Reddit, because of something we learned. I've been really bummed about
00:52:41.720 the episodes not doing as well recently with the new Gemini change.
00:52:44.520 And then I see a post on Reddit and I'm like, wait, okay, this is near the top of my feed.
00:52:48.840 It's a Based Camp post.
00:52:50.640 We don't have an official Based Camp subreddit.
00:52:52.880 So I go to Reddit and our subreddit is wildly popular.
00:52:58.980 51K weekly visitors, 14K weekly contributors.
00:53:03.740 So to put that in context there, Hasan Piker has 6.3K weekly contributors.
00:53:10.940 AskHistorians has 160 weekly contributors.
00:53:13.880 That used to be one of my favorite subreddits.
00:53:16.060 The Joe Rogan experience has 7.9K.
00:53:19.340 So we are, and if you're looking at weekly visitors for Joe Rogan, it's 273K versus our
00:53:25.780 50K, right?
00:53:27.060 But that's like not that much different from what I'd expect.
00:53:31.760 It's really not.
00:53:32.600 It's impressive.
00:53:33.160 It's 301 and three.
00:53:35.480 The only one that comes close to our outsized impact is the Red Scare subreddit with 30K weekly
00:53:43.020 interactions.
00:53:44.020 It's a super active subreddit.
00:53:45.980 The one that really got me was us having as much as one sixteenth the weekly visitors of Advice
00:53:51.780 Animals, and almost twice the weekly contributions of Advice Animals, the subreddit where I proposed to my
00:53:57.300 wife and what I used to think of as the biggest subreddit.
00:54:00.420 Yeah.
00:54:00.860 Yeah.
00:54:01.100 So why is it us and them?
00:54:02.780 And so that's what we'll be going into on that one.
00:54:04.340 Yeah, we're going to.
00:54:05.500 But we're starting.
00:54:07.040 Asmongold, we do about as well as.
00:54:08.920 Asmongold.
00:54:09.640 Oh.
00:54:10.000 Well, no, in terms of contributions, he's at 13K contributions, but he's at 356K
00:54:16.360 weekly visitors.
00:54:17.300 So wait a minute.
00:54:18.200 Because he's Asmongold.
00:54:18.560 But not that much bigger.
00:54:19.700 Like, 3X bigger, when on YouTube, he's like 100X bigger.
00:54:23.740 Yeah, that's true.
00:54:24.700 Yeah.
00:54:25.900 Okay.
00:54:26.820 We're going to dig into it tomorrow.
00:54:27.540 Not 3X bigger.
00:54:27.540 It's like 10X bigger, 8X bigger.
00:54:29.220 Anyway.
00:54:29.740 I want to hear about the comments on today's episode.
00:54:32.360 Oh, gosh.
00:54:33.380 Okay.
00:54:33.800 Blanking.
00:54:34.000 Blanking.
00:54:34.160 And they liked it.
00:54:37.120 One person pointed out something which I think is very astute.
00:54:42.020 This is on the episode about the United States' new national security strategy: that Trump ultimately,
00:54:48.020 like, cares more about Europeans than the EU, basically.
00:54:52.440 As is laid out in this strategy, which is basically like, I'm going to try to go around
00:54:57.240 the EU and talk with the non-terminal elements of these countries and we'll see what we can
00:55:03.060 do.
00:55:03.340 So yeah, people, people really enjoyed that.
00:55:06.740 We definitely also have our fans of the deontological approach in the
00:55:12.700 comments, who regularly chime in.
00:55:14.340 I'm concerned about your consequentialist prioritization.
00:55:18.180 I mean, in the end, if you take a hard, unyielding line on either
00:55:25.440 approach, you're going to commit atrocities. Like, pragmatism reigns supreme. But people still
00:55:31.740 are very much team deontology, and we're still team consequentialism.
00:55:36.820 So ironically, yesterday we also did an episode on deontology versus consequentialism in Santa
00:55:42.080 Claus and lying to children.
00:55:43.280 And some people were like, well, lying is just a bad thing.
00:55:45.620 And I was like, well, you know, whether or not it's bad depends on the consequences.
00:55:49.100 And I think that this is a great encapsulation of my views on deontology, because to me, it
00:55:55.160 shows, you know, when I talk about the deontological leader of an army having to explain to the, you
00:56:00.940 know, mothers and wives of the soldiers, why he let them die instead of doing ambushes and
00:56:06.420 other tricky things to get the enemies.
00:56:08.500 And fundamentally, what I'm pointing out there is that what deontology also does is outsource
00:56:13.800 the cost of your moral grandstanding to other individuals.
00:56:19.380 I think Santa is a perfect example of that. Saying, I will rob my children of the experience
00:56:24.840 and wonder of Christmas and Santa Claus and magic so that I can maintain moral purity,
00:56:33.280 is a perfect encapsulation of this.
00:56:37.180 It is one of the most beautiful parts of childhood, you know, for me, for many other people, is
00:56:42.200 getting to spend part of your life in this magical world before you grow up into the dull black
00:56:47.960 and white world of adulthood to say, no, my kids don't get that because I don't want to
00:56:53.960 ever lie.
00:56:55.220 Outsourcing the moral consequences.
00:56:56.800 That's one of my biggest problems with deontological moral frameworks.
00:56:59.840 I mean, even if you don't see this as outsourcing the moral consequences, you have to admit
00:57:02.820 that sometimes you do outsource the moral consequences, because if there are cases in
00:57:07.500 which a consequentialist view would lead to fewer overall negatives, which is fundamentally
00:57:12.960 why you're a deontologist and not a consequentialist, you're saying, okay, yes, but even in those
00:57:16.700 cases, I would choose a deontological moral framing, and those consequences will not always
00:57:21.400 be borne solely by you.
00:57:22.680 So yes, it does universally outsource negatives to other people for individual moral purity.
00:57:29.600 For those that struggled with that, I'll word this another way to make it clearer,
00:57:32.720 because it's a really important point.
00:57:34.480 The core difference between a deontological moral framing and a consequentialist moral
00:57:38.800 framing is that I, as a consequentialist, judge morality by the outcomes of an action.
00:57:42.840 An action was good if it led to good outcomes, bad if it led to bad outcomes.
00:57:47.080 Deontologists view morality by a list of rules.
00:57:49.940 X type of thing is bad, Y type of thing is good.
00:57:52.840 Like, lying is bad,
00:57:54.220 even if, through lying to somebody, you may achieve what you perceive, in the
00:58:00.060 moment, as some sort of positive outcome.
00:58:02.120 The core distinction between these two framings is would you, when you have judged an action
00:58:10.900 as usually a bad type of action, like lying, engage in that action when you know it leads
00:58:17.180 to a good outcome?
00:58:18.180 And you can say, no, I just wouldn't do that, right?
00:58:22.980 What you are saying, universally, through this, is that the core place where these two ideologies
00:58:28.400 conflict is in that specific question, which means that you are allowing, for your own moral
00:58:35.900 purity, sometimes bad outcomes.
00:58:39.140 And the point I'm making here is that those bad outcomes are almost never only shared by
00:58:45.600 you.
00:58:46.680 And thus, it is a moral system that is, to me, almost the antithesis of morality, because
00:58:52.220 it outsources the negative outcomes of your moral purity onto other people's lives and
00:58:59.140 existences.
00:59:00.000 And you can say, well, actually, if everyone adopts the deontological framework, or if I
00:59:04.500 adopt a deontological framework, long term that leads to better outcomes.
00:59:08.340 But if you make this argument for a deontological framework, what you're actually arguing for
00:59:12.820 is a consequentialist framework, just one that looks deontological in nature, because then
00:59:18.800 we are still having the argument about which framework leads to more morality overall, in
00:59:24.260 which case you're not a deontologist at all.
00:59:26.320 You just disagree about how to achieve the most morality.
00:59:29.680 And then you need to engage with us not with "lying is wrong," but with "these are the effects it's
00:59:35.420 going to have on society if you choose this, this, and this," which then just makes you
00:59:39.200 a consequentialist.
00:59:39.840 And with that scenario, I mean, this is interesting, because we're entering the conservative
00:59:43.520 intellectual community.
00:59:44.460 And in the old days, if you look at the way the political spectrum broke down historically,
00:59:49.120 conservatives in the intellectual class were much more likely to be deontologists.
00:59:53.800 And the part of the intellectual class that was more progressive was more likely to be consequentialist.
00:59:58.340 And this has really switched, because the urban monoculture has adopted a deontological moral
01:00:02.680 framing.
01:00:03.060 And so the deontological conservative intellectuals are a little bit more confused, I think,
01:00:08.240 about where they fit in this new framing and structure, as many of the consequentialists
01:00:12.160 have moved to the conservative movement.
01:00:13.960 Yeah.
01:00:14.840 Oh, and final little anecdote.
01:00:16.640 I took the kids to BJ's, our local big box store, this morning to do two weeks' worth
01:00:24.720 of grocery shopping.
01:00:25.520 And they, of course, always love encountering the restocking robot that goes by, has two
01:00:33.880 little eyes, it makes a little whistling sound, and it checks shelves to see what needs to
01:00:39.120 be restocked.
01:00:39.840 That's what the robot's there for, though they just think it's there, like, looking
01:00:43.500 for its family.
01:00:44.440 That's what they're convinced of.
01:00:45.860 Because this is a store like Costco, where you buy huge amounts of things
01:00:52.260 and then show your receipt at the exit, you always have to pass by someone to, you know,
01:00:57.340 prove your purchases at the exit.
01:00:59.620 And this time, instead of stickers or stamps, which he sometimes gives
01:01:05.620 to the kids, the guy gave them, like, coloring book pages of the robot in the store aisles.
01:01:11.860 Because it's clear that to kids who go to BJ's, this robot has become so much of the lore of
01:01:18.280 BJ's.
01:01:18.780 It's like, the robot. And it's really not, like, it doesn't look humanoid.
01:01:23.080 It's just like a stick. Like, the pictures themselves,
01:01:26.160 you look at them, and if you don't know what it's about, you're like...
01:01:29.820 If I was the CEO at BJ's,
01:01:32.020 I would take this as a cue to really dress up the store robots.
01:01:35.240 Make them look like reindeer in the winter or something.
01:01:37.980 Oh, no, totally.
01:01:38.380 Yeah, lean in, and make them look, you know, outlandish and great.
01:01:40.680 Because then the kids are going to be like, I want to go to the robot store.
01:01:42.800 I don't want to go to Walmart.
01:01:43.940 I don't want to go to whatever.
01:01:45.160 Like, I want to go to the robot store.
01:01:46.680 No, like, they're really missing an opportunity there, though maybe it's starting to dawn on
01:01:50.660 them.
01:01:51.260 But I just, I think about this, and I think about the way that our kids are already relating
01:01:56.240 to robotics and AI.
01:01:58.740 And everyone's like, oh, you can't just have AI friends for your kids.
01:02:01.540 And I'm like, dude, like, this isn't just our kids either.
01:02:04.420 And I'm also looking at recent Pew study results around chatbot usage and youth in the United
01:02:12.120 States today.
01:02:13.140 And it's already so pervasive.
01:02:16.640 Speaking of, RFab is stable as of today.
01:02:19.860 Yeah.
01:02:20.620 Yeah.
01:02:21.000 Which is really exciting.
01:02:22.160 But I think we're still doing bug fixes, but it's broadly stable now.
01:02:25.360 And I'm going to plug it.
01:02:25.900 No, it is amazing.
01:02:27.060 If you haven't checked out rfab.ai yet, please do check it out.
01:02:29.800 It's really fun, even if you're not into chatbots. But about three in 10 teens
01:02:34.480 say they use AI chatbots every day, including 16% who do so several times a day or almost
01:02:41.140 constantly. 16%!
01:02:43.100 So I think, yeah, people just don't really understand. Just how, like, I
01:02:50.060 think boomers can't understand a generation that sort of grew up online,
01:02:54.280 we and older generations cannot understand this one.
01:02:58.960 I can understand it.
01:03:00.120 I mean, you get it because you're super into it, right?
01:03:02.880 And like, just like your mom was on TikTok before anyone else, right?
01:03:06.320 Like, it's not, it's not your age necessarily.
01:03:10.100 It's just a strong correlation.
01:03:11.580 I think a lot of people are, including some young people too, who just happen to be.
01:03:15.900 An episode I really want to do, by the way, that I'm wondering why we're not doing today, is
01:03:19.560 on how AI, and not just chatbots, but, like, realistic AI girls and the stuff they do, are going
01:03:25.300 to destroy the OnlyFans business model.
01:03:27.680 I'm looking for the online e-girl.
01:03:30.720 Well, somebody mentioned recently, and it really got to me when they pointed this out
01:03:34.140 because I like, it hit me.
01:03:35.280 Oh my God, this is true.
01:03:36.940 You don't see hot alternative girls working in customer facing positions anymore.
01:03:43.500 Oh, hold on though.
01:03:44.660 No, no, no, no, no, no, no, no, no, no.
01:03:46.020 There's this whole new drama that's been playing out online about GameStop stealing Best Buy's
01:03:53.620 girl.
01:03:54.140 Are you kidding me?
01:03:54.400 This is where I heard about this.
01:03:55.760 Okay.
01:03:56.420 Then what's...
01:03:56.860 So the point they were making is...
01:03:58.300 For context, by the way, there used to be this TikTok influencer who worked for
01:04:01.320 Best Buy, and everyone just loved her.
01:04:03.460 And then GameStop recently, I think hired her and she like did an ad for them.
01:04:06.920 And everyone's like, oh my God, GameStop stole your girl.
01:04:09.680 That's why I'm getting cocked.
01:04:11.680 No, and it shows like she's got like a nose ring and everything.
01:04:14.000 But the point I was making was people are saying that historically, it used to be you
01:04:18.920 would go to your GameStop or you go to your Hot Topic and working the counter would be
01:04:24.520 a cute girl with a nose ring, you know, like your cute, you know, whatever girl.
01:04:28.600 They're like, these girls do not work consumer facing positions anymore because they're all
01:04:34.260 online doing some sort of sex hustle.
01:04:36.600 Oh, right.
01:04:37.780 So they used to be like sweet waitresses or working at the ice cream shop or whatever.
01:04:42.260 And I noticed, yeah, you don't see like hot alternative young girls working in customer
01:04:47.560 facing positions anymore.
01:04:48.460 Hot alternative.
01:04:50.400 Right.
01:04:50.820 So like maybe in the Midwest, you'll still find a nice Christian girl slinging ice cream.
01:04:54.620 Right, right.
01:04:55.200 But you won't see nose ring or dyed hair girl.
01:04:58.280 Not cute goth girl, which is maybe why, maybe this is why GameStop girl is such a...
01:05:03.260 Exactly.
01:05:04.440 Oh, they're like, she's the last one.
01:05:07.140 The last.
01:05:07.840 Oh my God.
01:05:08.220 She's so precious.
01:05:10.460 We must...
01:05:11.080 Yeah, she's the new Helen of Troy.
01:05:13.600 The last hot goth girl working in retail.
01:05:19.600 But she's not really.
01:05:20.540 She's an influencer.
01:05:21.560 The face that launched a thousand online comments.
01:05:25.860 Okay.
01:05:26.960 I'm going to get into it now.
01:05:28.680 All right.
01:05:29.060 Speaking of cyber feminism, God, speaking of hot online girls, let me pull up the outline.
01:05:37.160 Let's see.
01:05:38.160 Hold on.
01:05:38.640 I have so many freaking tabs open like everyone does.
01:05:43.700 All right.
01:05:47.180 I mean, I have the coffee kids do too, but it's not like going to kill me.
01:05:51.760 Jeez.
01:05:52.680 All right.
01:05:53.100 Let me get it out though.
01:05:57.940 All right.
01:05:58.220 You ready?
01:05:59.440 You don't want to look inattentive or people will think that you're phoning it in.
01:06:03.840 Oh, yes.
01:06:04.520 Do I get that complaint ever?
01:06:09.620 Well, one person was like, Simone's bringing it.
01:06:09.620 Like, step it up, Malcolm.
01:06:11.720 Maybe it's because you, you know, you're less good at pretending to look engaged when you're
01:06:18.880 reading Korean romances on the side.
01:06:21.180 When I do it, the thing is, no, I will tell you the secret, Malcolm.
01:06:30.000 You have it on your phone.
01:06:33.280 You need to put it on the screen, and then you switch.
01:06:36.460 I can't.
01:06:37.260 The apps that I use.
01:06:38.400 Webtoons doesn't.
01:06:39.880 Yeah.
01:06:40.140 I have to use the phone.
01:06:41.560 Oh, that sucks.
01:06:43.760 Sorry.
01:06:44.340 All right.
01:06:44.700 Well, anyway.
01:06:45.440 All right.
01:06:45.760 You ready?
01:06:46.360 Cause I'm going to start.
01:06:47.500 I am.
01:06:48.180 Hey, we have the list now of the good Korean romances.
01:06:53.160 And I don't know if you.
01:06:53.700 I know I need to publish.
01:06:54.620 I haven't published it yet, but I will.
01:06:56.720 I should do that this weekend.
01:06:57.840 I'll certainly do it over our winter break.
01:07:01.160 All right.
01:07:01.460 Ready?
01:07:03.020 Okay.
01:07:04.760 No, but look, she fixed your house.
01:07:06.300 I want to break this so I can eat it.
01:07:08.300 I want a big, I want a big, I want a big, I want to break it.
01:07:12.040 Well, do you want to put a big window on your house?
01:07:14.660 Can I break this house apart?
01:07:15.940 No.
01:07:17.340 Yeah.
01:07:17.660 How would you do that?
01:07:19.260 I'm going to have a big.
01:07:19.760 You worked so hard to build it.
01:07:20.920 But how is that like in the mouth?
01:07:22.900 Well, she shouldn't have thought about that.
01:07:24.540 Well, now you want a bigger house.
01:07:26.860 I know we're going to try to go in the house, but it doesn't.
01:07:29.940 So maybe we can make a giant roof like this.
01:07:35.420 What do you think?
01:07:35.780 Finally.
01:07:36.140 Like that?
01:07:36.740 I still need a big house.
01:07:38.480 Okay, then we'll just add this giant roof on top.
01:07:41.280 Oh my gosh, but the roof might be too big, Titan.
01:07:44.020 Too big.
01:07:45.160 Let's find out.
01:07:45.260 Do you think it's too big?
01:07:46.420 Let's find out.
01:07:47.520 Let's find out.
01:07:49.100 Whoa.
01:07:50.420 Whoa.
01:07:51.580 I want it even, I want it even taller.
01:07:55.460 Even taller?
01:07:56.640 That's the tallest house there is.
01:07:58.200 Titan, it'll lose its structural stability and fall over if you make it taller.
01:08:02.700 I want it even bigger.
01:08:04.920 That looks so taller.
01:08:07.620 You look so tall.
01:08:09.340 You look so tall.
01:08:10.180 I want it.
01:08:11.120 I want it even bigger.
01:08:12.980 It's taller.
01:08:14.520 Even bigger.
01:08:14.960 You're a demanding girl, Titan.
01:08:16.420 Very demanding, young lady.
01:08:18.440 Yeah.
01:08:20.080 Let's make it up a little tent house for you, Octavian.
01:08:23.560 Yeah, let's make it up a little.
01:08:25.720 I want to play.