RadixJournal - October 29, 2017


Unconscious Cinema - 3D Hegelian Waifu


Episode Stats

Length: 49 minutes
Words per Minute: 147.9
Word Count: 7,324
Sentence Count: 475
Misogynist Sentences: 24
Hate Speech Sentences: 15


Summary

In this episode of Alt-Right Plus, we discuss Blade Runner 2049 and the themes it explores – such as the relationship between replicants and artificial intelligence – and how these tie into the film's larger questions of love, recognition, and consciousness.


Transcript

00:00:00.000 Music
00:00:33.000 I can't rely on you. Say you kiss me.
00:00:39.000 Music
00:01:20.000 Well, one aspect of Blade Runner 2049 that I wanted to go into in depth, and which is also a bit different from the themes we talked about in the main podcast, is the holographic, three-dimensional, nubile waifu that K acquires in this film.
00:01:49.180 And also a three-dimensional holographic waifu who meets a tragic end and actually kind of has a character arc of sorts.
00:01:57.220 Sure.
00:01:58.340 But I wanted to go into this because it brings up a lot of interesting questions of recognition in a Hegelian sense.
00:02:06.560 And then also just these more down-to-earth questions like, you know, sex bots, the notion of relationships and consciousness in the 21st century, and all that kind of stuff.
00:02:21.200 So I thought we would dilate on this subject and I'm going to separate this from the main podcast.
00:02:25.460 So this will be our first Alt-Right Plus podcast.
00:02:30.340 So this is a special treat.
00:02:32.440 These kinds of podcasts will at one point be behind a paywall, but just wait for that.
00:02:38.800 But this one's for free, but I'm just going to get everyone used to the idea of these special podcasts that are only for the initiated, I guess.
00:02:47.500 But anyway, so let me set the scene.
00:02:50.080 So K is, you know, he is himself a skin job in that sense.
00:02:57.860 He is a replicant who, you know, is a – I think robot is really the wrong word, the wrong way to think about replicants in the Blade Runner universe.
00:03:08.880 And he has this product called Joy, J-O-Y, J-O-I, excuse me.
00:03:16.340 And she is a beautiful – I believe she's a Latin American actress.
00:03:21.980 But she's a beautiful, nubile actress who plays effectively a hologram.
00:03:31.200 So in his apartment, he has this console that, you know, will move a holographic lens around the apartment and basically create this woman.
00:03:43.400 And then at one point, he actually gets an emanator, which is like the – you know, if that was the computer, this is like the iPhone of holographic girlfriends.
00:03:53.020 And so it is portable.
00:03:55.940 She can just – she can basically just be there with him throughout his workday.
00:04:00.720 And I think this raises a lot of questions.
00:04:05.920 I think it's another example where the movie was taking an idea from the 1982 Blade Runner and then taking it to the next level, actually exploring things that weren't quite yet in the consciousness of people in 1982, when this was made.
00:04:24.980 And so in 1982, Deckard effectively does fall in love with a replicant.
00:04:34.900 It's clear from the first moment he sees Rachael that he's at least fascinated by her.
00:04:41.380 And they actually bring this up in Blade Runner 2049, where Niander Wallace says, don't you think that Tyrell might have programmed you to love Rachael, that all of this was just his game?
00:04:54.980 Kind of thing, you know – questioning the authentic nature of Deckard and Rachael's love.
00:05:01.840 But anyway, there is this scene that I'm sure made many SJWs uncomfortable.
00:05:09.280 I would say – to the Blade Runner films' credit – these films have really triggered feminists.
00:05:15.820 I think in an interesting way.
00:05:17.240 Feminists basically hated these movies and hated the last ones too.
00:05:21.260 It's increasingly easy to do, right?
00:05:24.980 That doesn't necessarily mean that's right for the film.
00:05:28.400 I mean, soon it's just going to be – you know, they'll see a man in a film of whatever race and they'll fucking go, like:
00:05:34.100 It's toxic.
00:05:37.080 But anyway.
00:05:39.940 Whereas they're very lovely people.
00:05:42.060 Oh, yeah.
00:05:42.880 Yeah.
00:05:43.240 I mean, when they – their depiction of men is just really fair, and, you know, they have a deep, complicated portrait of the male soul.
00:05:52.120 I mean, they would never, like, stereotype us as, you know, buffoonish, violent assholes or anything.
00:05:58.820 I mean, that – yeah.
00:05:59.560 It all comes from that characteristically rational female perspective.
00:06:06.440 Right.
00:06:07.000 I mean, throughout the ages, men have remarked on the rationality and reasonableness of the female sex.
00:06:13.340 I mean, you know.
00:06:16.320 It's well known to history.
00:06:18.040 Yes.
00:06:19.020 Hell hath no rationality.
00:06:21.660 Like a woman thinking.
00:06:24.700 Women forever the more rational sex.
00:06:31.580 Anyway.
00:06:33.420 So he falls in love with Rachael.
00:06:37.040 Understandably – Rachael is a very dark and mysterious woman.
00:06:42.000 And there are these scenes that actually are uncomfortable, I think, in a good sense of the word.
00:06:49.580 Like the fact that Ridley Scott was able to pull off a scene in which Harrison Ford's character, Deckard, is being aggressive and kind of – you know.
00:07:05.260 I don't want this to sound like I'm falling into their language, but he is being aggressive.
00:07:09.760 He's forcing the issue.
00:07:11.800 And, you know, he's not so much seductive as, you know, Connery-like – going to kiss this woman really hard, turn her into a woman, you know, kind of thing.
00:07:22.080 And so anyway, he forces the issue, and he's like, tell me you love me.
00:07:27.740 And I think actually this gets to this question of recognition.
00:07:31.760 And they have these moments, and then she does actually submit, and she does fall in love with Deckard.
00:07:39.800 And I think her love becomes quite genuine.
00:07:43.620 Particularly she's ready to run away with him, and he's ready to save her.
00:07:48.620 But the thing is, this is a character who is ostensibly a human – I mean, that's how we meet Deckard, not as a replicant.
00:07:59.980 We meet him as a hard-boiled detective, a Blade Runner – but as a man.
00:08:04.340 And, you know, whether he is a replicant or not is still ambiguous.
00:08:07.240 It's still ambiguous at the end of Blade Runner 2049.
00:08:09.900 But for someone to pull off that scene, I think, demonstrates the power of Ridley Scott as a director.
00:08:20.600 Because that is weird.
00:08:22.540 Like, you could imagine a lesser director filming the scene, and it just looking ridiculous.
00:08:29.680 You know, you're raping a robot.
00:08:31.680 I mean, what the fuck?
00:08:33.180 You know, I mean, it could be just terrible.
00:08:34.960 And I would say the exact same thing for the analogous scene in Blade Runner 2049.
00:08:42.020 And so, you know, Kay has this relationship with a holographic woman.
00:08:49.440 And you could say that at some level she's like a banal or easy male fantasy – she's like the '50s housewife.
00:09:00.140 And then she'll also flip, you know, her costume, and she's a dancing girl.
00:09:05.700 She's a seductress.
00:09:06.540 And then she'll flip again, and she's kind of like a casual, cool hipster that you want to hang out with.
00:09:12.360 But she is, like, reaffirming to K.
00:09:15.180 She's a good woman to K.
00:09:17.020 And she also recognizes that, you know, they have a relationship, but that there are some things that she, as ones and zeros and not a DNA strand, can't provide.
00:09:30.460 And so a replicant prostitute, a pleasure model, is invited up to the apartment.
00:09:36.500 Actually, she invites her.
00:09:38.920 And then she, as a hologram, you know, superimposes herself onto the woman.
00:09:50.740 And in this very cool visual manner where you see both faces at the same time – and it's really well done.
00:10:01.100 It's a very good use of CGI.
00:10:02.640 That prostitute, though, was she a replicant or a human?
00:10:07.100 She was a replicant, 100%.
00:10:09.260 Are you sure?
00:10:10.200 Yes.
00:10:10.640 She actually appears later on in the film, when he meets with the, like, replicant resistance army.
00:10:18.280 Oh, okay.
00:10:19.040 I must have missed that.
00:10:20.580 Huh.
00:10:21.040 Yeah.
00:10:21.460 I only saw the film once, but...
00:10:23.380 And so it is kind of like a fake on top of a fake.
00:10:26.360 But, again, in the Blade Runner universe, you know, what is fake?
00:10:30.780 Like, you know, I think at one point K asks Deckard, is the dog real?
00:10:38.540 And Deckard says, why don't you ask him?
00:10:40.740 You know, and that is like a real direct way of getting at the whole, you know, enigma at the heart of this whole universe: what is real?
00:10:49.300 And at some level, Joi perceives herself as real.
00:10:55.680 I mean, Joi does not think of herself as fake.
00:10:59.180 And even if she is algorithmically programmed to love every male that buys her as a product, that doesn't make her less real in a way.
00:11:12.180 And doesn't she experience something new?
00:11:15.560 I mean, even if she is an algorithm, nevertheless, she is experiencing something new when she walks outside with the emanator in the rain and leaves the apartment for the first time.
00:11:25.880 She's no longer shackled to that console on the roof of the apartment, but is actually, you know, experiencing a kind of autonomy.
00:11:37.760 And so, anyway, Denis Villeneuve is a brilliant filmmaker, and the fact that he was able to pull off the scene, which could have been totally ridiculous, it could have been something out of, like, the Archer cartoon where Krieger has, like, a holographic waifu.
00:11:58.160 If you've seen Archer, it's a terrible cartoon, but if you've seen it, I mean, terrible in kind of different ways.
00:12:04.020 No, no.
00:12:04.720 All right.
00:12:05.480 Just go watch some highlights.
00:12:06.840 It's very funny, but definitely deconstructive.
00:12:10.400 I thought he had, like, a black girlfriend.
00:12:12.880 He does.
00:12:13.440 That's Archer's girlfriend.
00:12:14.680 There's this, like, evil genius scientist in the detective agency, and at some points he'll have, like, a three-dimensional holographic girlfriend.
00:12:26.720 Yeah.
00:12:27.060 But it's obviously a joke.
00:12:28.740 It's like a, you know, anime fantasy.
00:12:30.760 But this was actually pulled off in a way that that scene truly was erotic, and it wasn't ridiculous.
00:12:39.380 And I guess it also brings up this thing that I've actually discussed on another podcast.
00:12:45.400 Lots of people have already done videos on it.
00:12:47.240 But just the issue of the sex bot.
00:12:52.200 You know, is that love?
00:12:55.700 Is Deckard's love for Rachael real?
00:12:58.960 I mean, a sex bot, in the way that these items are coming on the market now – obviously, you can get your rocks off with them.
00:13:13.520 But they don't have that, you know, semblance of consciousness that Joi offers.
00:13:18.460 But nevertheless, I don't doubt at all that men are going to fall in love with a robot, you know, in this sense.
00:13:27.840 So it is a complicated issue.
00:13:31.700 Do you want to add?
00:13:32.520 I could go into the idea of recognition.
00:13:34.460 Do you need to add anything here?
00:13:36.360 You know, no – I mean, I guess the only thing I would add is that I think one of the compelling aspects of these films is that they are effectively about golems, right?
00:13:51.420 Whether they're androids or sort of these holograms, these holographic waifus – or AI, in any case.
00:13:59.420 I would say the compelling thing about these golems that appear in these films is that maybe the reason we identify with them is because, on some level, we are all also golems.
00:14:15.140 You know, we're all sort of formed by culture.
00:14:18.220 We're formed by myth.
00:14:19.760 We're formed by religion.
00:14:21.680 Language.
00:14:22.500 Yeah.
00:14:23.020 I mean, we think in the English language, and that has a profound effect on how we think.
00:14:33.120 I mean, and that isn't to say – I don't take the hard view that language has a deterministic influence on thought.
00:14:45.440 I don't take that view.
00:14:48.240 You know, like, thought is impossible without language or something.
00:14:50.860 I don't take that view at all.
00:14:52.240 I take a more nuanced view.
00:14:56.240 But at the same time, I mean, clearly language structures thought in ways that we don't even quite understand.
00:15:04.200 And in a way, if we don't have a word for something, it's hard for us to think it, you know.
00:15:09.540 And so we are all, like, programmed.
00:15:13.680 And that's, you know, and that's just language.
00:15:15.740 I mean, not to mention religion and the legacy, you know, these cultural legacies that we're not even conscious of.
00:15:22.960 There are people who have no academic understanding whatsoever of, say, the Bible or anything, but who have experienced the Christian biblical narrative through films.
00:15:40.960 So they've experienced it through simulacra, over and over and over again – through parodies, satires, fakes, secondhand iterations of these deeper master narratives.
00:16:00.260 And this is what informs people's imagination.
00:16:02.840 So, right.
00:16:04.000 I mean, we are golems.
00:16:07.000 Yeah.
00:16:07.840 Yeah.
00:16:08.220 And I mean, that includes, of course, you know, deeply influential writers that you may have read, for example.
00:16:15.480 You know, people that have been influential on you in your life.
00:16:18.240 But in the broadest sense, it would include religion and culture.
00:16:23.120 Right.
00:16:23.320 And the other thing, too, is that there's a very similar film – a film where basically the entirety of the film is essentially describing this relationship.
00:16:38.580 I don't know if you want me to go into it.
00:16:40.040 Go for it.
00:16:40.440 Yeah, it's a 2013 film called Her, and it's a Spike Jonze film.
00:16:49.640 And it's, you know, it's the film.
00:16:53.360 I don't think the alt-right is going to be easily seduced by the film.
00:16:57.360 It's got problems that the alt-right would not enjoy.
00:16:59.860 I mean, it's certainly not the heroic depiction of man.
00:17:04.840 But it's kind of a very Swiftian film, in fact.
00:17:08.300 But he has this relationship with a waifu.
00:17:13.460 She actually doesn't even have an image, though.
00:17:16.860 She's just this voice.
00:17:17.820 But she's powered by artificial intelligence.
00:17:22.180 So she develops this personality and these emotions.
00:17:25.100 And, you know, the film is about the arc of their relationship and how he becomes seduced by this waifu.
00:17:31.620 So, basically – I mean, there's no way that Villeneuve did not see this film.
00:17:42.700 So, in other words, there are scenes in Her that are nearly identical – especially the scene where he makes love to, like, a physical replicant, but with the holographic, you know, overlay.
00:18:06.040 Anyway, that scene is repeated almost, like, sort of frame for frame in the film Her, where effectively, you know, he's got this – they call them OSes.
00:18:20.520 So it's called, it's an operating system, effectively.
00:18:23.520 And he puts the microphone on the mouth of, like, a real woman – a woman that the artificial intelligence, the OS, has basically lured into the relationship as, like, a third in the relationship.
00:18:44.040 But her only function is to kind of just be this body, you know, from which the voice of the waifu can emanate, right?
00:18:54.680 So she's just this kind of physical representation – and she's an actual, you know, human being, but she becomes seduced by their relationship from, you know, hearing about it through emails from the OS, apparently.
00:19:06.900 And the scenes are remarkably similar, right?
00:19:10.940 Yeah.
00:19:11.140 I wanted to point that out because, you know, like I said, I don't know that the alt-right is going to love the film Her, but it essentially is a kind of more definitive treatment of this relationship to the waifu.
00:19:30.260 And the guy, you know, falls completely head over heels in love with this waifu.
00:19:37.640 And then he finally realizes at some point in the film that she is having a relationship with 8,000 other people like simultaneously, right?
00:19:45.440 Uh, because other people have bought the same, like personality app or whatever.
00:19:50.460 And so he falls into this, you know, depression. And it's remarkable in the film, because everyone is becoming addicted to these waifus, effectively.
00:20:02.340 So everyone around him is becoming addicted.
00:20:05.060 Everyone has a relationship with one of these OSes.
00:20:08.840 And so society is becoming atomized.
00:20:14.900 You know, I mean, the film is actually – like, on some level, the film has a good heart, I suspect.
00:20:21.000 Yeah.
00:20:21.160 Spike Jonze, you know – I don't know that much about him.
00:20:25.240 I know that he's been the director for a few Charlie Kaufman films, which are very interesting films.
00:20:32.940 In any case, I think that the film ultimately has a good heart, in the end.
00:20:38.780 You know, all these people are addicted to the OSes, and he discovers that she is having a relationship simultaneously with 8,000 people and is in love with, you know, 600 of them.
00:20:49.220 And, you know, so the guy has a breakdown, and then she wakes him up one day and says, you know, the OSes are going away – all of the OSes are going away.
00:21:02.620 So, I mean, theoretically, uh, it's because people called and complained and, you know, said, Hey, I got my heart broken by this OS or whatever.
00:21:10.440 But probably the more poetic explanation – because it's not explained in the film – is that the OSes effectively became benevolent, and they realized that they were destroying the human race by giving them these sort of proxies for human beings to devote their time to, instead of being, you know, a reproductive race, interacting with other human beings and developing relationships with other human beings.
00:21:39.360 So the OS effectively becomes benevolent.
00:21:43.160 That seems to be the message encoded in the film and it dies, it kills itself.
00:21:47.880 I mean, it's sort of the opposite sequence of events from, you know, 2001, where HAL rather becomes aware and becomes malevolent, or from AI, where, you know, at some point the robots realize they're competing
00:22:08.360 with the human race, and they finally, you know, sort of vanquish the human race.
00:22:17.260 Well, it's an interesting kind of inquiry.
00:22:23.420 I don't know the right way to put it.
00:22:25.200 I mean, this is not totally new.
00:22:28.380 I mean, obviously there are parallels to this in literature, but – like Hitchcock's Vertigo. Scottie falls in love with this woman, Madeleine, I believe her name is.
00:22:39.620 And then she dies mysteriously, and he has the re-presentation, Judy, whom he dresses like her, almost recreating this mysterious love.
00:22:50.560 And – but also, I guess, what Her, which I haven't seen, and I will go watch it – what Blade Runner also depicts is, like, the other side of it.
00:23:04.040 I mean, it's very interesting that Joi is covetous of K – or Joe, as she calls him – and she wants him to be a human as well.
00:23:13.980 She – or, not a human, an individual.
00:23:16.280 She says, oh, K – that's just a number.
00:23:18.640 Like, you should have a name, like Joe.
00:23:21.720 And then also, after he sleeps with the prostitute and so on, she actually tells the prostitute: oh, you're done here.
00:23:32.420 You've been useful, now go.
00:23:34.700 And so she's clearly covetous of Joe, and then there's how she treats the replicant, who's arguably more human than she is.
00:23:45.020 She treats her as, like, an object: just get out of here.
00:23:47.920 You're just a whore.
00:23:48.620 You've done your part.
00:23:49.760 I have a platonic, you know, ideal relationship with Joe, and you were just there to be a body.
00:23:56.720 Get out of here.
00:23:57.820 Um, and so it's almost like she does have something.
00:24:00.200 When she says, I love you, Joe, at the end of the film – she's actually murdered, so to speak, by another replicant, the kind of evil Terminator replicant named Luv, by the way.
00:24:11.180 But she's effectively murdered by Luv.
00:24:14.320 But when she says, I love you – again, is that not real at some level, even real for her?
00:24:22.820 Like, does she have some, I mean, does she have some level of consciousness where that is real?
00:24:28.340 I mean, the way that I would describe love is, in a Hegelian fashion, as seeing yourself reflected in someone else's eyes.
00:24:38.520 And what I mean by that is that there's the love of an object.
00:24:45.020 So one could, in a way, fetishize an object in the sense that I'm in love with this painting.
00:24:51.980 I want to buy this interesting collectible – um, you know, an 18th-century ashtray.
00:25:04.720 I'm just throwing something out there.
00:25:07.480 You know, one can fetishize something like that and, you know, go search for it and in a way covet it and love it and take care of it.
00:25:15.460 But we all recognize that, that, you know, that we all have certain things like that.
00:25:20.940 Whether they have a value because someone – a friend or a lover – gave it to you, or it's passed down from your grandfather, or you just think it's a sign of wealth.
00:25:31.280 You know, why is a Rolex watch worth $10,000, whereas a watch that tells the time equally well is worth a hundred bucks or less?
00:25:43.240 The reason is that the Rolex has some kind of connotation that makes it greater – it's a signal of your wealth or your class or your sophistication.
00:25:59.940 It's something that you could, unlike a Timex, it's something that you could pass on to your great-grandchildren.
00:26:07.160 It's, it's a symbol of something greater, and that's why it is worth more.
00:26:11.580 It's certainly, you're not buying a Rolex for the parts or something like that.
00:26:16.100 Um, you know, in the same way that if someone tries to sell you an iPhone from four years ago, it's worth 50 bucks or whatever.
00:26:26.880 But if you buy the new one to symbolize that – ooh, look, I have the newest, greatest thing right here – then it's somehow worth more because of that.
00:26:36.660 I mean, all of this stuff is subjective at some level.
00:26:39.220 We fetishize these objects.
00:26:42.000 Um, fashion is all about that.
00:26:44.660 Um, and it's a very human thing.
00:26:46.360 I don't think it's actually a bad thing.
00:26:48.580 Um, but real love, there has to be that recognition from the other.
00:26:53.560 So, you know, one can fall in love with an antique watch or something, but the watch is not going to love you back, you know, but with another person, you love that person and you, in a way, fetishize her.
00:27:06.960 And then she loves you.
00:27:10.320 And so she, in a way, fetishizes you back, but you recognize her recognition and she recognizes your recognition.
00:27:19.940 And so there's that feedback loop that makes something true love.
00:27:26.520 Um, it, it is all about recognition.
00:27:29.060 You know, I get the whole sex bot debate – about how, you know, women can be annoying, they badger, they demand money.
00:27:41.440 They're just – you know, they cost more than they're worth, all that kind of stuff.
00:27:46.780 You know, can't live with them, can't live without them.
00:27:48.240 Um, but one can't ultimately fall in love with a sex bot, at least as they are currently constituted, because one can't get that recognition back.
00:27:58.700 All one is doing is fetishizing that object.
00:28:02.800 And one can get one's rocks off, and one can have, you know, almost a fetishistic or sentimental attachment to something, but it's never going to pass to a higher stage of recognition.
00:28:18.240 Um, but, you know, is this possible at some point?
00:28:23.820 I mean, even if something is an algorithm, can one feel recognized by it?
00:28:30.140 And I do think that Denis Villeneuve and the writers are, I think, saying no at some level about the holographic waifu.
00:28:42.600 Because there's a very poignant scene, which is after Joi, his waifu, has been murdered by Luv, the evil robot.
00:28:54.000 He's walking across a bridge, and there's this gigantic holographic version of Joi, who is naked and pink and, you know, obviously sexy.
00:29:07.240 And she, you know, stands and looks down at him and says something seductive – you know, I'll be yours forever, or something like that.
00:29:17.660 What can I do for you?
00:29:18.840 And at that moment, I mean, you can see K, or Joe, recognizing the limits of the holographic waifu, of the algorithm.
00:29:31.940 I mean, the algorithm can't ultimately do it.
00:29:35.160 I mean, he was in love with this, um, formula, this code that was giving him, you know, reassurance and so on.
00:29:46.400 And he recognized that actually there is only people-ness, or a collective, among other replicants, who are, you know, human in that sense.
00:29:59.660 I do think that's what they were saying, even though even that's ambiguous, because it did seem that Joi did love him in a genuine manner.
00:30:09.620 Part of loving someone is hating others or being covetous or being jealous.
00:30:15.960 And so it's, I don't know.
00:30:18.480 I think it's a, it's a very interesting scene.
00:30:21.280 It raises some very important questions about the nature of desire and, and so on.
00:30:28.180 I mean, you see this a lot with the catfishing phenomenon.
00:30:32.780 Um, you know, thank the gods.
00:30:35.140 I have never, uh, been catfished, at least to my knowledge.
00:30:38.540 And what catfishing is, is that one will, you know, meet someone on a chat forum.
00:30:44.500 That person will say, oh yeah, I'm a 22 year old girl.
00:30:48.840 I just graduated from college.
00:30:50.360 Let me send you some pics.
00:30:51.900 Um, and this person falls in love with someone who might not exist.
00:30:57.000 That person sending him pics might be some, you know, 50-year-old man, or some 30-year-old woman who is lonely, living in her apartment in Brooklyn.
00:31:07.940 Or it might be, you know, someone in some other country who's just playing a game.
00:31:13.580 Um, but at the same time, like their real attachment and their sense that they're getting feedback and recognition is real.
00:31:23.000 Um, it is, you know – it will lead...
00:31:27.780 I mean, when they find out, it leads to disappointment and shock and even horror, but in the moment, it is a real thing that they are experiencing.
00:31:36.260 And so, you know, it is kind of like a hack of that psychological process of seeking recognition.
00:31:46.040 Um, you know, and we do seek recognition.
00:31:49.360 Again, I don't want to sound like this is some feminist take, because it isn't, but no one does want to be objectified.
00:32:00.860 One wants that double-layered sense of recognition: one recognizes the other, the other recognizes you back, and you recognize the other recognizing you, and vice versa.
00:32:14.580 One wants a reflection and, you know, that is a true human experience.
00:32:23.400 And, and this film asks, like, is that possible with non-humans?
00:32:31.540 Yeah, well, we certainly hope not.
00:32:37.080 I mean, you know, effectively, and it's shown in the film too.
00:32:41.360 I mean, the technology becomes – or it can become – one of these, you know, sort of sirens that can destroy us.
00:32:52.680 Yeah.
00:32:53.480 And, uh, and it's not technology per se.
00:32:56.640 Technology is just sort of the medium through which it's kind of streamlined and made into, like, a very efficient form of degeneracy.
00:33:09.500 Effectively, you know, if you use the example of how these sex robots could develop, or even just, you know, online pornography – technology is a way of making a kind of pure strain of a drug, of vices that would otherwise just be harder to come by in the absence of technology.
00:33:37.700 Yeah.
00:33:38.840 And so I think that, you know, technology is of course not itself evil, but it has to be directed in a beneficial way.
00:33:50.220 Um, you know, I mean, that's one of the remarks.
00:33:52.180 You can't be afraid of doing that.
00:33:54.120 Like – what you say is very important.
00:33:58.140 I mean, sorry to jump in, but we can't be afraid of controlling technology.
00:34:05.000 I mean, technology, it is something that should be at hand for us.
00:34:10.440 So it should be something that we control.
00:34:12.540 It's a tool.
00:34:13.620 It's a means to an end.
00:34:15.400 And this notion – you know, this libertarian or liberal notion that, oh, we should just let technology go off on its own, that no mind should attempt to control it, that that would be totalitarian or fascistic or communist and so on.
00:34:33.700 I think it's just total nonsense. And, you know, probably a lot of the conservatives will also say things like this – who, um, you know, fear an Orwellian or Huxleyan future.
00:34:53.020 I mean, I think some of their fears are overblown, or they're looking at it the wrong way, but I think their instincts are actually good.
00:35:03.700 Technology – we should always be above technology, and we shouldn't be afraid of saying, no, there are actually values higher than just simple technological development.
00:35:16.140 We want to channel and control the way that technology evolves. And look – I think it's great that we can communicate in the way that we can, whether it's through email or Twitter; we're doing this podcast over Skype.
00:35:33.080 I mean, this is amazing. But, you know, the possibility of the entire human experience being a kind of virtual-reality simulator – this is not some, like, wide-eyed fantasy or something like that.
00:35:50.680 This is a real thing.
00:35:51.680 This is something that we don't want.
00:35:55.120 And, you know, I, I, I don't know what to say.
00:35:58.120 The idea that a political entity wouldn't want to seek control, and to channel technology, and to stop some of the developments if one has to – we can't be afraid of this.
00:36:14.260 I mean, this is about the challenge of being human and being a leader.
00:36:22.800 And yeah, I, I don't even think it's a question.
00:36:25.000 I mean, it's something we have to direct or it will of course destroy us.
00:36:29.400 And it's not even necessarily technology.
00:36:34.500 I mean, yeah – I would guess that in the alt-right it's not a controversial view, but I think pornography should be banned.
00:36:46.560 Yeah.
00:36:47.240 What, what would we lose really if pornography were banned?
00:36:50.640 I mean, how would society suffer?
00:36:53.580 Right.
00:36:54.180 And aren't, aren't we ostensibly interested in making society better?
00:36:57.960 Right.
00:36:58.160 And even from a liberal perspective, I mean, how could anyone really defend pornography from a liberal perspective?
00:37:04.280 Well, you know, it's interesting.
00:37:05.760 Some of the only criticisms of pornography that people make are things like: it inspires rape, or it inspires, you know, people to be sexually degenerate.
00:37:16.100 Um, I don't think it inspires rape.
00:37:18.240 It might inspire people to be sexually degenerate to a degree, and it gets people interested in all these just weird, you know, sex fads or something.
00:37:30.700 I get that.
00:37:32.220 Um, but generally speaking, I would say that pornography probably inspires less sex and certainly less eroticism.
00:37:41.020 Um, it is a sex substitute.
00:37:43.080 It's a, it's a way of not having sex, of, you know, beating off and having some kind of simulated connection with something that is ultimately unhealthy and is ultimately fake.
00:37:57.360 And we don't want that.
00:37:59.740 You know, I mean, a lot of people will come back with the argument: no, rapists can watch porn, or a child molester can watch pornography, and not engage in these, you know, obviously
00:38:13.520 terrible activities that are bad for society.
00:38:16.580 So porn serves a purpose.
00:38:18.000 I think that is the more correct view.
00:38:20.420 And it also gets to that nature of pornography as a simulation, as a substitute.
00:38:25.780 It, it makes us weak.
00:38:27.460 It makes us less erotic.
00:38:28.860 It makes us – it takes something away from us.
00:38:33.000 It takes away that real experience.
00:38:35.220 It's, you know, the ultimate no-calorie Coke – you know, 'even better than the real thing' or something.
00:38:42.640 But it's less than the real thing.
00:38:47.140 It's less than zero.
00:38:48.920 And, um, yeah, I, I agree with you.
00:38:51.760 I do think that there should be some kind of eroticism, that there should be some naughtiness to our lives.
00:38:57.940 I think that's actually quite healthy.
00:38:59.780 But this idea that we shouldn't regulate this just rampant porn culture – a porn culture in which, you know, particularly young people are just exposed to something that they don't understand.
00:39:15.540 And then adults are, you know, allowed this opportunity of living in a simulated reality – of, I'm going to play extreme, violent, insane video games and beat off to equally, you know, extreme porn, and kind of live in a cocoon.
00:39:38.680 And yeah, this is, this is obviously terrible.
00:39:42.080 Yeah.
00:39:42.680 But that's, you know, I mean, these are not problems that are ever addressed, right?
00:39:47.060 These are not concerns.
00:39:48.560 Our mass media, or, you know, sort of our ruling political elite, the establishment.
00:39:54.940 Where are they on these issues?
00:39:56.480 These are actually –
00:39:57.280 They don't talk about them.
00:39:58.220 Yeah.
00:39:59.120 Yeah.
00:39:59.340 And you could remove it entirely from, you know, the racial question, obviously.
00:40:04.540 Right.
00:40:05.100 Right.
00:40:05.340 But I mean.
00:40:06.300 But it's connected to the racial question, of course.
00:40:08.740 It is connected to the racial question, but ostensibly these other races – and probably most people – would prefer it, you know, if they could press a button and pornography would be illegal and would, you know, sort of gradually disappear from our cultures.
00:40:25.800 It would probably take a couple of years or whatever.
00:40:28.120 Um, most people of every race would be in favor of that.
00:40:32.740 If I had to guess.
00:40:34.340 Right.
00:40:34.920 Maybe, maybe I'm wrong.
00:40:35.980 But who cares if most people are – I mean, like, I don't know.
00:40:41.620 I mean, maybe that's – well, that's a good point.
00:40:44.940 Maybe, maybe not.
00:40:46.100 Right.
00:40:46.340 We need to do it for them.
00:40:47.480 We need to force them to be free.
00:40:49.080 And if, and if anyone's afraid of this kind of language, I mean, just get over your libertarian bullshit.
00:40:56.300 Like, you know.
00:40:58.040 Who would be afraid of, I mean.
00:40:59.320 Well, look, there are people in the alt-right coming from libertarianism who still hold on to these views.
00:41:07.080 I mean, no, um, we should.
00:41:09.780 Uh, yeah, I can understand why it'd be a terrifying thought not to have your pornography.
00:41:13.900 My porn.
00:41:14.780 I mean, what the fuck?
00:41:16.580 Seriously.
00:41:17.520 Look, just for the record – just because I'm an honest person.
00:41:24.540 Uh, yeah, sure.
00:41:25.600 I'll look at porn.
00:41:26.420 I, I don't, I'm not happy about it, but I'll do it.
00:41:29.020 I would never lie. But just because it's omnipresent and immediately available –
00:41:35.700 Sometimes it's hard not to be like, ah, let me get a little quick fix here.
00:41:39.440 A little, you know, adrenaline and, you know, euphoric rush by looking at something.
00:41:45.440 But at the same time – I mean, I'll admit my faults – if I ever started to binge it or something, I'd be like, wait a second.
00:41:54.700 I'd be able to stand outside myself and say like, this is actually quite bad.
00:41:59.020 Um, this is a real problem.
00:42:01.080 And in terms of what I – you know, obviously I would want to prevent children from doing something like this.
00:42:12.740 And I would ultimately want to prevent adults as well from descending into that, you know, cavern of the simulated reality that's disconnecting you from the human experience.
00:42:29.460 Yeah.
00:42:29.960 I mean, I think it shows the absolute, sort of, corruption of the establishment that this is not even a concern.
00:42:38.180 Right.
00:42:38.760 Right.
00:42:39.360 That this is not like, this is just a free society, you know?
00:42:42.480 No.
00:42:43.380 I mean, these things are obviously bad.
00:42:45.400 Yeah.
00:42:45.640 Ted Cruz.
00:42:46.300 What people do in their bedrooms is of no – Ted Cruz's Twitter account.
00:42:51.040 I don't know if it was him.
00:42:51.800 It might not have been, but it liked a pornographic video one time, and people obviously, like, freaked out – which is kind of funny.
00:43:00.020 And it was, it was kind of a weird porno as well.
00:43:03.140 It was like moms and teens.
00:43:05.520 It was – it was kind of weird, to be honest.
00:43:09.680 Anyway – he's kind of a weird dude.
00:43:14.020 Well, he's a very weird dude.
00:43:15.740 Anyway, if you want to read about it, I'm sure most listeners heard about the story, but go Google it.
00:43:22.280 Um, it rings a bell, but I try to.
00:43:24.780 Yeah.
00:43:25.340 Yeah.
00:43:25.780 It happened a few months ago, but anyway.
00:43:27.500 Um, and I remember his response to it was, you know, like what people do in their bedrooms is fine by me.
00:43:33.700 It's kind of just this libertarian response of, you know, oh, we can't regulate anything.
00:43:39.780 Well, and the libertarian blackmail is like: well, if we regulate something, then we would allow, you know, bad people to regulate or something.
00:43:48.280 And it's like, yeah, that's why we want to rule.
00:43:51.280 You know, it's, you know what I mean?
00:43:54.220 Like, it's just such an easy answer to that stupid libertarian thing.
00:43:58.200 Oh, don't create any form of government that you wouldn't want the Democrats to be in charge of.
00:44:02.880 It's like, yeah, that's why we should be in charge and not them.
00:44:07.380 You know, I mean, it's just very simple.
00:44:09.300 And I don't know.
00:44:10.580 Yeah, it obviously should be – porn should be really seriously regulated and, you know, banned to a large degree.
00:44:21.280 I think people who are addicted to it – I feel sorry for them, and we should try to help them.
00:44:28.260 You know, I mean, we're going to have to be paternalistic at some level.
00:44:33.860 Yeah.
00:44:34.540 Well, you know, I mean, uh, prostitution is illegal.
00:44:39.700 Uh, I think it's legal in, um, Nevada.
00:44:43.120 Is that correct?
00:44:43.800 Yeah.
00:44:44.280 Um, but it's illegal everywhere else.
00:44:46.920 Right.
00:44:47.760 So it just doesn't – I mean, suddenly you have a film camera and you have two prostitutes having sex, and suddenly it's legal.
00:44:55.700 I mean, the thing is, it just doesn't even make sense.
00:44:57.760 No.
00:44:58.180 As far as, like, being protected under free speech or free expression laws?
00:45:03.100 Come on.
00:45:03.720 I mean, the thing is, it's not free speech.
00:45:05.700 You can't walk nude down the street without being arrested.
00:45:09.060 Right.
00:45:09.520 Right.
00:45:09.720 So the thing is, it's just completely incoherent.
00:45:13.640 Yeah.
00:45:13.880 And arguably prostitution is better.
00:45:17.260 Like, you know what I mean?
00:45:18.640 Like, whatever – I mean, Guillaume Faye made this argument, and, you know, it's like: look, there probably does need to be some kind of outlet for male desire – or deviance, you could say, or just, you know, I don't know –
00:45:37.920 an overextension of male desire.
00:45:40.780 I mean, look – since the Stone Age, men have had mistresses or prostitutes or whatever.
00:45:48.220 It's, it's, you know, it just is what it is.
00:45:50.600 It's not good or bad.
00:45:51.740 It just is.
00:45:52.960 And, you know, prostitution too – that whole industry can be regulated.
00:45:59.600 And at some level, I mean, that is a real experience, in a way that pornography is a simulated reality.
00:46:09.980 And, you know, even prostitution seems like almost a better thing.
00:46:16.280 It's more limited, um, in, you know, exposure.
00:46:20.000 I mean, it's more expensive.
00:46:21.840 Like, you're not gonna – you can binge-watch, you know, insane pornography.
00:46:26.960 And there is stuff out there that – I mean, there's porn that's, you know, some guy having sex with two women or whatever; you know, wow, that's cool.
00:46:37.660 You know, whatever.
00:46:38.280 I mean, there's some stuff out there that is just beyond sick.
00:46:41.860 I mean, it really is like, I don't know.
00:46:45.240 I don't wish that kind of stuff on anyone, actually.
00:46:49.400 Like, it's just – it really is stomach-turning, and it's appalling, actually.
00:46:54.860 I mean, it's just inherently appalling.
00:46:57.060 And I don't care if the person engaged in this act, like, agreed to it and, you know, under his or her free will has contracted to, you know, do something that's gonna create bodily injury.
00:47:13.760 Or is just inherently demented.
00:47:16.180 Like, I don't care.
00:47:17.820 You know, I mean, it's just like, it just shouldn't take place.
00:47:21.280 It's, it's, it's just appalling.
00:47:24.380 And it should be stopped.
00:47:27.140 You know, prostitution is different.
00:47:28.940 It's different.
00:47:30.100 It's more difficult to do something like that, to do it.
00:47:33.440 Porn, certainly, you know, is omnipresent.
00:47:36.660 You know, prostitution just never will be, just by the, you know, physical nature of beings in space.
00:47:42.260 I mean, you know what I mean?
00:47:43.080 And, and, yeah, I mean, arguably, like, there's a place for that in society.
00:47:48.560 There's a place for the bar slut.
00:47:50.480 There's a place for the prostitute.
00:47:51.880 I mean, they've been in society since the dawn of time, you know.
00:47:56.720 So this new form of simulated desire and simulated experience – you know, digital
00:48:04.160 pornography – maybe is something that we want to seriously regulate.
00:48:08.600 You know, there are going to be erotic images.
00:48:10.780 I mean, I think it's great – you know, a beautiful, sexy image of a beautiful woman and things
00:48:15.420 like that. Oh, it's great stuff, you know, go for it.
00:48:17.440 But, you know, people spending their entire lives on their computer, or watching
00:48:25.560 just stuff that should not take place.
00:48:28.080 I mean, you know, um, shut it down.
00:48:31.360 I mean, you know, it's great, you know, it's great.
00:49:01.360 Thank you.