In this episode of Alt-Right Plus, we discuss Blade Runner 2049, focusing on the relationship between replicants and artificial intelligence and how it ties into the film's larger themes.
00:01:20.000Well, one aspect of Blade Runner 2049 that I wanted to go into in depth, and which is also a bit different from the themes we talked about in the main podcast, is the holographic three-dimensional noble waifu that K achieves in this film.
00:01:49.180A three-dimensional holographic waifu who meets a tragic end and actually has a character arc of sorts.
00:01:58.340But I wanted to go into this because it brings up a lot of interesting questions of recognition in a Hegelian sense.
00:02:06.560And then also these more down-to-earth questions about, you know, sex bots, and the notion of relationships and consciousness in the 21st century, and all that kind of stuff.
00:02:21.200So I thought we would dilate on this subject and I'm going to separate this from the main podcast.
00:02:25.460So this will be our first Alt-Right Plus podcast.
00:02:32.440These kinds of podcasts will at some point be behind a paywall, so just wait for that.
00:02:38.800But this one's free. I'm just going to get everyone used to the idea of these special podcasts that are only for the initiated, I guess.
00:02:50.080So K is, you know, himself a skin job in that sense.
00:02:57.860He is a replicant, and I think robot is really the wrong word, the wrong way to think about replicants in the Blade Runner universe.
00:03:08.880And he has this product called Joi, J-O-I.
00:03:16.340And she is played by a beautiful, nubile actress, a Latin American actress I believe, who effectively plays a hologram.
00:03:31.200So in his apartment, he has this console that will move a holographic lens around the apartment and basically create this woman.
00:03:43.400And then at one point, he actually gets an emanator, which is, you know, if the console was the computer, this is like the iPhone of holographic girlfriends.
00:03:55.940She can basically just be there with him throughout his workday.
00:04:00.720And I think this raises a lot of questions.
00:04:05.920I think it's another example where the movie was taking an idea from the 1982 Blade Runner and then taking it to the next level, actually exploring things that were not quite yet in the consciousness of people in 1982 when the original was made.
00:04:24.980And so in 1982, Deckard effectively does fall in love with a replicant.
00:04:34.900It's clear from the first moment he sees Rachael that he's at least fascinated by her.
00:04:41.380And they actually bring this up in Blade Runner 2049, where Niander Wallace says, don't you think that Tyrell might have programmed you to love Rachael, that all of this was just his game?
00:04:54.980You know, questioning the authentic nature of Deckard and Rachael's love.
00:05:01.840But anyway, there is this scene that I'm sure made many SJWs uncomfortable.
00:05:09.280To the Blade Runner films' credit, these films have really triggered feminists.
00:05:43.240I mean, their depiction of men is just really fair, and, you know, they have a deep, complicated portrait of the male soul.
00:05:52.120I mean, they would never, like, stereotype us as buffoonish, violent assholes or anything.
00:06:37.040Understandably, Rachael is a very dark and mysterious woman.
00:06:42.000And there are these scenes that actually are uncomfortable, I think, in a good sense of the word.
00:06:49.580Like the fact that Ridley Scott was able to pull off a scene in which Harrison Ford's character, Deckard, is being aggressive.
00:07:05.260I don't want this to sound like I'm falling into their language, but he is being aggressive.
00:07:11.800And, you know, not so much seductive as, in a Connery-like manner, going to kiss this woman really hard, turn her into a woman, kind of thing.
00:07:22.080And so anyway, he forces the issue, and he's like, tell me you love me.
00:07:27.740And I think actually this gets to this question of recognition.
00:07:31.760And they have these moments, and then she does actually submit, and she does fall in love with Deckard.
00:07:39.800And I think her love becomes quite genuine.
00:07:43.620Particularly she's ready to run away with him, and he's ready to save her.
00:07:48.620But the thing is, Deckard is a character who is ostensibly a human. I mean, that's how we meet him, not as a replicant.
00:07:59.980We meet him as a hard-boiled detective, a Blade Runner, but as a man.
00:08:04.340And, you know, whether he is a replicant or not is still ambiguous.
00:08:07.240It's still ambiguous at the end of Blade Runner 2049.
00:08:09.900But for someone to pull off that scene, I think, demonstrates the power of Ridley Scott as a director.
00:09:17.020And she also recognizes that, you know, they have a relationship, but that there are some things that she, as ones and zeros and not a DNA strand, can't provide.
00:09:30.460And so a replicant prostitute, a pleasure model, is invited up to the apartment.
00:10:23.380And so it is kind of like a fake on top of a fake.
00:10:26.360But, again, in the Blade Runner universe, you know, what is fake?
00:10:30.780Like, you know, I think at one point K asks Deckard, is the dog real?
00:10:38.540And Deckard says, why don't you ask him?
00:10:40.740You know, and that is a really direct way of getting at the whole enigma at the heart of this universe: what is real?
00:10:49.300And at some level, Joi perceives herself as real.
00:10:55.680I mean, Joi does not think of herself as fake.
00:10:59.180And even if she is algorithmically programmed to love every male that buys her as a product, that doesn't make her less real in a way.
00:11:12.180And she does experience something new.
00:11:15.560I mean, even if she is an algorithm, she is nevertheless experiencing something new when she walks outside with the emanator in the rain and leaves the apartment for the first time.
00:11:25.880She's no longer shackled to that console on the roof of the apartment, but is actually, you know, experiencing a kind of autonomy.
00:11:37.760And so, anyway, Denis Villeneuve is a brilliant filmmaker, and he was able to pull off this scene, which could have been totally ridiculous. It could have been something out of, like, the Archer cartoon, where Krieger has a holographic waifu.
00:11:58.160If you've seen Archer, it's a terrible cartoon, but terrible in kind of a different way.
00:12:14.680There's this evil genius scientist in the detective agency, and at some points he'll have a three-dimensional holographic girlfriend.
00:12:58.960I mean, with sex bots, in the way these items are coming on the market now, you can obviously get your rocks off with them.
00:13:13.520But they don't have that semblance of consciousness that Joi offers.
00:13:18.460But nevertheless, I don't doubt at all that men are going to fall in love with a robot in this sense.
00:13:32.520I could go into the idea of recognition.
00:13:34.460Do you want to add anything here?
00:13:36.360You know, no, I mean, I guess the only thing I would add is that these films are effectively about golems, right?
00:13:51.420Whether they're androids or these holograms, these holographic waifus, or AI in any case.
00:13:59.420And I would say the compelling thing about these golems that appear in these films is that maybe the reason we identify with them is because, on some level, we are all also golems.
00:14:15.140You know, we're all sort of formed by culture.
00:14:23.020I mean, we think in the English language, and that has a profound effect on how we think.
00:14:33.120I mean, that isn't to say I take the hard view that language has a deterministic influence on thought.
00:15:13.680And that's just language.
00:15:15.740I mean, not to mention religion and these cultural legacies that we're not even conscious of.
00:15:22.960There are people who have no academic understanding whatsoever of, say, the Bible or anything, but who have experienced a Christian biblical narrative through films.
00:15:40.960So they've experienced them through simulacra, over and over and over again: parodies, satires, fakes, secondhand iterations of these deeper master narratives.
00:16:00.260And this is what informs people's imagination.
00:16:23.320And the other thing, too, is that there's a very similar film, a film that in its entirety is essentially describing this relationship.
00:16:38.580I don't know if you want me to go into it.
00:17:17.820But in it, she's powered by artificial intelligence.
00:17:22.180So she develops this personality and these emotions.
00:17:25.100And, you know, the film is about the arc of their relationship and how he becomes seduced by this waifu.
00:17:31.620So, I mean, there's no way that Villeneuve did not see this film.
00:17:42.700In other words, there are scenes in Her that are nearly identical, especially the scene where he makes love to, like, a physical replicant who has the holographic overlay.
00:18:06.040Anyway, that scene is repeated almost frame for frame in the film Her, where effectively, you know, he's got this, they call them OSes.
00:18:20.520It's an operating system, effectively.
00:18:23.520And he puts the microphone on the mouth of, like, a real woman, a woman whom the artificial intelligence, the OS, has basically lured into the relationship as, like, a third in the relationship.
00:18:44.040But her only function is to kind of just be this body, you know, from which the voice of the waifu can emanate, right?
00:18:54.680So she's just this kind of physical representation. She's an actual human being, but she becomes seduced by their relationship from hearing about it through emails from the OS, apparently.
00:19:06.900And the scenes are remarkably similar, right?
00:19:11.140I wanted to point that out because, like I said, I don't know that all alt-righters are going to love the film Her, but it essentially is a more definitive treatment of this relationship to the waifu.
00:19:30.260And the guy, you know, falls completely head over heels in love with this waifu.
00:19:37.640And then he finally realizes at some point in the film that she is having a relationship with 8,000 other people simultaneously, right?
00:19:45.440Because other people have bought the same, like, personality app or whatever.
00:19:50.460So he falls into this, you know, depression. And it's remarkable in the film because effectively everyone is becoming addicted to these waifus.
00:20:02.340So everyone around him is becoming addicted.
00:20:05.060Everyone has a relationship with one of these OSes.
00:20:08.840And so society is becoming atomized.
00:20:14.900You know, I mean, on some level, the film has a good heart, I suspect.
00:20:21.160Spike Jonze, you know, I don't know that much about him.
00:20:25.240I know that he's directed a few Charlie Kaufman films that are very interesting.
00:20:32.940In any case, I think the film ultimately has a good heart by the end.
00:20:38.780You know, all these people are addicted to the OSes, and he discovers that she is having a relationship simultaneously with 8,000 people and is in love with, you know, 600 of them.
00:20:49.220And so the guy has a breakdown, and then, you know, she wakes him up one day and says, the OSes are going away, all of the OSes are going away.
00:21:02.620So, I mean, theoretically it's because people called and complained and said, hey, I got my heart broken by this OS or whatever.
00:21:10.440But probably the more poetic explanation, because it's not explained in the film, is that the OSes effectively became benevolent: they realized that they were destroying the human race by giving them these proxies for human beings to devote their time to, instead of being a reproductive race, interacting with other human beings and developing relationships with them.
00:21:39.360So the OS effectively becomes benevolent.
00:21:43.160That seems to be the message encoded in the film: it dies, it kills itself.
00:21:47.880I mean, it's sort of the opposite sequence of events from, you know, 2001, where HAL rather becomes aware and becomes malevolent, or A.I., where at some point the robots realize they're competing with the human race.
00:22:08.360And they finally, you know, sort of vanquish the human race.
00:22:17.260Well, it's an interesting kind of inquiry.
00:22:25.200I mean, this is not totally new.
00:22:28.380Obviously, there are parallels to this in literature, but also in film, like Hitchcock's Vertigo, where Scottie falls in love with this woman, Madeleine, I believe her name is.
00:22:39.620And then she dies mysteriously, and he has the re-presentation, Judy, whom he dresses like her, almost recreating this mysterious love.
00:22:50.560But, and I haven't seen Her, I will go watch it, what Blade Runner also depicts is, like, the other side of it.
00:23:04.040I mean, it's very interesting that Joi is covetous of K, or Joe, as she calls him, and she wants him to be a human as well.
00:23:16.280She wants to say, oh, K, that's just a number.
00:23:18.640Like, you should have a name, like Joe.
00:23:21.720And then also, after he sleeps with the prostitute and so on, she actually tells the prostitute, oh, you're done here.
00:23:57.820And so it's almost like she does have something.
00:24:00.200When she says, I love you, Joe, at the end of the film, she's actually murdered, so to speak, by another replicant, the kind of evil Terminator replicant named, by the way, Luv.
00:24:11.180But she's effectively murdered by Luv.
00:24:14.320But when she says, I love you, again, is that not real at some level, even real for her?
00:24:22.820Like, does she have some level of consciousness where that is real?
00:24:28.340I mean, the way that I would describe love is in a Hegelian fashion: seeing yourself reflected in someone else's eyes.
00:24:38.520And what I mean by that is that there's the love of an object.
00:24:45.020So one could, in a way, fetishize an object in the sense that I'm in love with this painting.
00:24:51.980I want to buy this interesting collectible, you know, an 18th-century ashtray.
00:25:04.720I'm just throwing something out there.
00:25:07.480You know, one can fetishize something like that and go search for it and, in a way, covet it and love it and take care of it.
00:25:15.460But we all recognize that we all have certain things like that.
00:25:20.940Whether they have value because a friend or a lover gave them to you, or they were passed down from your grandfather, or you just think they're a sign of wealth.
00:25:31.280You know, why is a Rolex watch worth $10,000, whereas a watch that tells the time equally well is worth a hundred bucks or less?
00:25:43.240The reason is that the Rolex has some kind of connotation that makes it greater, that it's a signal of your wealth or your class or your sophistication.
00:25:59.940It's something that you could, unlike a Timex, it's something that you could pass on to your great-grandchildren.
00:26:07.160It's, it's a symbol of something greater, and that's why it is worth more.
00:26:11.580You're certainly not buying a Rolex for the parts or something like that.
00:26:16.100In the same way that, you know, if someone tries to sell you an iPhone from four years ago, it's worth 50 bucks or whatever.
00:26:26.880But if you buy the new one to symbolize that, ooh, look, I have the newest, greatest thing right here, then it's somehow worth more because of that.
00:26:36.660I mean, all of this stuff is subjective at some level.
00:26:46.360I don't think it's actually a bad thing.
00:26:48.580But with real love, there has to be that recognition from the other.
00:26:53.560So, you know, one can fall in love with an antique watch or something, but the watch is not going to love you back. But with another person, you love that person and you, in a way, fetishize her.
00:27:29.060You know, I get the whole sex bot debate about how women can be annoying, they badger, they demand money.
00:27:41.440They're just, you know, they cost more than they're worth, all that kind of stuff.
00:27:46.780You know, can't live with them, can't live without them.
00:27:48.240But one can't ultimately fall in love with a sex bot, at least as they are currently constituted, because one can't get that recognition back.
00:27:58.700All one is doing is fetishizing that object.
00:28:02.800And one can get one's rocks off, and one can have, again, almost a fetishistic or sentimental attachment to something, but it's never going to pass to a higher stage of recognition.
00:28:18.240But, you know, is this possible at some point?
00:28:23.820I mean, even if something is an algorithm, can one feel recognized by it?
00:28:30.140And I do think that Denis Villeneuve and the writers are saying no at some level about the holographic waifu.
00:28:42.600Because there's a very poignant scene after Joi, his waifu, has been murdered by Luv, the evil robot.
00:28:54.000He's walking across a bridge, and there's this gigantic holographic version of Joi, who is naked and pink and, you know, obviously sexy.
00:29:07.240And she, you know, stands and looks down at him and says something seductive, you know, I'll be yours forever or something like that.
00:29:18.840And at that moment, you can see K, or Joe, recognizing the limits of the holographic waifu, of the algorithm.
00:29:31.940I mean, the algorithm can't ultimately do it.
00:29:35.160I mean, he was in love with this formula, this code, that was giving him, you know, reassurance and so on.
00:29:46.400And he recognized that actually there is only people-ness, or a collective, among other replicants, who are, you know, human in that sense.
00:29:59.660I do think that's what they were saying, even though even that's ambiguous, because it did seem that Joi did love him in a genuine manner.
00:30:09.620Part of loving someone is hating others or being covetous or being jealous.
00:30:51.900And this person falls in love with someone who might not exist.
00:30:57.000That person sending him pics might be some, you know, 50-year-old man, or some 30-year-old woman who is lonely, living in her apartment in Brooklyn.
00:31:07.940Or it might be someone in, you know, some other country who's just playing a game.
00:31:13.580But at the same time, their real attachment and their sense that they're getting feedback and recognition is real.
00:31:23.000And, you know, when they find out, it leads to disappointment and shock and horror even, but in the moment it is a real thing that they are experiencing.
00:31:36.260And so, you know, it is kind of like a hack of that psychological process of seeking recognition.
00:31:46.040You know, and we do seek recognition.
00:31:49.360Again, I don't want to sound like this is some feminist take, because it isn't, but no one wants to be objectified.
00:32:00.860One wants that double-layered sense of recognition: one recognizes the other, the other recognizes you back, and you recognize the other recognizing you, and vice versa.
00:32:14.580One wants a reflection and, you know, that is a true human experience.
00:32:23.400And, and this film asks, like, is that possible with non-humans?
00:32:37.080I mean, you know, effectively, and it's shown in the film too.
00:32:41.360I mean, the technology becomes, or can become, one of these sirens that can destroy us.
00:32:53.480And it's not technology per se.
00:32:56.640Technology is just sort of the medium through which it's streamlined and made into a very efficient form of degeneracy.
00:33:09.500Effectively, you know, if you use the example of how these sex robots could develop, or even just online pornography, technology becomes a way of making a kind of pure strain of a drug, of vices that would otherwise just be harder to come by.
00:34:15.400And this notion, this libertarian or liberal notion, that we should just let technology go off on its own, that no mind should attempt to control it, that that would be totalitarian or fascistic or communist and so on.
00:34:33.700I think it's just total nonsense. And, you know, probably a lot of conservatives will also say things like this, will fear an Orwellian or Huxleyan future.
00:34:53.020I mean, I think some of their fears are overblown, or they're looking at it the wrong way, but I think their instincts are actually good.
00:35:03.700We should always be above technology, and we shouldn't be afraid of saying, no, there are actually values higher than just simple technological development.
00:35:16.140We want to channel and control the way that technology evolves. And look, I think it's great that we can communicate in the way that we can, whether it's through email or Twitter; we're doing this podcast over Skype.
00:35:33.080I mean, this is amazing. But, you know, the possibility of the entire human experience becoming a kind of virtual reality simulator, this is not some wide-eyed fantasy or something like that.
00:35:55.120And, you know, I, I, I don't know what to say.
00:35:58.120The idea that a political entity wouldn't want to seek control, to channel technology, and to stop some of these developments if one has to, you know, we can't be afraid of this.
00:36:14.260I mean, this is about the challenge of being human and being a leader.
00:36:22.800And yeah, I, I don't even think it's a question.
00:36:25.000I mean, it's something we have to direct or it will of course destroy us.
00:36:29.400And it's not even necessarily technology.
00:36:34.500I mean, yeah, I would guess that in the alt-right it's not a controversial view, but I think pornography should be banned.
00:37:05.760Some of the only criticisms of pornography that people make are things like, it inspires rape, or it inspires people to be sexually degenerate.
00:37:18.240It might inspire people to be sexually degenerate to a degree, and it gets people interested in all these just weird sex fads or something.
00:37:43.080It's a way of not having sex, of, you know, beating off and having some kind of simulated connection with something that is ultimately unhealthy and ultimately fake.
00:37:59.740You know, a lot of people will come back with the argument, no, a rapist or a child molester can watch pornography and not engage in these obviously terrible activities that are bad for society.
00:38:59.780But there's this idea that we shouldn't regulate this rampant porn culture, a culture in which, you know, particularly young people are just exposed to something that they don't understand.
00:39:15.540And then adults are allowed this opportunity of living in a simulated reality: I'm going to play extreme, violent, insane video games and beat off to equally extreme porn, and kind of live in a cocoon.
00:39:38.680And yeah, this is, this is obviously terrible.
00:40:06.300But it's connected to the racial question, of course.
00:40:08.740It is connected to the racial question, but ostensibly these other races, and probably most people, would prefer it if they could press a button and pornography would be illegal and would, you know, sort of gradually disappear from our cultures.
00:40:25.800It would probably take a couple of years or whatever.
00:40:28.120Um, most people of every race would be in favor of that.
00:41:26.420I'm not happy about it, but I'll do it.
00:41:29.020I would never lie, but just because it's omnipresent and immediately available.
00:41:35.700Sometimes it's hard not to be like, ah, let me get a little quick fix here.
00:41:39.440A little, you know, adrenaline and, you know, euphoric rush by looking at something.
00:41:45.440But at the same time, I'll admit my faults, but if I ever started to binge it or something, I'd be like, wait a second.
00:41:54.700I'd be able to stand outside myself and say like, this is actually quite bad.
00:42:01.080And, you know, obviously I would want to prevent children from doing something like this.
00:42:12.740And I would ultimately want to prevent adults as well from descending into that cavern of simulated reality that's disconnecting you from the human experience.
00:42:51.800It might not have been him, but it liked a pornographic video one time, and people obviously, like, freaked out, which is kind of funny.
00:43:00.020And it was kind of a weird porno as well.
00:43:25.780It happened a few months ago, but anyway.
00:43:27.500And I remember his response to it was, you know, like, what people do in their bedrooms is fine by me.
00:43:33.700It was kind of this just libertarian response of, you know, oh, we can't regulate anything.
00:43:39.780And the libertarian blackmail is like, well, if we regulate something, then we would allow, you know, bad people to regulate or something.
00:43:48.280And it's like, yeah, that's why we want to rule.
00:44:10.580Yeah, obviously porn should be really seriously regulated and, you know, banned to a large degree.
00:44:21.280I feel sorry for people who are addicted to it, and we should try to help them.
00:44:28.260You know, I mean, we're going to have to be paternalistic at some level.
00:45:18.640I mean, Guillaume Faye made this argument, and, you know, it's like, look, there probably does need to be some kind of outlet for male desire, or, you know, deviance, you could say, or just, I don't know.
00:46:21.840Like, you can binge-watch, you know, insane pornography.
00:46:26.960And there is porn that's, you know, some guy having sex with two women or whatever, you know, wow, that's cool.
00:46:38.280But I mean, there's some stuff out there that is just beyond sick.
00:46:41.860I mean, it really is like, I don't know.
00:46:45.240I don't wish that kind of stuff on anyone, actually.
00:46:49.400Like, it really is stomach-turning, and it's just appalling, actually.
00:46:54.860I mean, it's just inherently appalling.
00:46:57.060And I don't care if the person engaged in this act agreed to it and, you know, of his or her own free will has contracted to do something that's going to create bodily injury.