Apologizing Makes Women & Lefties Hate You: Why?
Episode Stats
Words per Minute
184.25
Summary
In this episode, we talk about a new study showing that when public figures apologize, they are more likely to be punished than when they stand firm. We also talk about the pronatalist conference and why we should be more restrictive about who can attend it.
Transcript
00:00:00.000
So who really, really, really punishes, in terms of approval, people who make apologies about things?
00:00:07.180
It is liberals, where the result was 13.8, and even more than liberals, females, where the result was 16.3.
00:00:17.660
Hello, Simone. I'm excited to be here with you today.
00:00:20.780
Today came to me from a Cremieux post that I was reading on X.
00:00:26.360
And honestly, I didn't get this from X. I got it from Facebook.
00:00:30.800
But somebody else screenshotted an X post by Cremieux, friend of the show, who said, you know who wrote a paper on this?
00:00:38.320
So specifically, he's talking to a guy who, there was some conference called BasedCon.
00:00:44.460
And this guy, John Carmack, backed out of BasedCon because it had become too controversial or right-wing, you know, from his perspective.
00:00:53.280
So what he says is, it is unfortunate that Rob has made BasedCon so intentionally provocative.
00:01:02.120
I feel a little uncomfortable after the events of last year.
00:01:05.680
You know, just whining, basically, that people are too controversial, blah, blah, blah, blah, blah.
00:01:10.140
And then apologizing for having any involvement with it.
00:01:13.920
And Cremieux's like, what are you doing? Do not apologize.
00:01:17.600
And then he goes, do you know who actually wrote a paper on this?
00:01:19.980
@Richard Hanania. He's been on the show before, by the way.
00:01:22.540
Fans don't like him very much after he turned on Trump.
00:01:25.780
Anyway, he found that when people were presented with apologizing instead of standing firm,
00:01:32.980
women and liberals were more likely to want to punish them.
00:01:36.060
So I'm going to get into the abstract of the study, and then we're going to look at how different groups reacted to this.
00:01:42.740
And then after that, we're going to go into all of the other research on apologizing
00:01:45.980
that will generally show when you should apologize. And that generally, at least in a public context,
00:01:53.820
if people are attempting to use getting you to apologize as a weapon against you,
00:01:59.640
i.e., you know, apologize or else a bad thing will happen to you, you should never do it.
00:02:12.680
Unless you are 100% okay with flipping 100%, like, basically which team you're playing for.
00:02:17.920
What do you mean by apologize when you've made a mistake?
00:02:20.440
So a great example of this is often used in business case studies.
00:02:25.060
I think it was Tylenol or one of the medical companies.
00:02:28.700
They had a product that was actually harmful to people.
00:02:31.800
And then they did this big apology and removed at a huge cost very quickly all of their products from the shelves.
00:02:42.800
Yeah, I think that's more like if you've made a legitimate mistake that you wouldn't want to stand by, own it, fix it right away.
00:02:53.800
So I'm not of the camp of never apologize, but apologize when you did something that was a mistake,
00:03:00.160
not when somebody has been like, oh, did you know X person was at that conference?
00:03:04.180
And this is something that we personally went through.
00:03:08.380
Went through, go through, all the freaking time.
00:03:14.180
Is people will call us and try to get us to attack people we have had on the shows for things they have done in other places.
00:03:21.900
Or the conference, the pronatalist conference, or the people who talk there, right?
00:03:25.480
Or, and I will note, like, off the record, I took Kevin Dolan, who organizes it aside, and Simone knows this.
00:03:34.080
Like, I had like an hour and a half argument with him.
00:03:38.420
But I mean, like, I wasn't in public about this, right?
00:03:42.300
Being like, we should have more restriction on who can attend so we don't get people who could hurt other attendees at the event.
00:03:51.240
Like, my opinion was, we need to be more restrictive about this, right?
00:03:55.900
Not in reaction to people attacking us or with an apology.
00:04:00.280
Just like, let's not give them the opportunity to drive this wedge or say, you know, Malcolm, who appeared alongside figures like X and Y, right?
00:04:12.200
At the end of the day, I see the pronatalist movement and everything we do as very much a team effort.
00:04:16.740
And if he is putting in the risk and personal capital to make something like this happen, he gets to have the final call on stuff like that, even if I'm going to help financially back him.
00:04:28.280
And we have a lot of respect, like, for the principle of it.
00:04:31.600
Like, we have a lot of practical qualms about it because we know that, you know, there will be certain groups that will never go to a conference as long as it has a reputation like that.
00:04:41.800
And we hate that, for example, many governments wouldn't send representatives to a conference like this because of those characteristics.
00:04:50.380
So from a pragmatic standpoint, that's why you brought it up.
00:04:54.200
But from a philosophical standpoint, yeah, we don't like the idea of any form of censorship or restriction.
00:05:05.320
Yeah, the moment he took this stance and said, this is the direction we're going as a conference, we never publicly attacked him for that.
00:05:12.720
We never publicly attacked any of the other people who were at the conference.
00:05:16.040
We never said anything other than the media is being crazy about this.
00:05:22.580
Like, being run out of a conference because some controversial person shows up or some homophobic person shows up, right?
00:05:29.060
Like, and even so, like, I remember somebody told me that the guy who was running it, and this is why we can't back down on this stuff.
00:05:35.880
And why we do need to present a united face once a decision has been made behind closed doors.
00:05:41.340
Which is to say, somebody told me that Kevin Dolan, the guy who ran the conference, was homophobic.
00:05:45.440
And I was like, I don't remember him being homophobic.
00:05:47.560
And they're like, well, he's against gays getting married.
00:05:53.160
What are you, are you not allowed to be Mormon?
00:06:00.300
Yeah, let's just stop the practice of religious freedom.
00:06:03.760
Yeah, well, because they expand their sort of umbrella of safety, right?
00:06:10.560
Of things they can push back on, by continuing to gain ground in terms of the apologies that they demand.
00:06:16.820
And so the point I'm making here is you can disagree with the anyone-should-be-allowed-to-talk-at-this-conference thing, like this John Carmack guy clearly did.
00:06:24.200
But you have those conversations offline and with people.
00:06:28.220
And if you fail to persuade them, it means your arguments weren't good enough, and then you stay as a team, right?
00:06:36.680
Because we need to have each other's backs and know that we can trust each other.
00:06:41.900
And that when you attempt to appease this other group, as we'll see from the data we're going to get into in a second, you don't actually make them like you more or endear them to you.
00:06:49.900
You, in fact, make them want you to suffer more.
00:06:53.000
They want you, specifically, the one who apologized, to suffer more.
00:06:55.560
So in the abstract to this particular study, it's called Does Apologizing Work?
00:07:03.980
Public figures often apologize after making controversial statements.
00:07:07.320
There are reasons to believe, however, that apologizing makes public figures appear weak and risk-averse,
00:07:11.860
which may make them less likable and lead members of the public to want to punish them.
00:07:16.360
This paper presents the results of an experiment in which respondents were given two versions of real-life controversies involving public figures.
00:07:26.440
Approximately half of the participants read a story that made it appear as if the person had apologized,
00:07:32.520
while the rest were led to believe that the individual had stood firm.
00:07:36.080
In the first experiment, hearing that Rand Paul apologized for his comments on civil rights
00:07:41.800
did not change whether respondents were likely to vote for him.
00:07:44.940
When presented with two versions of the controversy surrounding Larry Summers and his comments on women scientists and engineers,
00:07:51.580
however, liberals and females were more likely to say that he should face negative consequences for his statement when he apologized.
00:08:01.480
The effects on other groups were smaller or neutral.
00:08:04.560
The effect suggests, and this is very, very, very interesting.
00:08:08.160
So the conservative figure, right, who went too far from a liberal perspective here, was not punished for apologizing.
00:08:19.440
But the liberal figure, that Larry Summers, who I think ran Harvard and said that, you know,
00:08:25.460
women are biologically different from men, who went too far in a conservative direction,
00:08:30.080
was punished for apologizing, and only by specific demographics.
00:08:34.600
So, you know, and I'll put this on screen here, if you look at the table where they go over the results,
00:08:41.760
you can see the groups that punished the apologizer more than the person who stood by their response.
00:08:49.840
For some groups there was slightly more punishment for the apologizer, but virtually statistically insignificant, right?
00:08:54.560
And I note here that in every one of the groups, the apologizer was punished more.
00:09:00.040
The other groups that didn't really care as much were the moderate political people,
00:09:05.300
where it was 6.3, and the conservative political people, where it was 6.5.
00:09:09.480
So who really, really, really punishes, in terms of approval, people who make apologies about things?
00:09:16.700
It is liberals, where the result was 13.8, and even more than liberals, females, where the result was 16.3.
00:09:30.520
Well, yeah, women on average are more likely to be progressive.
00:09:34.120
So if you're looking for the sort of nexus here, it is liberal females who are extremely unwilling to forgive.
00:09:47.000
You give them more chances to dig deeper and hit you further.
00:09:53.040
And we've seen organizations learning from this.
00:09:55.600
Cremieux was noting how American Eagle, they got, you know, we were in the era of, like, woke ads for a while,
00:10:01.160
where it was a bunch of, like, fat, disabled people who nobody actually aspired to be,
00:10:12.100
and then came the ad with Sydney Sweeney, which everyone is aware of now.
00:10:16.240
American Eagle, in response to all the controversy, published:
00:10:19.520
"Sydney Sweeney Has Great Jeans" is and always has been about the jeans. Her jeans. Her story.
00:10:26.280
We'll continue to celebrate how everyone wears their AE jeans with confidence their way.
00:10:34.480
And the cancel mob attempted to be like, hey, you can't say this.
00:10:41.440
You know, this is horrible, blah, blah, blah, blah, blah.
00:10:45.220
And I'd say in terms of public perception, it appears to have helped them greatly.
00:10:49.080
So before I go into additional studies, I'll just talk about the psychology of why, right?
00:10:53.820
So the person who's attacking you for breaking cultural norms, right, like randomly on Twitter,
00:11:00.500
Like, they're not doing this to protect their community.
00:11:03.660
They know that, like, you're not actually a threat to their community.
00:11:09.160
They're not doing this to protect, you know, people who are in vulnerable positions,
00:11:14.940
because there's many other things they'd rather be doing.
00:11:17.000
You know, the famous statistic: there are more people living in slavery now than at any point in human history.
00:11:20.420
You know, they're doing this specifically because they want to have power over you, right?
00:11:27.840
That's the motivator for the individual who goes on Twitter and slanders you or attempts to cancel you
00:11:36.540
They want to feel power over you, and they see an opportunity to exercise that power over you.
00:11:43.780
If you allow them to exercise that power over you,
00:11:47.620
what you are doing is giving them that positive dopamine hit that they wanted,
00:11:52.840
and then basically putting up your hand and saying,
00:12:01.400
And keep in mind that in this study, like, the participants didn't know what they were getting into, right?
00:12:05.440
You know, this is a, you know, two sample populations here, right?
00:12:15.540
Progressives and women really do punish you significantly more and will desire to punish you more when you apologize.
00:12:24.640
Before I go further with other studies, anything you want to say on this?
00:12:27.560
I mean, I also get the impression that this is one of those cases where you're looking at very community, in-group
00:12:34.480
and out-group based political ideologies and, what, behavioral types. Like, women on average
00:12:42.640
are more group consensus-based, as are progressives at this point in time.
00:12:46.540
And I think also what's going on is this sort of mob mentality of you take the weak one and you attack it
00:12:55.500
And if they, like, bow and, like, expose their neck, then you just go in for the kill.
00:13:01.020
So your only choice in these highly cliquey, highly group-based tribal societies is to fight back
00:13:10.600
ruthlessly and never show vulnerability, or they will smell blood and take you out.
00:13:20.680
No, I think that that does have something to it.
00:13:22.340
But I also think that you could go too far in the other direction.
00:13:25.240
For example, I think Trump often goes too far in the other direction of never admitting a mistake
00:13:30.440
or fault or that any of his initiatives didn't work out the way that he intended them to.
00:13:34.880
And that is a shame, because I think that the correct path is to stand firm when you're standing behind your genuine beliefs.
00:13:41.520
But when you make a decision that genuinely turns out not in the way that you intended it to,
00:13:46.160
because of unforeseen factors, or because you made a misjudgment, you can raise your
00:13:51.040
honor in the general group's perception of you by saying, hey, I messed up here.
00:13:56.660
Now, to go further, because I'd love to go over some other studies on this, because I found several.
00:14:02.020
One study, "The Effects of Attributions of Intent and Apology on Forgiveness: When Saying Sorry May Not Help the Story,"
00:14:09.240
showed that apologies reduced forgiveness and increased blame when the offense is attributed to
00:14:16.560
deliberate harm, but they promote forgiveness for unintentional offenses.
00:14:20.720
In experiments, participants were less forgiving of intentional transgressors who apologized, compared to those who did not.
00:14:26.940
So note here, if you basically, quote unquote, accidentally, like, offend a group by going out and
00:14:34.940
saying, well, you know, women are actually different from men, it's very clear that you meant it, so apologizing will be punished.
00:14:44.120
If you do something like a government program that doesn't have the effects that you meant
00:14:48.660
for it to have, you will not face a penalty for apologizing for that.
00:14:58.480
So it's not never apologize, but never apologize for the type of thing where you stated your
00:15:02.800
earnest opinion, and then you were forced to back down from that earnest opinion.
00:15:12.060
"When Saying Sorry May Not Help: The Impact of Apologies on Social Rejection."
00:15:15.660
Across multiple studies with around 1,800 participants, apologies in social rejection scenarios, e.g., declining
00:15:21.640
invitations, increased hurt feelings and aggression, measured by behaviors like allocating hot sauce
00:15:27.620
to the rejecter, and increased pressure to express forgiveness, but did not increase actual forgiveness.
00:15:36.220
So when you have to socially reject somebody, do not apologize.
00:15:43.660
Just say, I'm not, I'm not interested, you know, and then change the subject.
00:15:47.780
That's helpful, because I'm so tempted to apologize in those instances, and I bet a lot of people are.
00:15:54.720
Because you don't want to, you don't want to disappoint them, but yeah, I guess you're,
00:15:58.300
you're just, you're just kind of twisting the knife when you do, right?
00:16:05.400
"The Doormat Effect: When Forgiving Erodes Self-Respect and Self-Concept Clarity."
00:16:10.420
Forgiving an offender who offers an insincere, uncommitted apology or shows low trustworthiness
00:16:15.680
decreases the victim's self-respect and self-concept clarity over time, making them feel like
00:16:22.900
a doormat. In contrast, forgiveness after a genuine apology boosts self-respect.
00:16:31.220
Oh, I accidentally didn't paste the title of this study, but this study reviews
00:16:36.080
why people avoid or give perfunctory apologies, including perceived ineffectiveness.
00:16:40.960
Apologies can fail or backfire if they threaten the apologizer's self-image,
00:16:44.580
or if the victim sees low concern leading to reduced forgiveness and heightened conflict.
00:16:51.280
So what's shown here is that apologies can even hurt the person you're apologizing to
00:17:03.180
Well, and clearly when you credibly threaten someone's self-image,
00:17:07.480
yeah, it doesn't matter if it's in the form of an apology or just any sort of statement, they'll react badly.
00:17:14.540
Well, and you also see this, there's been some studies of public figures around this.
00:17:19.000
So a qualitative case study of cancel culture among public figures looked at public cases
00:17:23.620
like James Murray, Mimi Groves, Joe Rogan, and Dave Chappelle, and analyzed Twitter reactions.
00:17:29.560
Apologies were often, quote-unquote, unheard amid the media focus and drama.
00:17:32.620
Caving, e.g., Joe Rogan's platform changes, polarized views without resolution.
00:17:37.760
Resistance, e.g., Chappelle, sometimes shortened the backlash.
00:17:41.240
Often, yes, apologies exacerbated losses, e.g. subscribers and opportunities.
00:17:46.040
Cancel efforts backfired by boosting target support so you could even win.
00:17:50.500
So a great example of this recently, and we have some episodes on this,
00:17:54.680
Somebody attempted to cancel this VTuber, Kirsha, for being mildly right-wing.
00:17:58.820
And they also tried to cancel our friend of the show.
00:18:04.600
And in both instances, they stood their ground, and they ended up gaining a bunch of followers
00:18:10.360
from this and a bunch of support for this, whereas the canceler just had a giant fist
00:18:15.400
fly back in their face and basically has dropped off the earth.
00:18:24.340
Now, in part, I would say that this is because the power of cancellations has significantly
00:18:27.720
dropped in the era of Bluesky, because, you know, the people who were most likely to cancel you all left for Bluesky.
00:18:34.220
As we said, you know, Trump trapped the progressives in a crystal, and now nobody can hear their screams.
00:18:43.880
And Bluesky is one of the best things that has happened for conservatives on the internet.
00:18:49.960
We used to get a lot of engagement on X, and now we don't because they all went away.
00:18:58.000
If we're not engaged with by people who hate us, who's going to engage with us, Malcolm?
00:19:06.060
Yeah, we got very good at manipulating them, which was great.
00:19:09.220
I mean, that's how we ended up really creating and solidifying the pronatalist movement.
00:19:13.480
But I will say I am glad that we sort of surfed the last wave of this to our advantage, you know.
00:19:22.820
And note here, you can be like, well, your channel doesn't have that many subscribers.
00:19:25.380
Yeah, but we get a lot of views and watch time.
00:19:27.680
And even without that, the amount that we're in the media, you know, we get about...
00:19:31.660
For the past six months, we've had about two pieces written on us every week, which is
00:19:35.380
a lot, you know, in terms of the reporters we're talking to and everything.
00:19:38.860
And that only became true because of cancel mobs.
00:19:43.780
And it's also a lot more fun than being famous for, like, things...
00:19:47.920
Because I was talking to Simone today, I really love reading pieces where they're interviewing
00:19:50.820
her because, you know, if we were like Greta Thunberg or something and we were promoting
00:19:55.860
something that everyone agreed with, it would be very much like, you know, the reporter comes
00:20:00.460
and you give them the information and then they craft some great story about how you were
00:20:04.060
taken away by the police or something like this.
00:20:06.740
But the reporters come to us looking for a fight.
00:20:09.240
Like, every line often reads like, you know, them swinging the sword in some competition
00:20:15.440
and then Simone replying with, like, a deft parry, you know, like, but you are a racist.
00:20:21.320
He's not pointing out all the times where literally, like, journalists have been like, well, so,
00:20:29.940
And then I, using terms and explanations more articulate than our enemies or our critics,
00:20:36.000
explain why our critics don't like us on the record or on film.
00:20:40.880
And Malcolm, like, just gets increasingly worried.
00:20:48.080
But what I'm saying here is, they come at you nowadays, when I see pieces, and I'm just
00:20:53.060
so proud to be married to you, where it's just like, swing: you're a racist.
00:20:56.680
And it's like, actually, aren't you and the people like you the real racist?
00:21:01.920
With a very good argument and it's like counter and then they have to, they have to swing back
00:21:07.260
against you because, oh no, they thought that they'd just be coming to cut you up.
00:21:12.440
They think they're coming here to just catch us, you know, looking terrible, right?
00:21:17.340
And so they're going, you're a racist, and you don't want women to have rights, and so on.
00:21:22.120
And then it's, it's, it's, you know, block, block, block, counter, riposte.
00:21:29.480
It's way more fun than if a journalist liked us, right?
00:21:31.920
What, what a good life, what a boring life that would be to be some, you know, and then
00:21:36.540
surrounded by sycophants who think exactly what we think.
00:21:39.480
Like imagine Greta Thunberg's life, like, like everyone who follows her, like her weird
00:21:48.360
If you look at the pronatalist movement, she only has, she can only lose.
00:21:53.980
Whereas when our position is to be the villains, it, you know, what, what do we do?
00:22:08.120
Let me find a more controversial way to say that.
00:22:12.380
No, but the, the, what I mean to say is that if you look at like her core friends who are
00:22:16.340
running the movement, they all have about the same perspective.
00:22:18.600
If you look at our core friends, it's like conservative Mormons and Catholics and Orthodox
00:22:23.860
Jews and like, and many of them, we don't agree with on many things at all.
00:22:28.200
Like, but we work together because we're working to preserve our individual cultures into the
00:22:32.860
future and create a bright future for humanity, even though we have different, you know, priors,
00:22:36.960
which is just so much more fun than organizing a movement where everyone's forced to toe a party line.
00:22:42.500
That would feel really depressing to me, to be honest.
00:22:48.500
You know, if we had taken this other pathway... Next study here, I'm sorry: "The Language Behind
00:22:53.400
YouTube Apologies and Cancel Culture," an analysis of 10 YouTube apology videos on linguistic cues,
00:23:01.180
pauses, and reception. Negative keywords and gestures conveyed an apology, but risked
00:23:06.540
seeming performative; harsh comments reflected a mob backlash.
00:23:09.540
There were a lot of harsh comments on the apology videos.
00:23:12.040
Oh, so they're doing, like, a sentiment analysis.
00:23:14.820
It showed that in many cases, apologies made things worse for the person who apologized.
00:23:21.260
And there's just such an interesting evolving culture around apologies as well.
00:23:27.200
Like the, the social media convention of writing an apology in a notes app and then publishing a screenshot of it.
00:23:36.240
It's, it's just a very common way that some people officially post their apologies.
00:23:41.240
And another way is, you know, the sort of regretful, tearful apology image or
00:23:49.460
video that no one believes, you know, it's all just sort of acted, and yeah,
00:23:55.440
it's, it's very much seen through. To me,
00:24:00.680
it reminds me of ritualistic forms of attempting to save face or accepting dishonor in, in ancient cultures.
00:24:13.400
Now you must write in your notes app, your apology and post the image of it online.
00:24:21.140
Well, no, then everyone has to comment on it and decide on whether your apology will
00:24:27.940
And the answer is always, no, it will not be accepted.
00:24:33.300
What's funny about that, that original study that I read that I found so interesting is it
00:24:37.160
was written by Richard Hanania who literally ended up apologizing and changing sides after
00:24:42.960
Trump's victory for stuff that Trump said he was going to do in the campaign cycle.
00:24:48.540
So he fell into this exact apology cycle that he noted.
00:24:54.380
I don't, I don't know if he saw it that way because I think his, the line he's pulling,
00:25:00.860
at least from what I've gathered in an interview he did with Jesse Singal and Katie Herzog on
00:25:05.380
Blocked and Reported, is that he never apologized for anything.
00:25:10.260
It just happens to be that the Trump crowd is too dumb and lowbrow for his galaxy brain.
00:25:17.860
Yeah, we were talking about this, and I was like, but they're literally not. Like,
00:25:22.180
the, the online conservatives right now are, like, way more the intellectual crowd.
00:25:26.340
Like, if you want to talk about, like, the high intellectuals of the internet, most of them are on the right now.
00:25:31.580
Very few of them are progressives these days, simply because you're not allowed to have
00:25:34.800
a wide diversity of opinions in the progressive sphere, which intellectuals don't really like,
00:25:38.740
or people who are like big intellectual heavyweights.
00:25:40.660
But what he clearly means here, because this is true, is that if you
00:25:45.420
want to ape this sort of high-status urban monoculture, right,
00:25:48.740
like, that is not within the conservative sphere.
00:25:51.840
And if you look at a lot of what Richard Hanania has done through his decisions, it's chasing status.
00:25:59.660
And I think that that is fundamentally what was upstream of this decision for him: he didn't
00:26:04.200
realize that he was going to have to accept and indulge in and celebrate low culture instead
00:26:10.540
of high culture by becoming a conservative figure.
00:26:15.140
He wanted to be invited to the cocktail parties, you know?
00:26:20.260
Cause even in the very beginning, he, like when he was a teenager, he'd post on these, like,
00:26:26.340
I mean, they were controversial, but I think still they, they saw themselves as, like, very intellectual.
00:26:32.580
So his, his preference is to be in the heterodox-but-intellectual, or, like, forbidden-but-highbrow, space.
00:26:41.980
But anyway, yeah, I don't think he saw himself as apologizing.
00:26:45.780
I mean, he wanted, I think that the, the thing here is he wanted to be high status.
00:26:53.260
By the way, something I didn't know about him that you told me when you remember this,
00:26:57.260
Richard Hanania is half Palestinian and half Jordanian, which is very close to Palestinian
00:27:02.280
as well, a neighboring country, which is just, I, I didn't know that.
00:27:05.560
I didn't know I knew any Palestinians who I respected.
00:27:11.660
But he's a Zionist, by the way, for people who don't know this.
00:27:15.260
A Palestinian Zionist, that would be a fun thing to have him on for.
00:27:18.940
Yeah, we should, we should, I mean, we should talk with him about that.
00:27:22.980
Maybe that's why our audience doesn't like him is because he's Zionist.
00:27:27.920
I mean, we are already Zionists as people know about us.
00:27:33.680
Well, I think a lot of our audience, you know, takes pleasure in knowing that we don't just tell them what they want to hear.
00:27:38.320
You know, so many, I think, you know, talking heads on the internet and stuff like that.
00:27:42.460
You go to them because they're going to confirm your beliefs and laugh at the other side.
00:27:51.100
And people come to us, but I think the thing that people know about us when they, when
00:27:55.640
they come to our channel, and they see we don't really chase status in the way that others do.
00:28:00.800
Like, we clearly have a cohesive internal ideology and goal for our species and our children.
00:28:10.860
We'll replace you as genetically modified lab grown superhumans made from artificial wombs.
00:28:21.740
You, you guys can stay in a little zoo we'll make on earth for vanilla humans while we go
00:28:28.880
I mean, we don't, we don't plan to stay here either.
00:28:39.000
No, I think what, what I hope for, for this channel and, and what I love, I do love that
00:28:43.620
there are so many people who listen to us, probably the majority of people who listen
00:28:46.580
to us disagree with us on quite a few things, but they, they often listen because they want
00:28:52.400
people to bring up interesting, intellectually engaging and relevant and actionable subjects
00:28:58.600
that they then engage with. Our opinions help them sharpen their own, which are quite often
00:29:04.640
different, but which they hadn't yet formed, because most, most people aren't talking about these things.
00:29:08.980
They, they hadn't engaged themselves with the ideas enough to even know where they stood.
00:29:14.800
And because we randomly bring these things up, they're like, okay, well, they're wrong, and here's where I actually stand.
00:29:20.100
And I love that because I mean, in the end that, that serves our agenda of wanting to
00:29:24.160
see an even greater ideologically diverse diaspora of humans out there.
00:29:29.260
And if we can get people to just form more and more varied opinions on things, just by
00:29:34.660
deflecting off of our opinions, power to the people.
00:29:43.000
No, and I, I do find the intellectual diversity of our audience really fun, and that's why I like our, our Discord
00:29:47.880
so much, because we don't fall into just the typical, like, we've
00:29:53.000
been called manosphere influencers or, like, tech-right
00:29:56.740
influencers. I mean, I wonder if it creates a ceiling on our audience size, which
00:30:01.360
would be quite disappointing if it does. But I've seen the audiences of most of the other
00:30:06.620
sort of mainstream right-wing influencers often, at some point, unless they're just doing
00:30:10.680
the incredible, like, down-the-line grift, turn on them.
00:30:13.880
So, you know, when is our audience going to turn?
00:30:16.900
Jordan Peterson's original works were extremely obscure.
00:30:21.100
Like his, his first books and everything, all his work up until this random point at which
00:30:24.900
he got famous, very inaccessible, not mainstream, highly niche.
00:30:34.560
I'm just giving you an example of how like, hopefully we don't have an audience ceiling,
00:30:39.840
That's only for like a very specific type of people that really like to engage intellectually.
00:30:44.140
There is a chance that someday maybe we may reach a larger audience by fulfilling a more mainstream role.
00:30:54.380
Maybe not exactly the daddy-who-wears-weird-suits role, like Jordan Peterson, maybe something a little different.
00:31:00.760
Actually, Jordan Peterson's great here, because Jordan Peterson for me shows, like, you can really mess up and keep your audience.
00:31:05.080
Jordan Peterson has F'd up on everything he promised people he would do.
00:31:09.080
He's like, follow the advice that I give you and your life is going to be like X and Y.
00:31:14.960
And yet he's been unable to, you know, follow his own advice, whether it's, you know, cleaning
00:31:18.780
his room or, you know, staying mentally disciplined and stoic. No, I mean, it's like, how
00:31:24.620
was the god of, like, stoic philosophy a drug addict for a while, right?
00:31:30.040
You know, clearly he's not able to, like, follow the advice he's giving you guys
00:31:40.460
But I think you're missing what he, I don't think people necessarily, they don't want the
00:31:48.600
In any case, he has stayed, I think, a fairly unassailable person within the Republican influencer
00:31:54.980
And I think it's because look, even though his advice doesn't appear actionable, it doesn't
00:31:59.700
appear that he betrayed his, like he clearly thought it was, he's clearly doing his best
00:32:05.820
with, with, with what he has at trying to describe the world as he sees it and move
00:32:10.160
And I think that he's done stuff that like, for example, one of the things he's criticized
00:32:13.640
for is being like, well, you know, like theoretically God exists and like, you know, not really confirming
00:32:19.260
a belief in God in the way other people would confirm a belief in God, you know, because it's
00:32:23.300
not a, you know, and a lot of people criticize him for this, but I actually think that this
00:32:26.920
is part of why he has stayed so beloved is because he doesn't kowtow to just what his audience
00:32:33.420
wants him to do, even at the most deep level, when it would be so easy for him to just say,
00:32:39.420
yes, a traditional God, as Christians understand it exists, the Christian God is real, blah, blah, blah.
00:32:44.500
He'll be like, well, the Christian God is metaphysically real or like metaphorically real,
00:32:48.100
you know, he's like, analogistically it's real, you know, and, and, and he doesn't have
00:32:55.280
to do that, but he does do that, which I think shows intellectual integrity, intellectual
00:33:00.580
integrity, which has, has, has kept people alongside him.
00:33:05.460
Whereas, okay, vis-a-vis like saying you're religious without saying you're religious.
00:33:10.340
I just saw a clip today where someone was, a journalist was, was asking Donald Trump who,
00:33:16.780
you know, says that he's a, a Christian and that specifically his favorite book is the Bible.
00:33:22.200
And these journalists were like, well, okay, can, you know, can you give us, you know, what's,
00:33:25.820
what's your favorite passage from the Bible? And Trump's like, listen, that's very personal.
00:33:30.280
You know, this is a very personal thing. You know, it's just not the kind of
00:33:34.620
thing I think people should be talking about. And they're like, no, but really, just like one,
00:33:37.820
you know, one verse, it inspires you. He's like, no, this is just very personal. Like.
00:33:45.020
Of course he doesn't. But I mean, what, where would you put that in intellectual honesty?
00:33:51.100
I think that, that is, that is a classic Trump non-apology, but he's emotionally honest.
00:33:57.560
Everybody knows who Trump is and what he's really about. You know, I don't think that he's
00:34:02.900
successfully, you, you can watch two people talk and, and Trump may tell more like objective lies,
00:34:10.880
but they're not lies in, in a, I don't know how to put it, but like in, in a, the gist of the
00:34:17.540
point, right? Like, when he says my favorite book is the Bible, he means, like, I mean,
00:34:24.320
my favorite thing is everyone's favorite thing. And everyone's favorite thing is the Bible.
00:34:27.540
So what he means is like, I'm on the side of Christians. Chill out. Yeah. Yeah. Yeah. Yeah.
00:34:31.880
Yeah. On the side of Christians. When he says they're eating the cats and the dogs, people are like, oh, he lied.
00:34:37.840
But like, he is right. That like, these people are culturally different from us and do things in,
00:34:43.060
in different ways than us. Right. Yeah. 100%. Yeah.
00:34:45.840
In, in ways that we may find culturally abhorrent. If you interpret all things
00:34:49.880
Trump says as metaphor. As metaphor for what's happening. No, no, no. But what I mean is
00:34:55.040
the other politicians, right? Trump just went out there and he read that in a news,
00:35:00.020
one of his like far right newspapers, and then just repeated it. I don't think he thought he
00:35:04.380
was trying to. Probably Laura Loomer told him. Yeah. He probably, he probably wasn't trying to lie
00:35:09.120
to people. Actually, I think it's very clear that he wasn't trying to lie to people. He just said
00:35:12.860
what he saw on like his Twitter feed that morning. That's a very emotionally honest thing to do.
00:35:17.500
Whereas, you know, his opponent on stage, you know, studied forever what the voter wants to hear.
00:35:24.060
And then was saying that, you know, twisting the facts to be technically true, but, you know, optimizing for
00:35:30.000
their scorecard and looking, you know, high class, and looking good to their
00:35:35.540
high-class friends. And that emotionally doesn't resonate with people. And so I think that that's
00:35:41.960
how Trump's been able to get around this. You know, it is, it is actually pretty remarkable that
00:35:46.480
Trump has really only expanded his base over time. Good point. Yeah. In terms of, you know, who,
00:35:52.340
who supports him. And I, I think that that'll happen into the future, as long as he keeps this
00:35:56.560
sort of emotional honesty, he is who he is. Nobody's like, when he does something that like,
00:36:01.740
when the progressives are like, Oh, he did the thing. Like, can you believe it? Everyone's like,
00:36:05.780
I mean, yeah, it sounds like something he'd do, you know, I, that, that seems about right to me.
00:36:12.920
Well, and I think that's, that goes back to our important character theory, which I think does
00:36:18.300
kind of dovetail well with this apology concept, which is that you should have a public persona with
00:36:26.080
very clear virtues and very clear vices. And what makes Trump exceptional is that if you ask, you
00:36:33.020
know, 10 people, what's right and wrong with Trump, they're all going to have pretty consistent
00:36:36.620
answers. Whereas if you do the same with like Kamala Harris or Hillary Clinton, you're just going to
00:36:41.380
get so many different answers. And that means that you can't really control your image as well. And the
00:36:48.920
most important thing is, as long as your vices are vices that do not disqualify you from the
00:36:53.800
position you are trying to achieve in society or the job you're trying to get or whatever,
00:36:58.180
then you're fine. And Trump's vices do not change. Like, yeah, as long as your vices don't
00:37:04.140
disqualify you, basically anything can leak as long as it's in line with your vices. Yeah. And you don't
00:37:09.500
have to apologize. That's one of those things. Like if it's your known vice, then you also get kind
00:37:14.040
of a get out of jail free card. Like when it leaked that, you know, Oh, Trump slept with a
00:37:18.160
prostitute. Everyone was like, what did you expect? We knew that. Yeah. I mean, of course he did.
00:37:22.700
Was there somebody who thought that didn't happen? Like, yeah. But if you try to hide your
00:37:26.880
vices, if he tried to fob himself off to society as some kind of a paragon of virtue and morality
00:37:33.700
and family values who, you know. Yeah, no, then that would be really bad. And, and, you know,
00:37:39.900
for us, you know, I think that we're known, this is why, like, I'm able to do an episode where I'm
00:37:44.600
like, love isn't real. I don't really love my wife or kids, you know, and it doesn't become
00:37:48.800
a big scandal because the progressives have been saying that about us since forever.
00:37:52.700
They've been like, he beats his kids and he doesn't really love them. And he blah, blah,
00:37:56.640
blah, blah, blah, blah, blah, blah. Whereas for, like, a normal press thing, that would be like a,
00:38:00.280
okay, this is it. This is over for you. How, how dare you leak that?
00:38:05.980
Right. So-called pronatalist doesn't actually, yeah. Isn't actually a kid person.
00:38:12.340
Yeah. You are actually a kid person. I guess that's not what they would say. Isn't actually
00:38:15.460
like emotionally driven to have children, which you, how gross. Yeah. What are they
00:38:20.920
to you? Toys? Pets? Exactly, Simone. Oh man, I'm just so disappointed in people.
00:38:28.840
Disappointed in these animals. I have children because it makes me feel good.
00:38:32.900
But at the end of the day, you've got to keep in mind, when you apologize, you're retreating.
00:38:38.840
That's what an apology is. When you apologize, not because you made a genuine mistake, which I
00:38:45.440
think conservatives need to get better at doing. 100%. But you apologize because, because what
00:38:51.320
conservatives have learned, and this is true, is they can always tell you, well, that offended me.
00:38:55.940
How dare you believe that? Retract what you said, right? Or, or apologize for holding these beliefs.
00:39:01.400
They can always expand the, the, the bubble of offense. And they use this to push us further and
00:39:07.240
further back into crazy things. Like, God, what's the controversy right now in the UK? Everyone's
00:39:12.900
freaking out because a trans woman in the UK was working at like a, some fancy department
00:39:19.480
store, like clearly not passing and wanted to try, like walked into the, the girl's changing
00:39:26.100
room and tried to change the bra of a 13 year old to get it fitted for her to like feel up
00:39:30.960
her breasts and everything. And everyone was like, what, why did you allow this? And so JK Rowling
00:39:36.020
called a, a, what's the word? Whatever, where you don't buy from a store.
00:39:44.020
A boycott. And the first word that came to my mind was fatwa.
00:39:51.460
Yes. I mean, she did, that's the, they're functionally the same thing. Come on.
00:39:55.500
And so she called for her fatwa, and then all the, you know, like, like classically all the people
00:40:00.440
were like, this is so offensive. And it's like, oh, so you'll just, because no reasonable
00:40:05.140
person thinks that that's a good thing to normalize. Any guy who says that they identify
00:40:09.780
as a woman being able to fondle underage girls, like that's, that's not a normal thing. It's
00:40:16.840
normal for people to be freaked out that that's happening in a major department store, you know?
00:40:21.640
So they're completely in the right, but what we've learned is they'll just expand the scope
00:40:28.020
of normal as much as they need to until the point where it's like, how dare, you know,
00:40:33.540
at first it's, the elites have, you know, sex parties that they invite minors to, you know, this
00:40:40.440
is a conspiracy and eventually it's going to be like, how dare you, you know, attack these
00:40:46.300
Well, yeah. So speaking of that, on the Epstein issue, per your philosophy, my
00:40:54.660
guess, if I'm applying this advice on apologizing strategically correctly, what the Trump
00:41:01.760
administration should have done when they decided to not ultimately be 100% transparent
00:41:07.140
with this was instead to say, I'm sorry, we said we were going to share all of this.
00:41:11.860
We have subsequently learned that from a national security standpoint, we just can't do it.
00:41:17.220
And this is one of those instances in which for the, the best interest of the American
00:41:21.900
public, we're going to have to, you're going to have to take our word for this and it sucks.
00:41:29.320
Yeah. Or, or what they should have said is the, the, the, the files that could have proven
00:41:35.620
something else happened here are no longer accessible to us. The Democrats scrubbed it.
00:41:40.920
You know, one of these people will be like, Oh, Trump's not doing it. Cause he's on the
00:41:44.720
Epstein list. As I pointed out before, this doesn't pass basic logic. If the Democrats had
00:41:50.740
evidence that Trump was, in a hard, incontrovertible way that wasn't leaked already, tied to the Epstein
00:42:03.020
stuff, that would have been during the election cycle. Like
00:42:03.020
also he has the sexual tastes of a poor man, not a rich man. And if you know what I mean,
00:42:09.780
Oh yeah. We, we actually talk about this in our book. It's, it's, it's very interesting.
00:42:12.880
So males have a very interesting, and we have an old episode on this to be like, why do the people
00:42:18.740
exist that are attracted to this in, in, in most communities? Like, why are there these weird
00:42:22.520
sex parties and stuff as a male? You're sort of drawn between two very strange and opposite
00:42:28.220
extremes in terms of arousal patterns. You either can optimize for making absolutely sure a thing is
00:42:35.460
a female. So that means you're attracted to busty, buxom, Brazilian butt lift style women.
00:42:42.020
Not just a female, but healthy. So, so what does that mean? Very healthy means you have
00:42:47.240
very large breasts, you have a very large butt, you look very voluptuous, you have long nails.
00:42:52.740
You're definitely female, no question about that. Will survive the winter, can carry a baby to
00:42:58.040
term. This is, this is what poor people's arousal patterns are optimized for.
00:43:03.060
Yeah. Cause you need that immediate assurance that they will, they will last.
00:43:06.060
I mean, you can even just see this from like basically looking at people, right? Like if you,
00:43:11.900
if you look at like this, the traits, like when I go to see somebody that's got like five inch nails
00:43:15.980
and like giant bazongas, I'm like, oh, you're a poor person, right? Like, and you're trying to
00:43:20.920
appeal to poor people. This is, this is a very poor thing. But when you look at the, the very wealthy,
00:43:27.100
the, the, the, the other thing you can optimize for as a male is I'm not optimizing for the most
00:43:34.640
certain it's a female person, but for the longest fertility window I can get. Yeah. Basically total
00:43:40.600
lifetime value. Like you have the luxury to play the long game. So you're going to try to get an
00:43:46.520
asset that will last the longest. Right. The problem is, is that the traits that signal that someone is a
00:43:53.320
female are typically inversely correlated with length of fertility window, i.e. longer fertility
00:44:00.400
window means smaller breasts, smaller butt, more boyish figure. And if you see some of the women
00:44:09.240
that like the, the wealthiest men in the world go after, they have this body type and it's just
00:44:16.160
clapped. But there are other men, including men who are wealthy now, but either have like a weird,
00:44:21.740
like poor man mindset or came from more humble beginnings. Like his wife does not look like a
00:44:29.160
rich woman. I didn't say it. You said it. Okay. But like, and also Trump, and this is why I'm like,
00:44:33.160
Trump is not one of the people who I think is on like the Epstein list or something. Like there aren't
00:44:37.760
videos of him because that's just not his, just not his type. He goes for the, the, he goes for the
00:44:44.600
definitely a woman people. Right. If you're looking at the, what's an example of like an avatar of the
00:44:50.800
other type, you're looking at somebody like Grimes, right? Like Grimes is probably as hot as you can
00:44:55.980
get in terms of what rich people find hot. Very gamine, very, yeah. Ethereal. Yeah. Thin.
00:45:03.360
Yeah. Yeah. Who are some other examples? I mean, like a lot of, a lot of models are like that.
00:45:08.780
Just very young, very thin, a little bit. Yeah. But yeah, that wasn't, yeah. So yeah.
00:45:18.000
You just, you think that Trump should have in this case, even though this is such a delicate issue
00:45:23.800
because he had to break a campaign promise, he should have apologized in some way with this.
00:45:28.920
I agree. He should have said, I effed up and I would have had I been in his position, but you know,
00:45:35.960
Yeah. I guess when in doubt, if you have to choose just one, like you, you sometimes apologize
00:45:43.580
or you never, ever, ever apologize. It's probably just safer to have a policy of never, ever, ever
00:45:48.920
apologizing. I agree. Yeah. No. Have we apologized on anything on this show? I don't know.
00:45:56.460
I don't know. Like when I tweeted today, I said that we, we made a mistake in acting as though it
00:46:03.700
was just communism that only worked in certain contexts and the democracy also only works in
00:46:09.420
certain contexts and we've learned, but that wasn't an apology. It's not like anyone's like accused us of
00:46:14.420
doing that. So no, I don't think we, I don't think we have, but yeah.
00:46:20.300
Yeah. Yeah. I think I've, I've, well, no, I, I, I've apologized for things I did in the distant
00:46:24.900
past. I mean, like I slept around a lot in the distant past and I now think that that's wrong
00:46:29.620
and that's not something that I would recommend that my children do, which is sort of an apology,
00:46:37.980
but it's more just saying that. I don't know. I mean, you're, you're like, you're a wingman
00:46:42.520
to, well, sort of to other women. Yeah. I mean, I tried to help them. I didn't do it in like a mean
00:46:46.980
way, but I don't think that it's something that I would promote today, but I was never famous when
00:46:53.580
I was doing that. So I never promoted it in the way that other people might promote it.
00:46:57.840
Well, for what it's worth, you, you don't. I checked on Tea.
00:47:07.640
I signed on. I got verified. Well, cause I was thinking maybe there's like episode fodder here.
00:47:12.080
I want to explore it more and see what people's actual reports of people are to see if there's
00:47:16.120
for an episode in there. I would love it if you look up people who we meet in like business
00:47:26.920
Oh no, I should. Oh no. Okay. Well, I know what I'm doing tonight after.
00:47:33.500
But I didn't have anything on Tea. Nobody complained about me?
00:47:35.940
No, there was nothing. I flagged it so that if someone does post something that I will know.
00:47:43.700
Yeah. It actually asks you once you're verified and you go through the signup flow, like,
00:47:47.700
are you married? And do you want to know if your husband comes up? Cause yeah,
00:47:51.740
you might not want to. And I appreciate that they ask.
00:47:54.840
That's actually pretty cool. I mean, yeah. So for people who think that I might have some like
00:47:58.440
scandalous history or be like, this is like reverse Ashley Madison, right? Like you can find out if
00:48:04.120
some influencer is actually out there scandaling it up. And I guess.
00:48:08.840
Well, but see, I wouldn't trust what's posted about influencers because I would expect people
00:48:13.400
to make stuff up. And I don't, I probably shouldn't include this in our episode because
00:48:17.840
it would incentivize people to go on the platform and make things up. So you should probably not.
00:48:23.640
No, I think they'd make things up in a way that's very silly.
00:48:26.680
And where it's like, obviously not you. Yeah. Our, our real stories are stranger than fiction.
00:48:32.980
Yeah. Our real stories. If people knew like our actual scandals, they'd be like,
00:48:37.500
Oh my God, what the, like, okay, that makes so much sense. So any of our actual scandals,
00:48:45.340
everyone is going to be like... cause there are things where I'm like, I do not want the public
00:48:49.320
to deal with this, but people would hear it and they'd be like, like, it'd be one of those,
00:48:54.200
like, just drop the microphone and walk away things. Like, of course it checks out, whatever.
00:49:00.900
Yeah. Yeah. Just, just like Trump and his, his various peccadillos. Everyone's like, yeah,
00:49:08.620
but how are you surprised by this? No one's surprised by this. So yeah. Okay. Well,
00:49:15.700
anyway, love you to death Simone. Um, tonight we're just going to do the slow air fryer, right?
00:49:21.440
Yeah. But I kind of want a backup for this. Cause yeah, it's not very good. Can I make some
00:49:27.880
gyoza for you? It's like a backup maybe just to be safe. Yeah. Gyoza would be great. Let's see if
00:49:33.840
it works. And if it doesn't work, I'll have some gyoza. Yeah. And I don't need many gyozas, you know,
00:49:38.300
keep in mind no more than like three meat gyozas and a couple of vegetable ones. Or if I have
00:49:42.620
the big pork ones, just two. Those ones are like freaking... just two of the big pork ones.
00:49:47.980
Yeah. That's fine. I've never had a gyoza like that before. That doesn't count. It's like a,
00:49:51.800
it's the burrito of gyoza. It's so intense. So yeah. Okay. I will, I won't get to it. I love you
00:49:58.800
desperately. All right. My, my, my food slave, go cook, get in the kitchen. Yeah, I will. I will.
00:50:06.960
Strap that baby on your pregnant back and then get in the kitchen and cook us dinner for your
00:50:12.240
freaking five kids. And hope she doesn't grab a knife and stab you. I know she's right there.
00:50:20.140
She's within reach. I really need to move. Literally might try. Like she does grab knives.
00:50:25.840
She does. She grabs everything. She grabs everything. Like I'll take off my, my, my dress at night and
00:50:31.660
like magnets will fall out. Like all sorts of things that she's like shoved down my back. It's,
00:50:37.000
this is my life now. You are beleaguered. You are beleaguered.
00:50:43.740
All right. Down I go. I love you. All right. Love you. Bye.
00:50:48.820
I know I'm giving you too many episodes to process and insisting on all these conversations,
00:50:53.900
but I just really like talking with you. Don't take this away from me.
00:50:58.720
I'm forcing you to, God, it's so many. Yeah. Like tomorrow, Simone, I don't want to do any
00:51:04.920
more episodes. We've got too much of a backlog at this point. And you say, no, Malcolm, you
00:51:09.420
must, you must do more episodes. And I was like, well, I won't prep any. Then she's like,
00:51:14.080
then I will. Yeah. Then fine. I will. No problem.
00:51:19.600
You monster, Simone. You won't let me be done. And then people are like, why don't you just put
00:51:24.880
them on Patreon? I might use them on the main channel. No, no. We're going to need to put a
00:51:31.080
whole bunch more for members only on Patreon. They deserve it because they're the best.
00:51:37.740
I would have thought that my days would be researching things to have interesting conversations
00:51:42.100
with my wife. Gosh. I mean, we're, we're not far from turning this into our primary income source.
00:51:47.260
And if we do that, that would be the craziest thing ever. And I'm just paid to have interesting
00:51:51.400
conversations with the person I enjoy talking to most in the world.
00:51:55.780
That would be the dream. Though I can't discount the fact that you put a lot of work into editing
00:52:00.280
these and I really appreciate it. Oh yeah. That is a lot of work actually.
00:52:04.500
Yeah. But it's work I can do while reading romance mangas. That's, that's what I've learned.
00:52:10.600
Well, yeah. Honestly, if that makes it work for you, you should a hundred percent do it.
00:52:16.300
No, it makes me more excited to do it. I'm like, oh boy. Yeah.
00:52:19.440
Yeah. That's perfect. Good. I never thought I'd get into these, like these, these very legit.
00:52:26.880
Nah, I get it, I get it. They are... like, girls have more discerning taste than I thought in terms of
00:52:34.360
the stuff that they read. I was like, this is pretty, pretty good. Yeah. I haven't read one
00:52:38.640
since Love Advice from the Great Duke of Hell, but like I, that was the last thing I've ever read
00:52:44.100
that I accidentally stayed up like two hours, you know, past my bedtime to, to read. Cause it's just
00:52:51.100
that good. You know, it's just like, and, and laughed out loud. Who, who laughs out loud when
00:52:56.440
they read anymore, but it's. Yeah. Do, do I need to give you some of, some of mine to, to entrap you?
00:53:02.160
And I don't know. I mean, like, your, your favorite genre is so specific, you know. And Love Advice
00:53:06.780
from the Great Duke of Hell. You would love this genre. Cause it's all about noble courtiers doing
00:53:10.680
like, being incredibly proficient at playing social games to try to marry the prince. Yeah. If it's
00:53:16.700
attractive, competent people who are good at what they do, I probably will like it. Yes, you will like
00:53:22.060
it. And everyone, even the characters who are explicitly in the script as ugly are incredibly
00:53:27.440
attractive. Cause it's, I mean, cause it's, yeah, it's fricking manga. When they draw