Red Ice TV - March 08, 2024


Hoover Institution Stuck In 2016, Distressed About ‘Alt-Right’ Women


Episode Stats

Length

47 minutes

Words per Minute

178.5

Word Count

8,508

Sentence Count

824

Misogynist Sentences

57

Hate Speech Sentences

50


Summary

The Hoover Institution hosted a workshop on how the rhetoric of women in the alt-right broadens the movement's appeal, featuring a conversation with Richard Nielsen, an associate professor of political science at MIT, presenting joint work with Eliza Oak called "That's what these men spend their time on? Analyzing women like me."


Transcript

00:00:00.000 The Hoover Institution within Stanford University claims to be a public policy research center
00:00:06.220 promoting the principles of individual, economic, and political freedom.
00:00:10.200 Well, they just hosted a workshop examining women in the alt-right.
00:00:15.140 Hoover Institution just released new research about the alt-right.
00:00:20.120 Yes, that's right, that thing that happened eight years ago in 2016.
00:00:23.640 The workshop features a conversation with Richard Nielsen on how the rhetoric of women in the alt-right broadens the movement's appeal.
00:00:33.900 Really? You guys are still on about this?
00:00:36.300 Turns out this spergy nerd has been working loosey-goosey on this since 2019, I think he said, collecting data.
00:00:43.560 And here we are in 2024.
00:00:45.580 So let's hear the introductions and show a few clips for your viewing entertainment, of course.
00:00:53.640 Hello, everyone, and welcome to the Hoover Institution.
00:00:57.400 The Glockenspiel. I'm sorry, I can't even...
00:01:00.400 Right up the gates, you have to talk shit.
00:01:03.360 What is it with the World Economic Forum?
00:01:06.560 They're all using the corporate Glock.
00:01:09.020 It's not Glockenspiel. It's like a...
00:01:11.140 Xylophone?
00:01:12.460 Xylophone is technically the metal ones. This is the wooden ones.
00:01:16.280 I forget what that's called, but enjoy this.
00:01:18.200 Hello, everyone, and welcome to the Hoover Institution workshop on using text as data and policy analysis.
00:01:28.500 In this workshop, we feature applications of natural language processing, structured human readings, and machine learning methods to text as data to examine policy issues across the social sciences.
00:01:40.300 To examine women saying things.
00:01:42.060 Policy issues.
00:01:42.840 No, it's hilarious.
00:01:44.840 Where is he living here?
00:01:45.900 The technology.
00:01:48.560 So I think I had a rehearsal studio, like literally in an underground bomb shelter that looked like that.
00:01:53.300 That's where he takes the kids he kidnaps or something.
00:01:55.940 There you go.
00:01:56.740 With his weird...
00:01:57.460 I'm just doing some algorithms here on my wall.
00:02:02.040 Some brilliant physics.
00:02:04.280 Solving racism.
00:02:05.180 Natural security.
00:02:05.960 Yeah, natural security.
00:02:07.680 Gooey slop.
00:02:08.860 ...political science in other fields.
00:02:10.960 I'm Justin Grimmer, and Steve Davis and I co-organized the workshop.
00:02:15.700 Today, we're thrilled to have Rich Nielsen, who's an associate professor of political science at MIT, presenting joint work with Eliza Oak called...
00:02:24.920 That's what these men spend their time on?
00:02:26.420 Yeah, that's it.
00:02:27.260 Analyzing women like me?
00:02:28.640 Well, you know, if you can't have women, then you have to look at the data.
00:02:34.400 Yeah, can we look at the data?
00:02:36.220 You know, they pay...
00:02:37.040 Can we analyze why I'm still single?
00:02:39.400 They pay their mortgages with this.
00:02:42.400 They're sitting here for not...
00:02:43.540 Four years later.
00:02:45.700 Four years later.
00:02:46.940 And this is...
00:02:47.420 But the material is from 20...
00:02:49.440 As you said, 2016.
00:02:51.480 It's old news, dude.
00:02:52.620 It's almost a decade ago.
00:02:53.760 It's almost a decade ago.
00:02:55.280 And it's, of course, only YouTube data, by the way, too.
00:02:58.380 Like, they can't even...
00:02:59.500 You know, we can't look anywhere else.
00:03:01.280 He tells his motivation, the blonde guy there, his focus on why he's doing this and how he's collected the data and what effects he thinks it's having.
00:03:10.380 Rich is going to speak for about 30 to 40 minutes.
00:03:13.140 If you have any questions, please place them in the Q&A feature.
00:03:17.320 Steve and I might interject with some pressing questions.
00:03:20.220 All right.
00:03:20.940 And it's possible that there could be some live responses in the Q&A, though unlikely.
00:03:25.880 That's unlikely because no one's watching this shit.
00:03:28.480 It's very bad news.
00:03:30.220 Even he admits, yeah, no one's going to be there.
00:03:32.700 Now, we have to mention first here, though, how the rhetoric of women in the alt-right...
00:03:37.860 Well, first of all, not everyone in here is a woman, right?
00:03:40.620 So the second one they include...
00:03:42.140 Or white.
00:03:42.580 That's a man.
00:03:43.820 That's Asian man.
00:03:45.500 Oh, yeah.
00:03:46.120 Blair.
00:03:46.400 Half Asian, white man or something.
00:03:48.960 Yeah, anyway.
00:03:49.680 And the majority of these, I mean, Rebecca does some stuff.
00:03:53.080 Yeah.
00:03:53.160 Mirworn has gone off from a tan.
00:03:54.960 She's doing, you know, kind of other stuff.
00:03:58.520 Faith is gone.
00:03:59.660 Bree is gone.
00:04:00.280 Bree is not really doing it.
00:04:00.720 Faith is gone.
00:04:01.560 Right blonde bottom record, she's gone.
00:04:03.200 You're there, obviously.
00:04:03.940 Brittany's kind of doing stuff.
00:04:05.280 Kind of doing, like halfway.
00:04:06.180 Who is this over here?
00:04:06.900 I forget her name, dang it.
00:04:08.600 But I don't even see her doing stuff.
00:04:10.240 Yeah, that's why.
00:04:11.000 So we have one, two, and two and a half of the women here.
00:04:16.820 They're, like, still engaged in some kind of content creation on any regular basis, at least.
00:04:22.800 So that's the number one failure at the gates here, I guess.
00:04:26.620 But anyway, we can go back to this riveting presentation here.
00:04:29.620 So there may be some disturbing content ahead.
00:04:32.300 Oh, the warning.
00:04:32.580 Trigger warning.
00:04:33.260 Oh, watch out.
00:04:34.100 You might be recording.
00:04:34.320 There's a trigger warning.
00:04:35.860 Again, straight from Charlottesville.
00:04:39.080 Here's the tiki torches.
00:04:40.440 And here's Ayla and Lana.
00:04:44.120 Articles in the popular press.
00:04:48.100 So this is from The Atlantic.
00:04:49.740 This is NPR.
00:04:52.220 That's a photograph of Lana Lokteff.
00:04:54.900 Oh, really?
00:04:57.240 He shows a bunch of articles with me.
00:04:59.120 Because remember, I was attacked, like, the most of all these girls.
00:05:02.620 Do you think of a stock footage?
00:05:04.560 Here's just some random photo of a random girl.
00:05:06.840 He has to explain that.
00:05:08.320 Is he in love with you, Lana?
00:05:11.020 Is he sitting over dreaming?
00:05:13.680 That's a photograph of Lana Lokteff, who I just, you know, had a shot up from her video.
00:05:19.660 Again, a lot of emphasis on Lana Lokteff and Red Ice TV.
00:05:23.760 Yeah, gee, I wonder why, you know.
00:05:28.560 It's just common because, like, all the articles for a time, remember?
00:05:31.900 It was, I was, like, on mainstream news, like, all the time.
00:05:34.660 Yep.
00:05:35.100 It was hilarious.
00:05:35.840 It's funny now how the most of them have dropped, like, dropped out.
00:05:39.320 It's just, like, if there's any, like, mainstream articles, it's about, like, either Nick Fuentes or, like, I think that's, I think that's basically it.
00:05:48.000 That's what they've resorted to.
00:05:49.260 It's, like, we either cover Nick Fuentes nonstop, like, like, Right Wing Watch does, or it's, like, hardly anyone else doing anything.
00:05:57.720 Exactly.
00:05:58.320 Like, we don't exist because it was, you know.
00:06:00.500 And I think that's a strategy.
00:06:01.680 But anyway, be that as it may, it's interesting.
00:06:04.320 So, he's just going to talk about his focus and men versus women, basically, is his goal, saying the same messages and how it impacts people.
00:06:13.780 It's just hilarious.
00:06:14.480 Okay.
00:06:14.740 Okay.
00:06:15.560 Actually, and, you know, headlines like, Why are women joining the alt-right?
00:06:22.900 And you can see, you know, here in the kind of header of the transcript, its ideology rejects Jewish people, people of color, LGBTQ people, and immigrants.
00:06:34.320 And it's dominated by white men.
00:06:36.280 So, why are some women joining it?
00:06:37.940 I think that's been the dominant.
00:06:39.120 It's all over the place, this guy.
00:06:40.840 Like, he's not doing a good job explaining anything here, first of all.
00:06:45.300 What the critique of those Jewish people are?
00:06:48.100 Or when we do talk about people of color?
00:06:50.380 Usually, it's tied to behavior.
00:06:52.160 No, we just randomly attack, you know, black people and gay and Jewish people for absolutely nothing.
00:06:57.480 Yeah, because we don't focus on, like, how the LGBTQ, like, crowd is going after kids or anything like that, or, like, migrant crime.
00:07:04.520 Or rape, disproportionate rape, you know.
00:07:07.520 Dominant frame.
00:07:08.820 Reject them, yeah.
00:07:10.360 Or black crime or Jewish power.
00:07:13.760 Like, oh, that's all hands off.
00:07:15.280 Can't talk about it.
00:07:15.940 We can only, according to this guy, and this is supposed to be, what is it, Hoover Institute?
00:07:20.860 Institution?
00:07:21.560 That's, like, more libertarian.
00:07:23.780 Yeah, but he's, like, joining with them on this workshop.
00:07:27.340 Yes.
00:07:27.800 Right?
00:07:28.160 Yes.
00:07:29.080 Okay.
00:07:29.280 But, I guess, talking about white privilege, though, that's fine, right?
00:07:32.320 That's a real thing.
00:07:33.520 Yeah, this is now libertarian freedom.
00:07:35.440 Freedom.
00:07:36.140 You know, that's what it is.
00:07:36.800 But, you know, again, they have their, Hoover has their, what, campus on the Stanford campus, right?
00:07:41.580 They have their facilities there.
00:07:43.760 It's just a matter of time before they're infected fully with this kind of AIDS, right?
00:07:47.300 And the, call it, journalistic takes on women in the alt-right.
00:07:54.060 Alt-right, again, also characterized by this headline from Sayward Darby, who's written
00:08:01.740 a book about it now.
00:08:03.900 Great book.
00:08:04.620 How can alt-right women exist in a misogynistic movement?
00:08:08.700 And I actually think that that's not the most interesting question in this space.
00:08:14.020 So, this question has already largely been answered.
00:08:17.640 The easy answer is that it's a patriarchal bargain.
00:08:19.820 A patriarchal bargain?
00:08:20.700 Patriarchal bargain is...
00:08:21.620 Hold on.
00:08:22.220 Hold on.
00:08:22.540 So, I have to go knock on the door of the patriarchy of this movement, this one man somehow, and
00:08:29.800 be like, hello, man, gatekeeper.
00:08:32.380 Can I please come in and make a racist video?
00:08:35.180 Will the patriarchy let me make this racist nationalist video as a woman?
00:08:40.320 It's a patriarchal bargain.
00:08:42.680 I mean, this is ridiculous.
00:08:44.260 Well, check this out.
00:08:45.060 This is coming.
00:08:45.640 It's written by Sayward Darby, who is like such a feminist, right?
00:08:49.160 Thinks that being a feminist is being pro-woman, and that anti-feminist is being anti-woman,
00:08:54.660 which, of course, it's the complete opposite.
00:08:56.420 Sayward Darby also has run with this lie, and I never said that, that men in the alt-right
00:09:01.920 threatened and harassed me, that I said that.
00:09:04.380 I never said that.
00:09:05.740 But that's everywhere, and it's being used everywhere by these people.
00:09:08.540 Yeah, it's been very little of that, and if there is some of that, it's a few troll accounts,
00:09:12.440 and that could be either, it could be friendlies or not, and I mean, who cares, right?
00:09:17.440 In the scope of things, no, most of the men who are in right-wing circles are very happy
00:09:22.820 that women want to be there, for the most part.
00:09:25.000 Not always, of course, but for the most part, they're like, okay, that's good.
00:09:28.400 But look at the subtitle here, though, right?
00:09:31.820 Strategies for Gaining Security and Autonomy Within a System of Gender-Based Oppression.
00:09:36.520 And isn't this making the whole argument in reverse, right, that if they had to seek,
00:09:45.540 if this is a survival strategy, that's what it sounds like to me, right, strategies for
00:09:48.900 gaining security, which basically means, like, survival within it, then, wouldn't it be
00:09:52.820 easy for these women just to go side with the system as it exists?
00:09:56.720 Yeah, you would get the whole support of the system, the power structure.
00:10:00.900 It's like, and it's not that men within the movement are weak, but I'm saying, as it exists,
00:10:08.220 it's not a wealthy, influential movement, for the most part, right-wing movement, especially
00:10:13.200 dissident movements, or at that time, alt-right.
00:10:15.760 So it's fairly low status.
00:10:17.720 In other words, there's no incentive for women to be in those circles unless, and this is what
00:10:23.800 they don't get, unless they just truly believed in the things that they're talking about.
00:10:27.700 Well, they can't fathom that.
00:10:29.000 That's just it, right?
00:10:29.500 They cannot do that.
00:10:30.700 They can't fathom that.
00:10:31.100 It would have been more interesting if you said, well, survival strategies.
00:10:33.000 Truly grassroots.
00:10:33.880 What about survival strategies for their offspring or something, as they're becoming minorities,
00:10:38.860 as they're being replaced or something?
00:10:40.500 That's right.
00:10:40.980 These people don't, they don't understand.
00:10:43.060 So they're trying to, with all these weird, like, answers to it.
00:10:47.600 But anyway, we could, I'm digressing.
00:10:49.360 Let's go on.
00:10:49.700 Keep going.
00:10:49.880 Where women trade security and autonomy in exchange for supporting a system of gender-based
00:10:59.380 oppression.
00:11:00.980 I'm so oppressed, Henrik.
00:11:02.520 I'm so oppressed in my relationship with you.
00:11:05.220 Why would they agree to that?
00:11:06.940 Every woman, like, content producer I know, like, her husband is, like, so supportive and
00:11:11.640 loves what she's doing.
00:11:12.760 They have a good relationship.
00:11:14.220 So this is just bullshit.
00:11:15.440 You know, they just, like, guess.
00:11:16.760 They just assume it's, like, some, a movie to them.
00:11:20.960 And, like, women are just doing so much better, like, if they sided with, like, the immigrants
00:11:26.380 or something, you know, like, people are anti-white or something, you know.
00:11:30.200 Yeah, right.
00:11:31.080 This has been, you know, documented, theorized.
00:11:34.320 This goes back to Kandiyoti in 1985.
00:11:37.460 It's been documented and theorized pretty well.
00:11:40.220 And I think.
00:11:40.940 Well, that settles it.
00:11:42.560 It's been documented.
00:11:44.540 Some liberal documented.
00:11:45.400 Oh, so Eliza and I are less interested, kind of, in this first-order question.
00:11:49.780 Although I'm happy to talk about, you know, like, why might these women be participating?
00:11:54.580 It's not.
00:11:55.500 Please tell me.
00:11:56.460 What could possibly be the reason?
00:11:59.000 Because they can't be genuine in what they believe, right?
00:12:01.940 The phenomenon, as the journalistic writing makes it seem, right?
00:12:07.580 And I just want to put this out there.
00:12:09.180 Of course, he doesn't actually say what any of our talking points are.
00:12:15.240 No.
00:12:15.520 What do we talk about?
00:12:17.360 No.
00:12:17.820 And let's examine that.
00:12:19.520 He doesn't at all throughout this whole thing.
00:12:21.480 Oh, they just don't like blacks and Jews and gays.
00:12:24.240 Yeah.
00:12:24.520 Well, that's a pussy way out, dude.
00:12:26.520 That's totally.
00:12:27.160 Well, the framing is, given that this is evil and bad, why does women who supposedly must side with, like, the leftist progressive establishment, because they're the ones who are going to take care of these women, why do they side against them?
00:12:42.580 You know what I mean?
00:12:43.040 It has to be some Stockholm.
00:12:44.080 That term is wrong, by the way.
00:12:46.000 I'll do that once.
00:12:46.880 But regardless, people understand it.
00:12:48.540 There must be some Stockholm syndrome thing going on here.
00:12:52.780 What I think is more puzzling or where we have less clear answers are how do women get authority in a male-dominated movement?
00:13:02.540 See, there we go.
00:13:03.060 And what effects do they have?
00:13:04.240 How do we get authority?
00:13:05.680 We contract central command.
00:13:07.520 That's what I'm saying.
00:13:08.080 I go knock on the door of the patriarchy and say, can I make a video?
00:13:11.580 You start a YouTube channel, dude.
00:13:13.500 You just make a video and you put it online.
00:13:17.040 That's what happens.
00:13:17.980 There's no one you go to to ask to, like, can I come into this patriarchal, oppressive space now and put up my video?
00:13:26.280 I mean, this is hogwash.
00:13:28.220 You have to understand, the accusations are admissions.
00:13:32.460 This is how they operate.
00:13:34.060 They go to central.
00:13:35.380 How can I get in?
00:13:36.840 Who can pay me to do this?
00:13:38.700 Yeah, pay or who can I bribe?
00:13:40.460 Or who do I need to, you know, maybe even worse, bend over for?
00:13:44.020 Or whatever it is, right?
00:13:44.960 The record industry works that way.
00:13:46.300 The journalistic industry works that way.
00:13:47.900 The political system works that way.
00:13:49.940 The financial, everything works that way.
00:13:52.320 In this case, it's just organic.
00:13:54.840 It's just women doing a YouTube channel and talking about shit.
00:13:57.880 And then either other people are saying, you suck, your takes are bad.
00:14:01.320 Or they say, hey, this is some good shit here.
00:14:03.100 Like, they're having good points.
00:14:04.520 And they support that, you know what I mean?
00:14:06.500 But no.
00:14:06.920 It's pretty simple.
00:14:07.560 It's not that complicated, man.
00:14:10.000 And you can see here, I've put up pictures of nine of the 13, or sorry, 11 women that we're analyzing in this particular paper.
00:14:22.960 There is a very strong, like, performance of gender that's happening visually on the screen.
00:14:31.240 Performance of gender.
00:14:33.040 Well, actually, one is a man.
00:14:34.780 The rest are all biological females.
00:14:36.880 There's one strong performance of gender in here.
00:14:40.180 Or gender.
00:14:41.240 I mean, women, females, whatever.
00:14:44.040 Pretending to be.
00:14:45.300 LARPing as a woman.
00:14:46.540 Yes, that I agree with you.
00:14:47.520 Listen to his language here.
00:14:49.960 Dude, what happened to you?
00:14:51.520 You've been brainwashed.
00:14:54.460 I guess nothing happened.
00:14:55.940 And then maybe that was the problem.
00:14:56.840 God.
00:14:57.760 A strong performance of gender.
00:15:00.260 And then he's like, and they share the same look or something.
00:15:03.920 Yeah, they're not fat.
00:15:06.880 Fair amount of, like, priority towards a certain look, certain appeal, also a certain style of video.
00:15:13.240 Normal hair colors.
00:15:14.780 Normal clothes.
00:15:15.800 Not looking like a feminist.
00:15:17.240 Non-lefty mutants.
00:15:19.020 Exactly.
00:15:19.540 Just like being a normal woman.
00:15:22.160 Note that they're not mutants like me.
00:15:25.180 Sit and watch videos with you.
00:15:27.640 You would see some similarities in addition to the kind of general appearance.
00:15:34.660 What are you talking about?
00:15:35.760 Like, everyone sits in their house and makes a video.
00:15:39.240 A million videos are like this.
00:15:40.940 I think he, no, I think he, did he actually talk about the present, I think he's talking
00:15:44.080 about the presentation of the women, like how they present themselves.
00:15:47.640 That's what I think he means by that.
00:15:48.440 Like normal women?
00:15:50.160 Yeah.
00:15:50.880 Normal women.
00:15:51.560 They comb their hair and do makeup.
00:15:53.480 And they're, you know, most of them are attractive.
00:15:55.760 You know what I mean?
00:15:56.020 Not at all, but, you know, especially not at the top.
00:15:59.360 The most attractive one there with the little kissy lips there.
00:16:02.740 Oh, my gosh.
00:16:04.220 We want to theorize this a little bit, but also.
00:16:07.460 He's got to theorize.
00:16:08.600 Let's theorize this a little bit.
00:16:11.380 Blair White has been, many people, of course, are like, you're out.
00:16:16.780 You know what I mean?
00:16:17.140 Like, that's plenty of that going on, too, by the way.
00:16:19.760 Sure.
00:16:20.040 Like, you can say some good things, but you're out.
00:16:22.020 Yeah.
00:16:22.240 You're not welcome here.
00:16:23.200 You know, as far as we know, previously unexplored corpus of videos that these women and then counterpart men in the movement have produced.
00:16:35.780 And that's what the paper is, is looking at women's rhetoric, looking at men's rhetoric and comparing it to see, OK, what can we glean from the rhetoric they use about how they're constructing their authority in this space?
00:16:49.780 In space, online, in the world?
00:16:54.000 Yeah.
00:16:54.200 What is this?
00:16:55.020 Constructing their authority?
00:16:56.600 I thought you had to go to somebody to get the authority.
00:16:59.580 This is just ridiculous.
00:17:00.460 Isn't that a contradiction to what he just said?
00:17:02.000 Each one of these women have to construct, they have to make a space and create their own authority within this space.
00:17:08.500 I mean, this is just ridiculous.
00:17:10.320 Again, since 2019, they've been sitting there.
00:17:13.420 Dude, are you married?
00:17:14.240 Do you have a girlfriend?
00:17:15.220 Do you know how relationships actually work, other than what progressive, you know, feminists tell you?
00:17:20.400 As I said, only YouTube.
00:17:21.900 Now, there could be this reason that, like, Google owns YouTube and Google, like, started to see it as, like, a CIA funds, essentially.
00:17:28.660 So it's like they can get exactly everything they need from them.
00:17:31.440 And maybe they can't get what they want from a Rumble or, you know, an alternative platform like BitChute or something.
00:17:36.920 But it is telling that it's always that.
00:17:38.940 And most of these people have been banned from YouTube.
00:17:41.480 Yes, exactly.
00:17:42.580 And he even says, oh, it kind of made it hard, you know, 2019.
00:17:45.560 And this is 2024, dude.
00:17:47.040 What have you been doing all this time?
00:17:48.220 And by the way, he says he went to, like, SPLC first for his research.
00:17:53.920 You didn't even research anything.
00:17:55.360 Show some mainstream articles.
00:17:57.520 You know, that's not research.
00:17:58.920 Given that these people are evil, here's the default position on that.
00:18:02.760 Now let's analyze, like, how their reach is and how that impacts people and influences them.
00:18:08.420 Give us the definition of alt-right that you are after.
00:18:11.880 Good question.
00:18:12.380 And how you would sample from that, that frame.
00:18:17.060 Yeah.
00:18:17.260 Okay.
00:18:17.620 So, shoot.
00:18:20.740 Do we have a bunch of things got cut as we tried to shorten the paper?
00:18:24.340 I'm like, do we have a definition of alt-right in the paper at this moment?
00:18:28.720 No.
00:18:29.040 We'll define alt-right.
00:18:30.300 It was a cut there.
00:18:31.320 Do you see that?
00:18:31.980 They actually edited it afterwards.
00:18:33.200 Yeah, because he was fumbling around like an idiot.
00:18:35.220 Probably.
00:18:35.560 Like, eh.
00:18:36.260 He literally is doing this whole thing about women in the alt-right, the alt-right message.
00:18:40.840 And he doesn't even know.
00:18:41.380 Like, no one even uses this term anymore.
00:18:43.280 Like, this is embarrassing and cringe.
00:18:45.680 And then he can't even define the thing.
00:18:47.800 Yeah.
00:18:48.020 So, the paper, the women in the alt-right.
00:18:51.200 And the second part of that, he can't even do it.
00:18:54.620 50% of the thing.
00:18:55.920 Horrible.
00:18:57.080 Horrible.
00:18:58.500 Holding a right-wing set of ideas and being open to, or more than open to, racist interpretations.
00:19:10.060 And what is that?
00:19:10.780 Define that.
00:19:12.020 Of world politics.
00:19:13.060 And this isn't just America.
00:19:14.320 Define that.
00:19:16.260 Saying I'm white and I should exist.
00:19:18.440 That white privilege doesn't exist.
00:19:20.140 I mean.
00:19:20.700 Accepting racist talking points.
00:19:23.800 Of course, again.
00:19:25.280 Well, you know.
00:19:25.680 Sure, whatever, man.
00:19:27.340 Doesn't touch any points.
00:19:28.160 Let's get back to more data and show these screenshots and lines and numbers.
00:19:33.640 Yeah.
00:19:34.000 So, basically.
00:19:35.360 To say you're racist.
00:19:36.420 Why not just say, like, racist women then or something?
00:19:39.060 Why even do the alt-right thing if it just means racism?
00:19:42.380 His big conclusion is that women make more racist material content than men.
00:19:47.220 Ooh.
00:19:48.360 Ooh.
00:19:49.120 Now, this part is interesting and hilarious because he's talking about multidimensional
00:19:55.200 video data and observational analysis.
00:19:57.740 All these big words of how he gathered material to analyze or face, gender, emotion recognition
00:20:04.520 and these sampled frames and Google object detection.
00:20:08.240 It's just hilarious.
00:20:09.300 Man, this guy's, like, going all in on this thing.
00:20:12.100 So, all right.
00:20:13.420 Let's listen.
00:20:14.420 Of some of that content.
00:20:17.080 We still have a lot from the OCR there.
00:20:20.880 And then we're going to do Google object.
00:20:22.760 That means optical character recognition.
00:20:25.240 So, I guess they've used that of text for sample frames.
00:20:30.080 So, not of the actual video.
00:20:31.160 So, text that appears in the video, I assume.
00:20:34.200 But, okay.
00:20:34.900 Whatever.
00:20:35.420 It's a whole big waste of time.
00:20:37.200 Can't you just...
00:20:37.920 I remember many of them, too, a while ago talked about this.
00:20:40.700 How they're using AI because they're just too lazy.
00:20:43.460 And it's like, who has time to watch all this stuff?
00:20:45.620 So, let's just have AI scan all of it for us and then just pull out, like, the racist parts so we can listen to it.
00:20:51.820 Say, oh, you mean to not put it in context or whatever?
00:20:54.060 But, anyway.
00:20:55.060 Google label detection, which is saying what search terms does Google think you would want to associate with this image?
00:21:00.940 Then we're going to use an API from a firm called Face Plus Plus to do face, gender, and emotion recognition on these.
00:21:11.540 Because Google no longer does that through their API.
00:21:18.620 What's the point?
00:21:20.200 This is...
00:21:21.180 Dude, can we just look at this image from...
00:21:24.880 This is what he's spending his time on?
00:21:26.960 Look at this.
00:21:28.040 I mean, you would think this is, like, some psychotic or, like, some obsessed stalker or something.
00:21:35.320 I like that they've identified, here's a person, here's another person.
00:21:45.720 Happiness.
00:21:46.160 They have face, hair, nose, chin, cheek, face.
00:21:49.180 Here's the facial expression.
00:21:50.800 Oh, my goodness.
00:21:53.120 And the comment here, too, is we don't need foreign babies.
00:21:55.520 We have plenty of our own babies.
00:21:56.800 And it's really these new incoming babies from foreign lands that are causing problems.
00:22:00.420 They're pushing for their own foreign...
00:22:01.800 Is this an actual quote or is it?
00:22:03.180 I'm not...
00:22:03.760 They're pushing for their own foreign culture.
00:22:04.900 It's true.
00:22:05.740 We don't need foreign babies.
00:22:06.620 No, of course not.
00:22:07.240 They're trying to lobby for more political power.
00:22:08.780 Yes, they strain on the welfare system and they're taking our jobs that we don't even have.
00:22:12.540 Remember her big white baby challenge?
00:22:14.020 I've had six have more than me.
00:22:15.960 And it, like, made all over the news.
00:22:17.780 Remember CNN and everywhere.
00:22:19.780 Yeah, so he's, like, he's obviously analyzing from this interview.
00:22:23.600 Okay.
00:22:25.860 Visual display of what that's going to look like for the frame we've been looking at.
00:22:28.980 So, in the optical character recognition is going to be in the orange here.
00:22:34.900 Pulling out things on the screen.
00:22:36.940 Text.
00:22:38.160 You can see there's some mistakes.
00:22:39.480 For example, it thinks the Twitter bird is a Y.
00:22:43.120 Oh, my God.
00:22:45.520 It doesn't even do it.
00:22:46.660 It doesn't even do it accurately.
00:22:49.280 Oh, shit.
00:22:50.840 Can't you just look at the lower third and just, like, is this relevant or what's going on here?
00:22:56.460 This is just horrible.
00:22:58.640 Man.
00:22:59.120 I can't believe they put this workshop up.
00:23:01.460 But why a frame, though?
00:23:03.040 Here's a frame.
00:23:03.780 Like, here's a frame.
00:23:05.800 Okay.
00:23:06.180 But is there a longer analysis of the, of, like, what the face does when something is said?
00:23:14.900 Or is it?
00:23:16.020 No.
00:23:16.520 I mean, his big conclusion, I'll tell you later, is that, you know, viewers receive content from women more than men.
00:23:27.040 Like, they, you know, political content.
00:23:29.040 They like to watch that from women saying, you know, things like that.
00:23:32.960 Things that we talk about.
00:23:33.900 Right?
00:23:34.200 Yeah.
00:23:34.520 That's why, yeah.
00:23:35.620 Ooh.
00:23:35.860 So why does he need these, like, lines and this study?
00:23:39.920 It's, like, basically all these technical tools and data analysis to study why we were popular on YouTube.
00:23:45.700 It's, like, he can't see the forest through the trees.
00:23:48.040 He can't even name the freaking trees, right?
00:23:50.240 Instead, he studies, like, this little particle over here and over there, avoiding the forest.
00:23:55.040 We were popular on YouTube because we were the first to put ourselves out there telling big truths about our current world and our place in it as white people.
00:24:02.960 And it might be a little shocking to those who live in denial, but for many, they liked it and they still do because so few are doing it, right?
00:24:09.660 However, since 2016, a lot has changed, buddy.
00:24:13.040 We're not so shocking anymore.
00:24:14.400 There's many more voices like ours, but it's just funny.
00:24:17.460 They have to use all these tools, ultimately, studying our message and why is it reaching people and why is it popular and how is this working?
00:24:26.200 It's called truth.
00:24:27.740 It's called truth and it's called normal women saying the truth.
00:24:30.440 Well, again, let's use multidimensional video data and text to, you know, video to text analyzing to, can you just not sit and watch it and listen to what's being said?
00:24:42.620 But it's like this, they need a way to go through all of it without actually, like, blah, blah, blah, blah, not hearing it or something.
00:24:48.440 You know what I mean?
00:24:48.920 Like, it's almost like, we can't be, we'll be infected also by it.
00:24:52.060 I don't know what, it's bizarre.
00:24:53.580 And then obviously, like, feminists aren't as popular because they're annoying, mostly ugly, and they tell a lot of lies and people are sick and tired of it.
00:25:01.380 It's fading away.
00:25:02.540 Well, they can go to any mainstream channel to get that shit.
00:25:04.440 You know what I mean?
00:25:04.940 This is like, you know, this is the big threat from the small, you know, at the time.
00:25:08.460 Well, I mean, it was great.
00:25:09.300 I'm not, I don't want to belittle it.
00:25:10.580 It was, it was, it is growing.
00:25:13.060 It is, it has been growing since 2016.
00:25:14.600 But they've, they've managed to, you know, knock it, knock it down a couple of pegs with all the censorship.
00:25:19.020 But it's coming up again because now our talking points are hitting mainstream talking heads now, right?
00:25:25.200 Like a lot of lingo that we used many years ago, people are using in the mainstream.
00:25:28.960 So it's just funny.
00:25:30.340 You could use all your tools and graphs and all your analysis, but he wants to, he's avoiding the elephant in the room, which is the message.
00:25:39.300 It's the message that resonates.
00:25:41.300 That's just it.
00:25:41.760 Not some weird, like, because the, this phase is being made here.
00:25:46.840 And he thinks somehow that it's like these women, you'll say later, were like, put there because they were attractive.
00:25:52.820 Here, go say this message.
00:25:53.800 He can't fathom that this is an organic, just grassroots movement and that there's women that truly care about the, you know, their future and the future for their children.
00:26:03.720 Peter Thiel and the Mercers put these women there.
00:26:06.520 Yeah, exactly.
00:26:07.300 Is that their argument or something?
00:26:08.260 Exactly.
00:26:08.600 Uh, the face plus plus algorithm is in the light blue there.
00:26:12.140 It's pulling out female neutral and says no smile here for Lana Lochteff.
00:26:16.140 She's smizing, I would call it, um, in green, you have the object detection.
00:26:21.820 So here it's like really showing you it's just, I know he really wants to be my friend.
00:26:26.220 I didn't even grab the microphone, for example.
00:26:28.480 Um, the labels actually, uh, do a little better.
00:26:32.760 Um, so we get a lot of really useful things about, uh, especially kind of the female gender presentation in the red there.
00:26:40.440 And you can see the boxes for the whole.
00:26:42.600 It says female gender.
00:26:44.100 The female gender.
00:26:45.700 Like what, what is this lingo, man?
00:26:47.880 What happened to you?
00:26:48.680 I guess this is what they use in academics now.
00:26:50.580 It's because you're not really a, nobody's a woman, right?
00:26:53.440 Except men who've, you know, chopped off their generals.
00:26:56.380 They are women, but women as such is just a female, it's a human person presenting as a, as a female.
00:27:04.540 This is how MIT people talk nowadays?
00:27:06.700 Oh my gosh.
00:27:07.420 I guess I have to go through all the prior 30 minutes to get this, but like.
00:27:10.560 No, he didn't explain.
00:27:11.380 Is this actually done on like live video and you get like a multi, multi-layered, uh, uh, breakdown of how this changes over time in the video presentation?
00:27:23.620 Look, this guy's.
00:27:24.240 Puts it in context to the content or what?
00:27:26.340 No, he's horrible at explaining what, what are you doing?
00:27:29.260 What are you looking for?
00:27:30.740 How are you getting this data?
00:27:32.140 And then the two old guys don't ask any important questions either.
00:27:35.140 Of course not.
00:27:35.800 Okay.
00:27:36.020 So how are we going to analyze this?
00:27:38.840 We're going to have a topic model where we combine the text and image data in the same topic model.
00:27:46.840 Uh, nuts and bolts of that is we come up with matrices from the transcripts, matrices from the, uh, visual, from turning the visual aspects of the frames into words.
00:28:03.240 Oh God, man.
00:28:04.140 Uh, and then we append them and there's a question of weighting.
00:28:07.180 We're actually just going to weight them one-to-one at the moment.
00:28:10.000 Um, uh, but acknowledging that, uh, this is kind of new methodological terrain of exactly how you should weight images, uh, versus text.
00:28:19.500 And I'm trying to do some theoretical thinking about that on the statistical side as well.
00:28:23.700 It's not, it's not working out.
00:28:24.460 I'm happy to talk about it.
00:28:25.060 Uh, this is very, uh, then we, this is where all the comments were like, wow, this was really bad.
00:28:34.560 How did you guys publish this?
00:28:36.220 And who's paying for this?
00:28:37.980 Is this Hoover Institution?
00:28:39.820 No, I mean, it's the most complicated ways of just, it's just, it's just a message of like, an observation, for the most part.
00:28:49.100 But the message about observation of reality that's being discussed for the most part by, by whether it's women or men, it's like this shit's, or like, here's a news story.
00:28:57.280 Look at how crazy things are.
00:28:58.800 And we have to like organize to stop this bad thing from having, that's, I mean, that's, that's, that's kind of it, right?
00:29:04.140 In a sense, this weird, this is so weird.
00:29:07.100 It's like, and then a computer has to analyze these things for you to give you some kind of answer because they're completely lost.
00:29:13.420 In some like Klingon talk, you know, it's, it's, it's really simple, our messages.
00:29:18.380 Text matching conditions, topical model.
00:29:20.920 They hate white people, okay?
00:29:22.020 We can summarize it real quick.
00:29:23.660 So implications.
00:29:25.800 Yes, please.
00:29:26.820 As I kind of indicated, this differentiation or outbidding frame we started the project with, uh, our answer now is kind of like, yes.
00:29:34.140 It's like, yes, and they do a bit of both, um, but more complicated than just one or the other.
00:29:41.040 Uh, but the image of all right women as trad wives and pulling things just towards gender and home life, et cetera, is pretty woefully incomplete.
00:29:51.220 Women are talking about white nationalisms, white nationalisms, core ideas more than men.
00:29:56.520 They're provoking more racism when they do, although maybe not through pinkwashing.
00:30:01.440 Okay.
00:30:01.740 Any guy that uses pinkwashing is like, you can't take him seriously.
00:30:07.040 I, I, is he a straight man actually using the term pinkwashing right now?
00:30:12.060 Oh my God.
00:30:13.300 What?
00:30:14.160 Women aren't just talking about things at home and in the kitchen and their kids.
00:30:17.640 They're actually getting involved in politics.
00:30:19.520 We can't have that.
00:30:20.660 And unless they're liberal women, then it's fine.
00:30:22.420 Right.
00:30:22.720 But when it's women on our side of things, oh my God, they're talking, it's white nationalism.
00:30:27.780 Whenever they talk about politics, it's evil white nationalism.
00:30:31.400 They're provoking more racism.
00:30:33.480 Is that a, is that a, some kind of indication that we can image of all right women as trad wives is incomplete.
00:30:37.920 So they're not actually like trad wives then?
00:30:41.860 Is that what he's saying?
00:30:42.920 Or is he saying that they could not create a picture of that in order to analyze that properly?
00:30:49.580 Or it's all just fake to trap you in.
00:30:51.400 But regardless.
00:30:52.040 But we really just want to sell you racism.
00:30:54.340 Doesn't that mean then that one of their kind of arguments of what it is or supposed to be is not what it actually is?
00:31:01.640 Well, yes.
00:31:02.260 You know what I mean?
00:31:02.480 Yes.
00:31:02.820 And I love it too.
00:31:03.460 Provoke more rage.
00:31:04.240 Define racism.
00:31:05.840 We say white people.
00:31:06.580 Any ethnic group standing up for themselves.
00:31:07.960 White people shouldn't be replaced.
00:31:09.760 Okay.
00:31:09.980 That's our big message.
00:31:11.160 We will not be replaced.
00:31:12.540 And here is, here is the agenda.
00:31:14.840 Here are the people doing it.
00:31:16.300 These are all the ways that they are doing it.
00:31:18.040 And we're putting our foot down and saying no.
00:31:19.840 And we're exposing it.
00:31:20.760 So some cucked computer algorithm is basically telling them, oh, it's more racism in this one.
00:31:28.040 Okay, great, man.
00:31:29.360 But did you actually look at it?
00:31:31.000 And did you listen to what it was about?
00:31:32.700 Yeah, but the same computer algorithm.
00:31:33.680 It's like all the AIs, right?
00:31:35.000 It's talking, you know, oh, black pride is healthy.
00:31:37.780 Asian pride.
00:31:38.500 Yeah, that's what I'm saying.
00:31:39.080 It's a cucked, like, exactly.
00:31:40.520 Same thing.
00:31:41.100 White pride.
00:31:41.700 Bad racism.
00:31:42.840 Maybe they use some AI tools in this project too, essentially.
00:31:47.460 And it's basically the default position is white people bad and white people can't do what other ethnic groups can.
00:31:53.100 Yeah.
00:31:53.480 Right?
00:31:53.820 Can't ever stand up for ourselves and call things out, call out lies when we see it.
00:31:57.720 Okay.
00:31:57.980 Evidence is simply that women, when they talk more about race, generate more racism.
00:32:05.180 Define generate more racism.
00:32:07.620 You're analyzing what the audience will be perceiving or feeling when they watch it or what do you mean?
00:32:14.580 After I watch this video with Lana, I just really hate blacks and Jews and gays now.
00:32:18.040 I feel more racist now.
00:32:19.220 I mean, come on.
00:32:21.360 I don't even know what that means, but okay, whatever.
00:32:24.540 There's a lot of stuff going on in his head there.
00:32:26.640 Yes.
00:32:27.140 If men talked more about race in the movement, they would also generate more racism.
00:32:33.560 So.
00:32:34.500 Oh, really?
00:32:35.220 Okay.
00:32:35.480 Well.
00:32:36.100 But maybe they talk more about, like, specific example, whereas women then still, because it's like, what, fair?
00:32:43.020 Like, 26 in death was kind of fairly new in a way, right?
00:32:47.640 This is not your 1.0 kind of stuff, right?
00:32:49.780 No.
00:32:49.860 This is, like, popping up.
00:32:51.100 So it kind of makes sense that if men had generally been in those types of topics longer, they are moving on and talk more about other things, whereas women is, like, trying to both describe and at the same time kind of wrap their heads around the ideas that's being discussed as they're entering into those topics.
00:33:05.160 And make it relatable to their current experience, their lived experience, right?
00:33:09.280 As women, as mothers, right?
00:33:11.260 Yeah.
00:33:11.600 And it's this idea that it's like, well, yeah, of course you want to recruit people, if you will, or, like, convince people.
00:33:18.420 Of your ideas.
00:33:19.300 Of course.
00:33:20.260 Everyone's doing that.
00:33:21.580 I know.
00:33:21.900 It's like, oh, they're trying to recruit people.
00:33:24.580 It's like, so do you.
00:33:26.800 With your dumb ideas.
00:33:28.740 Broad conclusion.
00:33:29.720 Well, he's not a really good job at that, but, you know, that's a different thing.
00:33:32.360 That this isn't just about making white nationalism seem friendly with pretty faces.
00:33:38.200 We don't think that we've fully exhausted some analysis of the performance of gender here.
00:33:42.780 It's quite interesting.
00:33:43.860 And it's in our topic models, but there's a lot more you could do.
00:33:48.540 Here's to four more years of 2016 content coming up.
00:33:51.780 Oh, my God.
00:33:52.680 I'll pay my boat loans for another four years.
00:33:55.820 Finally, we think the takeaway is that women are effective at eliciting support for the movement's most dangerous ideas.
00:34:03.560 What are those dangerous ideas?
00:34:05.900 Having children?
00:34:07.120 Having children that look like you?
00:34:09.100 They don't want those white kids.
00:34:10.080 Because that's what he was bitching about with Ayla there.
00:34:12.960 You know, the white baby child.
00:34:13.980 Oh, is that a dangerous idea?
00:34:16.080 Being a mom and having kids that look like you.
00:34:18.380 I mean, come on.
00:34:20.700 Motivate this here.
00:34:21.940 Nothing.
00:34:22.560 You don't explain anything.
00:34:24.220 Not Jack.
00:34:25.080 You don't get into one argument.
00:34:28.220 Of the observational analysis, which I'm very happy to hear about.
00:34:32.060 We are trying to make it better.
00:34:33.780 You know, there's reality.
00:34:35.080 Yeah, because it sucks.
00:34:36.260 That's why we can somebody help us here.
00:34:38.920 We have this.
00:34:39.520 We have nothing.
00:34:40.340 But you can infer from an observational analysis like this.
00:34:44.040 It should horrify you that many of these racist comments are still up on YouTube.
00:34:48.740 Oh, my God.
00:34:50.220 Thinking about like how racist comments is hurting me.
00:34:55.080 And it's funny because this got hammered with comments, too.
00:35:00.260 Good.
00:35:00.640 Telling them what losers they are.
00:35:02.140 How pathetic this is.
00:35:03.580 By the way, it also showed that they had been in to put comments in to elicit certain.
00:35:12.220 That's what I'm saying.
00:35:13.380 Yeah, simulated YouTube comments.
00:35:14.480 They put in comments to elicit certain responses or something like that.
00:35:18.240 Here's an anonymous account.
00:35:19.400 Female, male account.
00:35:20.620 Let's drop the same shit in there.
00:35:22.560 It can even like mix it up a little bit to like make it seem that it wasn't bots.
00:35:26.460 I mean, it's not bots, but it's essentially bots, right?
00:35:28.720 That's what it is.
00:35:29.440 So it shows you like that I have to run gay ops on this thing, essentially, right?
00:35:34.200 But anyway, yeah, there was some screenshot on the number of content creators, I think,
00:35:38.040 or something like that.
00:35:38.580 And it was like how many there were from Red Ice in there.
00:35:41.980 Blonde Buttermaker, Red Ice, right?
00:35:43.880 Lana, Red Ice.
00:35:45.420 Oh, Impivara, Patrick.
00:35:46.700 Eric Red Ice, Impivara, Patrick, exactly.
00:35:48.800 Like different content creators that we had that they had analyzed.
00:35:52.160 But anyway.
00:35:52.780 Coach Red Pill, well, he's dead.
00:35:54.820 Yeah, Ukraine killed that guy.
00:35:57.380 But that's fine.
00:35:58.440 Let's give him a couple more billions.
00:36:00.860 Kind of one of the prominent people in the movement.
00:36:04.720 So I see it as a mix.
00:36:05.860 There are other powerful people, mostly men in the movement,
00:36:10.480 who can elevate and amplify and vouch for a new entrant.
00:36:16.940 Into the authority space.
00:36:18.220 And they actually do exactly what you're describing.
00:36:20.720 Can you bring this?
00:36:21.620 Can we interview this girl?
00:36:23.100 Can you put her on your show to let her enter into the authority space?
00:36:27.900 What is this nonsense?
00:36:29.800 Total nonsense.
00:36:31.360 I thought these things didn't exist in their world, supposedly.
00:36:34.000 But now they're like admitting that that's how they operate.
00:36:36.880 Listen, we happen to see a video.
00:36:38.340 That girl's cool.
00:36:39.040 Let's bring her on the show or whatever.
00:36:40.620 Oh, now we've allowed her into the patriarchal authority space.
00:36:44.780 We literally have this cartoonish Lex Luthor view of how we're sitting in our cave dungeon
00:36:51.360 and scheming to like, can this one be let in?
00:36:55.340 They really do.
00:36:55.920 The council of the secret right-wing society.
00:37:00.980 There's more.
00:37:01.780 There's more.
00:37:03.540 Appearing on each other's shows.
00:37:05.380 Oh, no.
00:37:05.680 They're doing this interview that you can see on the screen right here with each other.
00:37:08.720 This is Lana using her authority to try to amplify Isla's authority.
00:37:16.000 I'm interviewing.
00:37:16.820 It should.
00:37:17.460 Well, technically, that shouldn't be allowed.
00:37:19.760 Okay?
00:37:20.540 What is this authority talk all the time?
00:37:23.320 It's two women having a conversation, dude.
00:37:25.580 Settle down.
00:37:26.420 No, I'm telling you.
00:37:27.260 This shows you how much of a pecking order, like environment that they operate in continuously,
00:37:32.960 whether it's academics, politics, finance, whatever.
00:37:35.120 Liberal catty bitches.
00:37:36.220 Exactly.
00:37:36.800 It's like the worst hierarchical, you know, brown-nosing environment you can ever imagine.
00:37:42.720 In this case, it's like, oh, that's a cool person.
00:37:44.960 Let's bring them on.
00:37:45.740 It's not like, it's not complicated.
00:37:47.380 No, and never mind.
00:37:47.620 It was this whole big scheme going on before we allow someone on the show.
00:37:53.020 Isla is now famous for saying something that Lana wants to amplify.
00:37:58.240 Lana has the reach.
00:37:59.240 She's essentially in this, like, her and her husband ran Red Ice TV together.
00:38:04.060 Still do.
00:38:05.600 But I can see it over time, actually, in that my inference is that they learned from the
00:38:10.820 engagement side that Lana's videos got more engagement, and they actually then over time
00:38:16.500 keep doing more to make Lana more prominent in the channel.
00:38:20.780 We'd produce more videos because it's successful.
00:38:23.220 I liked it.
00:38:23.600 It was doing well.
00:38:24.600 I liked it, and I got inspired.
00:38:26.180 Let's make more videos.
00:38:26.820 This is doing well.
00:38:27.860 Let's not do that.
00:38:29.240 That's obviously, that's how anybody, you know.
00:38:31.640 We still do.
00:38:32.540 We still do everything we were doing before.
00:38:37.640 I think it's this, again, that they shouldn't be, how are they allowed to do this?
00:38:42.040 You know, kind of like, where's the referee?
00:38:44.480 Hello, come in here.
00:38:45.220 Stop this.
00:38:46.800 As a result.
00:38:47.820 That also brings up that there's a business aspect to all of this, which is partly why
00:38:52.520 they were so irritated about getting kicked off of YouTube because they were making money.
00:38:56.660 No, we never made money on YouTube.
00:38:58.680 We had a couple of super chats.
00:39:00.020 We didn't monetize.
00:39:01.020 In the beginning, but yeah, we didn't have any.
00:39:02.100 We never monetized.
00:39:03.120 I had money, I had revenue.
00:39:04.200 It's always that.
00:39:04.880 See, they're the ones always thinking about money.
00:39:06.900 These liberals are getting like millions and millions of dollars.
00:39:09.400 Well, this is because they know that you have to have funds to grow your operation, to get
00:39:13.020 your message out, to spread your, you know, PR or your propaganda.
00:39:16.040 Obviously, they're trying to operate that this is like some, they've discovered, they're
00:39:19.840 on to us that we're trying to promote and grow and get our ideas.
00:39:23.560 Oh my God, the shock.
00:39:25.520 I don't have a clear answer to like, you know, are they selected from above or do they grassroots
00:39:31.420 from below?
00:39:32.280 I think it's a mix.
00:39:33.720 And I actually think authority construction.
00:39:36.200 How can it be a mix?
00:39:37.180 Wait, wait.
00:39:37.500 Who would be selected from above?
00:39:39.160 Who is this person above that is selecting these women and putting them?
00:39:43.440 Is he accusing us of being fed?
00:39:47.480 We're selecting them from above.
00:39:51.140 From where?
00:39:52.180 From who?
00:39:53.040 That's what I'm saying.
00:39:53.940 There's this man behind the door that we have to knock on.
00:39:56.560 That's what these people have become now.
00:39:58.320 They've become those crazy conspiracy theories.
00:40:00.720 They're looking at.
00:40:01.400 That's what it looks like.
00:40:02.140 Look at the body language here of how this person.
00:40:04.420 And it has to be all these other things and the actual thing that's just right in front
00:40:08.540 of you.
00:40:08.880 And it's fairly obvious of what it is.
00:40:11.100 But yeah, no, these are the conspiracy people now.
00:40:13.420 They think somebody like to put us there somehow.
00:40:17.040 Like, well, can you answer who?
00:40:19.100 Did we get money from somewhere?
00:40:21.000 You know, like, yeah, I wish we got money from somewhere.
00:40:23.000 We haven't got shit from anybody.
00:40:24.300 And if we were, wouldn't, what's the reasoning with us being banned from every place?
00:40:29.720 Wouldn't, like, if this is some secret thing of amplifying these ideas by some right-wing
00:40:33.520 powerful person or something, wouldn't we be, like, allowed on all these platforms?
00:40:37.460 Exactly.
00:40:38.160 I don't know.
00:40:38.640 Exactly.
00:40:39.160 It's just nothing makes sense because they're just, all right.
00:40:42.800 is typically a mix of both hierarchy and kind of grassroots, like, putting yourself forward
00:40:50.960 as a potential authority.
00:40:52.660 All right.
00:40:53.020 So it looks like we're at times.
00:40:54.600 As a potential authority.
00:40:55.400 It's always putting yourself out as an authority.
00:40:59.520 People put videos out all the time and put them online.
00:41:02.860 It's not this big, grand scheme or, like, I'm now in the authority space.
00:41:07.300 I just, wow.
00:41:08.020 That's how these people think.
00:41:09.980 Yeah, it's pretty shocking to learn just how retarded these people actually are and how
00:41:17.240 little they're just willing to just, I mean, again, I know they have this, like, they're
00:41:22.680 hyper-programmed by the mainstream.
00:41:24.760 They read mainstream articles, Wikipedia pages.
00:41:27.060 They go straight to, like, you know, the stenographers in the mainstream media that's releasing, like,
00:41:32.700 intelligence operatives at Wikipedia's, you know, talking points about who's a bad person
00:41:37.120 and why we have to go after them legally and, you know, with Operation Chokepoint, you know,
00:41:41.220 terminate them from payment processors so they can't grow organically like everyone
00:41:44.880 else is allowed to do.
00:41:45.720 I know all these things.
00:41:47.180 But still, you'd think to be a human in there somewhere that can at least, like, all right,
00:41:52.200 I don't agree with them, but let's hear them out or something.
00:41:55.840 Like, what are they actually arguing?
00:41:56.700 Or what's the point?
00:41:57.920 Or how is this possible kind of thing?
00:41:59.560 No, he doesn't actually bring up one of those, right?
00:42:02.220 I'm going to just read a couple of video comments on this one.
00:42:04.440 Did you guys just wake up from a seven-year coma?
00:42:07.180 In internet years, this content is beyond stale.
00:42:10.300 This is one of the worst presentations I've ever seen.
00:42:12.520 The entire thing is very confused and vague.
00:42:14.780 What even is the point of this?
00:42:16.740 Surprised that the Hoover Institute will put out a video wrapped in so much left-wing rhetoric
00:42:20.780 and obviously shoddy research.
00:42:23.060 The more the presenter talks, the more dubious he sounds.
00:42:25.560 His data points are absolutely ridiculous.
00:42:27.840 Can't even give a definition for the title of alt-right.
00:42:30.320 It's incredible.
00:42:30.880 These are the studies that Hoover Institute funds, dot, dot, dot.
00:42:35.480 He lost me early at Atlantic and NPR being popular media.
00:42:41.040 I'm glad you gave us a trigger warning at the start that tells me everything I need to know
00:42:44.800 about how unserious and juvenile you and your audience are.
00:42:48.320 Here's a couple more here.
00:42:49.180 What a bizarre and strange presentation study.
00:42:52.380 The presenter can't even define the core topic of the study.
00:42:55.120 Fails to provide any clear or useful conclusion.
00:42:57.740 As a past donor to the Hoover Institution, I have some questions.
00:43:00.540 One, who funded this study?
00:43:02.240 Two, will nonsensical studies like this be funded or showcased in the future?
00:43:06.560 Three, has the Hoover Institution abandoned its ideological foundation of libertarianism
00:43:11.520 and joined the rest of academia which has been consumed by woke activism and fraud?
00:43:17.320 And then someone else says,
00:43:18.180 I love the way they try to make a headline out of this.
00:43:20.620 The alt-right are providing free cookies at their meetings,
00:43:23.220 thereby increasing incentives to participate.
00:43:25.200 They use the extra big chocolate chips, thereby amplifying my desire to attend an anonymous source claimed.
00:43:32.120 And then they get hammered by people telling them, you know, like, you're stuck in a 2016 time loop.
00:43:38.180 And they really are, Henrik.
00:43:39.300 We see this all the time.
00:43:41.060 This is all they have now.
00:43:42.840 The same articles are coming out.
00:43:44.500 Is there an AI writing this?
00:43:46.800 What's going on?
00:43:47.340 Are they given the talking point?
00:43:50.000 Let me put on my conspiracy hat.
00:43:51.980 Like, who's put this guy there?
00:43:53.240 Is he an artificially generated character?
00:43:56.380 Is that why he's so poorly at presenting?
00:43:58.100 Like, you need to update to improve your AI?
00:44:00.380 I don't know.
00:44:01.600 Oh, my gosh.
00:44:02.620 Yeah.
00:44:02.900 All right.
00:44:03.360 Well, you know, they got complaints.
00:44:05.160 Keep whining.
00:44:06.620 Why don't you do that?
00:44:07.180 Just keep whining behind your, like, dry, boring, academic, like, my data points.
00:44:13.260 What the fucking hell?
00:44:16.640 Meanwhile, it's like.
00:44:18.300 Meanwhile, like, our world is burning, like, to the ground.
00:44:21.380 You know what I mean?
00:44:21.800 And this is all these people can do.
00:44:23.740 Well, women in 2016.
00:44:25.240 Yeah.
00:44:25.440 And again, we've, you know, we've been talking about this particular issue.
00:44:29.500 We've been around longer than a decade.
00:44:30.920 But for a decade, you know, a lot has changed, right?
00:44:34.760 Mainstream conservatives, like I said, are picking up our arguments.
00:44:37.880 Like, get with the times, Boomer Nielsen.
00:44:40.200 You're getting left in the dust.
00:44:41.440 Plus, the future is white political movements, right?
00:44:45.280 To stand up for our best interests, just like everybody else's.
00:44:48.660 He doesn't call brown, black, or Asian women racist for standing up for their people or
00:44:53.100 interests, right?
00:44:54.540 So, we are the future.
00:44:56.060 Get used to it.
00:44:57.100 I mean, they don't have much of this left.
00:44:59.620 And despite of all the whining they did or all the reports that they've done.
00:45:03.840 Sure.
00:45:04.140 I mean, it led to some censorship.
00:45:06.240 And, of course, that's, you know, still there and continues.
00:45:08.240 So, it's not that that's gone away, but they, the people that want us, you know, Terminator
00:45:13.500 or want us not to be able to talk about these things, they've also upped their tactics.
00:45:18.840 They're now just going, they're suing people now.
00:45:20.960 They're going after them legally.
00:45:22.300 They're trying to put them in jail.
00:45:24.200 That's the only thing they have left now.
00:45:25.960 Because, again, that's what authoritarian, you know, again, to talk about power structures
00:45:30.380 or hierarchies or, like, dominance when it comes to, like, who holds influence and power.
00:45:34.560 Like, you're sitting in the environment of this immense power structure that's, like,
00:45:39.120 you know, shutting people down, forcing them out of not being able to participate, essentially,
00:45:45.300 right, in order to influence or talk about the concerns that they have or something like
00:45:48.820 that.
00:45:49.180 Again, this is, like, not, it's not violent rhetoric or anything like that.
00:45:51.640 Hey, this is a hands-on approach on how to solve these issues.
00:45:54.000 This is what we think, blah, blah, blah.
00:45:55.480 No one's trying to hide it.
00:45:56.380 And they're part of a system that's just trying to squash those people entirely.
00:46:00.940 And they can't see it.
00:46:02.060 Yeah, wake up, Hoover Institution and Richard Nielsen.
00:46:04.780 Stop being boomers and pull your heads out of your ass.
00:46:07.680 They're not going to.
00:46:08.920 And they're going to, one day, either they'll go to the grave or they're going to learn the
00:46:13.060 hard way.
00:46:13.820 But they'll never come to our side.
00:46:19.880 This was a preview of our latest Western Warrior exclusive member show.
00:46:24.360 Watch the whole show at redicemembers.com.
00:46:27.120 Sign up, get access to all our exclusive members' content.
00:46:30.520 We have an archive of thousands of shows, interviews, and videos.
00:46:34.500 You can also sign up for a membership at odyssey.com slash at redicetv or subscribestar.com slash
00:46:41.480 redice.
00:46:42.600 We are Indie Media, 100% supported by you.
00:46:47.020 No commercials, no sponsors, no funny business.
00:46:49.980 For $10 a month, you can help us spread truth in a time of utter insanity and censorship.
00:46:55.960 We're in this together.
00:46:57.320 The time to stand our ground is now.
00:46:59.300 Go to redicemembers.com.