RadixJournal - February 18, 2020


Nicker Nation Nixed


Episode Stats

Length

44 minutes

Words per Minute

176.7

Word Count

7,884

Sentence Count

596

Misogynist Sentences

15

Hate Speech Sentences

24


Summary

Nick Fuentes is the latest victim of YouTube's ongoing campaign of deplatforming. We discuss why this is a problem, what it means for the future of the incel movement, and the nature of modern heresy and modern deplatforming.


Transcript

00:00:00.740 Issue 2. Nicker Nation Nixed.
00:00:05.940 This past week, Nick Fuentes was expelled from the world's largest video platform, YouTube.
00:00:13.300 Adding insult to injury, it happened on Valentine's Day, already the most painful day of the year for incels.
00:00:20.300 While dissidents of the past circulated handwritten samizdat among comrades,
00:00:25.100 today, we use the most popular media platforms and, at least potentially, talk to the world.
00:00:32.160 Whatever you think of Nick Fuentes, his experience is paradigmatic.
00:00:37.740 There is a sword of Damocles hanging over the heads of every dissident,
00:00:42.260 and in 2020, that means every person who expresses white identity, however couched.
00:00:48.140 The panel discusses deplatforming and modern heresy.
00:00:55.100 The latest victim of this ongoing deplatforming, the great shuttening, was Nick Fuentes,
00:01:08.320 a, how do you describe him, conservative Zoomer commentator.
00:01:15.060 He is someone who does a lot of live streaming.
00:01:18.540 He clearly has an audience, and he was banned for basically doing things that he does on a daily basis.
00:01:28.400 So, our, you know, friend JF, I think each of us has been on his program at least once.
00:01:35.420 I've been on it a dozen times or so.
00:01:37.900 Got into a personal feud with Nick.
00:01:40.760 Nick loves these personal feuds.
00:01:42.580 He's gotten into a personal feud with just about everyone.
00:01:45.380 And JF has posted a video on BitChute showing his channel's TOS violations on YouTube.
00:01:56.900 But I watched that video, and to be honest, you know, I don't like what Nick does.
00:02:02.620 I've never watched a full stream.
00:02:04.540 I've watched clips, and I can't imagine listening to them for an hour.
00:02:08.620 It's crude and shallow, but at the same time, nothing that he did was illegal.
00:02:16.960 Perhaps you could say it was a violation of TOS at some level.
00:02:20.740 You could flag it.
00:02:21.400 I would not do that.
00:02:23.620 And again, what he got banned for was what he was doing for the last three years.
00:02:30.400 So, it is highly inconsistent.
00:02:33.720 And that inconsistency is a kind of feature, not a bug, of the system.
00:02:38.620 You know, if they would just simply ban everyone to the right of Jeb Bush, we could then know what the rules were and go on to another platform.
00:02:48.480 But this inconsistency, ambiguity, it creates an overall chilling effect where, you know, I can put up a video and say, oh, this is really good.
00:02:59.180 I bet we're going to get, you know, 25,000 views.
00:03:01.760 People are going to love it.
00:03:03.460 And also, I could think I'm going to upload the same video, and my whole channel is going to be taken down, and I'll lose all of this investment that I put into the platform.
00:03:14.500 It is really psychologically damaging.
00:03:18.480 So, anyway, let's talk first about the Nick situation, and then let's talk about the deplatforming issue on a more theoretical or policy basis.
00:03:28.520 So, Keith, why don't you jump in?
00:03:33.120 And you are a certified Zoomer.
00:03:37.360 So, why don't you jump in?
00:03:39.220 And what are your thoughts on Fuentes and Nicker Nation?
00:03:45.820 I think, technically, I'm a millennial, actually.
00:03:49.740 Maybe I wish I was a Zoomer, you know, self-hate millennial.
00:03:53.300 But then, again, is there any other kind of millennial?
00:03:55.960 But, yeah, I'm just on that border.
00:03:58.140 As bad as millennials are, Zoomers are kind of worse, from what I can tell.
00:04:03.380 But go ahead.
00:04:05.180 Yeah, but, you see, because, you know, if I was a couple of years younger and I was a Zoomer, I'd have no optics.
00:04:10.660 Hmm, right.
00:04:12.360 Great optics.
00:04:13.020 I'm never sure.
00:04:13.780 I'm never sure of my optics just being on that kind of borderline, you know?
00:04:17.760 Yes.
00:04:18.100 I could be drifting into bad optics.
00:04:20.100 But in terms of the Fuentes banning, I mean, I don't know if JF is responsible.
00:04:25.240 Because, like, I know JF said he was flagging his videos.
00:04:28.020 That was surprising.
00:04:28.740 Because, to me, like, I kind of would have just expected that there was, like, dedicated Antifa flagging all these videos every day anyway.
00:04:36.020 But, I mean, even if that was the case, that he got banned for that, I mean, you can look at someone like James Allsup.
00:04:42.160 He got banned with no strikes.
00:04:43.660 Red Ice got banned with no strikes.
00:04:45.000 So many of these examples, Cultured Dog, all these people.
00:04:48.520 I mean, Xurious got banned.
00:04:49.980 He didn't even have any political content on his channel.
00:04:52.320 It was just music.
00:04:53.260 Like, that was definitely the most egregious.
00:04:55.460 I agree.
00:04:55.760 So, yeah, it is this problem of YouTube not making clear what the guidelines are.
00:05:02.000 But that's definitely intentional because it does lead to this kind of paranoia and self-policing and, you know, distance.
00:05:08.920 Are people going to put work into building a channel and put work into making videos if they know that at any time they could be given the chop like that?
00:05:16.900 But, you know, it's the year of an election and this isn't really an issue.
00:05:21.320 And you kind of wonder, like, you know, has the right missed a trick in not bringing this front and centre?
00:05:27.480 Because, I don't know, this could so easily be a central election issue.
00:05:31.840 I mean, even if there was a candidate like Andrew Yang that kind of ran a sort of quixotic campaign on tech censorship,
00:05:39.460 and much as Yang was focused on the issue of AI taking jobs, if someone was focused on the power these oligarchs have.
00:05:47.140 But that hasn't happened.
00:05:48.480 And I don't think it's going to be an election issue in 2020.
00:05:50.620 And, you know, Trump made a lot of noise about doing something about this.
00:05:55.020 But, I mean, from Trump's perspective, I don't think it's a problem because, like, these social media giants now,
00:06:02.720 they're not really censoring the likes of, you know, the sort of Republican conservative types.
00:06:07.900 And any time there was a hearing with Mark Zuckerberg, you have someone like Ted Cruz bringing up that, I don't know,
00:06:13.000 an anti-abortion Facebook page was censored or something.
00:06:15.680 And, you know, I wouldn't be surprised if they actually did do something about that.
00:06:19.360 Maybe they did pull back their censorship of, you know, the turning point USA type stuff.
00:06:24.240 And, you know, from Trump's perspective, that is the conservative movement.
00:06:27.360 Like, Trump is promoting Charlie Kirk's latest book on the MAGA movement.
00:06:32.380 Charlie Kirk wrote the MAGA book.
00:06:34.740 So, I don't know.
00:06:35.620 I mean, I think no matter who wins in 2020, I don't think anything is going to be done about this.
00:06:39.600 And it's kind of, I think it's going to get to the point where it's just going to be too late.
00:06:42.820 And, you know, YouTube is more and more becoming like a Netflix type.
00:06:47.240 I mean, the idea that these are private companies.
00:06:49.360 I mean, I know you've been, we've been over all the ridiculous libertarian arguments against this.
00:06:54.360 But, like, it is very serious.
00:06:55.920 I mean, anyone that has any pretense of believing in liberalism or liberal democracy.
00:07:01.000 I mean, you know, liberalism, you read someone like Habermas and, you know, how important dialogue is.
00:07:06.800 And, like, in early liberalism, the coffee houses were where liberalism was birthed, if you believe, you know, the liberal genealogy of this.
00:07:15.440 And it was these free speech places where people could go and buy their coffee and newspaper and discuss the issues of the day.
00:07:21.820 And we don't really have a public space anymore.
00:07:25.600 I mean, what's the public space?
00:07:27.000 You know, it's a mall or something else consumerism related.
00:07:30.620 And, you know, dialogue is almost completely confined to the Internet.
00:07:34.760 And then you have, you know, algorithms deciding what routes you go on.
00:07:38.480 So, I mean, I just think it's the biggest issue of our day, really, in a lot of ways.
00:07:42.800 Because I think what happens in the next few years could, you know, dictate the direction the Internet goes for the next century.
00:07:49.080 But we're not hearing anything that would inspire optimism.
00:07:53.560 I think we should make it, you know, in the alt-right or dissident-right, whatever you guys want to call it.
00:07:58.640 I think we should make it our big issue, if not our only issue.
00:08:03.700 Yeah.
00:08:04.260 Because this is where we were.
00:08:05.540 I mean, I can talk about this a little bit just because I have, you know, longer experience of this.
00:08:11.760 How old are you again, Keith?
00:08:13.640 24.
00:08:14.660 Oh, 24.
00:08:15.440 Okay.
00:08:15.640 So, yeah, that's way older than, you know, the Zoomer crowd who are, you know, 16, you know, living in a chat room, effectively.
00:08:25.340 But I can remember all these things in the sense that I worked for a brick-and-mortar publication for a time after I left graduate school in 2007.
00:08:37.360 I worked for the American Conservative, and I went to work at 9 a.m. and left at 5, and we produced a real publication, et cetera.
00:08:44.320 But basically, everything that I have done since then has been digital.
00:08:51.140 I've published a number of books.
00:08:53.000 But even those books, people know about them and will buy them because they know about my digital life.
00:09:00.880 And Silicon Valley was a huge benefit to that, an indispensable component of this in the sense of using Stripe, which I used very early.
00:09:11.640 I think I might have used that in 2010, even, along with PayPal and MailChimp. And then, you know, hosting is much more, you know, democratized.
00:09:23.600 You know, hosting companies don't have the monopolies that Stripe does now.
00:09:27.480 Stripe didn't, when I started.
00:09:28.760 But using all of these systems to get the word out, uploading a video to YouTube.
00:09:34.180 I mean, ultimately, we're uploading, you know, gigabytes worth of content.
00:09:38.540 They are hosting it.
00:09:39.580 They are delivering it.
00:09:40.720 They are dealing with customer service.
00:09:43.420 It is a massive benefit as opposed to, say, our producing VHS tapes and having to mail them out to people or something like that.
00:09:53.020 It's kind of unworkable.
00:09:55.040 And what we're able to do is amazing.
00:09:56.500 And so Silicon Valley was unquestionably helping dissident movements, the alt-right, up until 2016-ish when, I mean, I got banned from Twitter in 2016, and then I was allowed back on.
00:10:13.120 They took away my other accounts.
00:10:15.000 They basically said, you can just have this one.
00:10:16.900 And in 2017, basically, you know, a ton of bricks were falling down on us in the sense of mostly getting banned from payment platforms, getting kicked off bank accounts.
00:10:30.520 I've been kicked off a dozen bank accounts of large and small banks.
00:10:36.260 And they just put us in this digital gulag, and, you know, we benefited from it.
00:10:47.100 We went into this digital prison, and we're like, oh, look how fun it is.
00:10:50.400 Look, the water works.
00:10:52.440 We've got all these utilities.
00:10:53.700 There's a cool big screen TV.
00:10:55.460 My prison is great.
00:10:56.740 Why would I want to get out?
00:10:57.700 But the problem is we're still in that prison, and they can clamp down upon us any time they want.
00:11:04.660 But I don't really know of a way out.
00:11:08.360 If we had full just, you know, you are not allowed on the internet.
00:11:13.660 We are going to clamp down on anything you do.
00:11:15.900 That might be better in the sense that we would say, okay, we're going to go back to the 90s.
00:11:20.960 We're publishing newsletters and books and so on.
00:11:24.760 That would at least be clear.
00:11:26.480 But where we are now is this ambiguous state of being in a prison where when you're in a prison, you get treated well.
00:11:33.960 You get three square meals a day.
00:11:35.680 You get a little bit of entertainment.
00:11:37.200 You get to go out in the yard and run around a little bit.
00:11:39.460 But you're still in a prison, and that is where we are.
00:11:43.220 And, again, in terms of the Nick situation, I mean, I'll just say it.
00:11:48.220 I cannot stand the guy.
00:11:50.160 I, it's just a bunch of takes that are, like, racist, Republican, like, hateful, mainstream screaming at people.
00:12:02.040 I think it's useless.
00:12:03.900 But I really do have some sympathy for him.
00:12:09.640 And really, we're all in that same exact situation that he is in.
00:12:15.080 And I think we need to kind of have a, you know, have a little bit of solidarity on this issue and make this the one issue that we can actually all agree on.
00:12:24.640 Whether we want to go, I mean, I don't believe in these, you know, liberal notions of free speech, but just in terms of our very survival, we need to demand our ability to exist on these platforms.
00:12:40.820 I mean, I did notice a problem as well, like, when Red Ice got shut down.
00:12:45.720 I mean, you know, Red Ice had over 300,000 subscribers, which is four or five times bigger than Fuentes' audience.
00:12:52.300 But there wasn't really a huge backlash because it's gotten to the stage, and this isn't right, that when someone like Red Ice gets shut down, people kind of go, well, we saw that coming.
00:13:01.080 And they might make noise about it for a day, but then they move on.
00:13:03.820 And, yeah, I mean, if there was any kind of cohesion, I mean, if there was a sizable, you know, whatever amount of people there is in the dissident right, surely there'd be a sizable enough number to pledge abstention from the 2020 election unless Trump did something about this.
00:13:19.180 So I don't know what difference that would make, but, you know, it would be something.
00:13:22.120 And maybe that's the space for a dissident movement now is to sort of kind of quixotically try and meme single issues like that into the public sphere.
00:13:30.980 The problem is people like Nick Fuentes, and to a far lesser degree Red Ice, are pushing against the notion of acting like that.
00:13:41.960 I've known Red Ice for a while.
00:13:43.720 I was interviewed by Henrik, I think, after I was jailed in Hungary in 2014.
00:13:50.640 That was the first time I found them.
00:13:52.520 And I was listening to some of their content.
00:13:54.500 A lot of it's really wild.
00:13:56.020 Some of it I find, you know, interesting but clearly wrong.
00:14:00.980 You know, the moon landing kind of stuff, historical revisionism, etc.
00:14:06.860 But it was at least interesting.
00:14:09.060 I noticed a distinct trend of theirs of talking about demographics, which is the red herring, seemingly pragmatic, but ultimately red herring thing of just playing the optics game.
00:14:22.120 They're a problematic newspaper because on the one hand, they want to be the newspaper that's the radical right newspaper that's rebellious against the leftist establishment and all this kind of thing.
00:14:31.480 And they will go for that, and they will do sensationalist articles that are about that, that are pro-Brexit or whatever, in an extreme way.
00:14:39.460 But then on the other hand, at some point, someone in the hierarchy says, oh, no, we better jump on this bandwagon of political correctness.
00:14:47.360 Otherwise, we'll look bad and it might be bad for sales or whatever.
00:14:50.040 And so then you can get the other extreme, almost like they're overcompensating for the perception that they're this right wing newspaper that you're not allowed to cite on Wikipedia or whatever.
00:14:58.120 They overcompensate by being really, really woke on certain things.
00:15:01.120 So they're a very schizophrenic newspaper in some way.
00:15:04.180 Yeah. And again, I don't really need to go on Red Ice at this point, and I don't have anything against them.
00:15:10.720 But I think it's somewhat telling that I am kind of persona non grata.
00:15:16.360 I mean, whatever. I don't really care.
00:15:18.900 I would add that people have been saying that they never got a strike.
00:15:21.660 That's not true.
00:15:22.580 They did an interview with me on my book, Race Differences in Ethnocentrism, and they got a strike.
00:15:26.660 Right. No, and I absolutely support them.
00:15:30.400 But again, at least where we were at a time in the alt-right was we're demanding free speech.
00:15:39.100 We're even demanding speech for Anglin.
00:15:41.600 And, you know, when you play the optics game, you basically destroy all of your leverage.
00:15:47.180 So if we are a truly dissident movement. People call themselves the dissident right, which is weird.
00:15:53.720 It's this kind of funny way to be more PC or something.
00:15:56.660 But anyway, if we were a real dissident movement, we would say this.
00:16:01.240 We talk about all sorts of issues, but this is this is where the rubber hits the road and we will not vote for you.
00:16:07.540 We will condemn you unless you do something about this.
00:16:10.500 But again, the alt-right played the optics game in 2017 and 2018, particularly in 2018.
00:16:19.080 I think it's actually lightened up in 2019 and early 2020.
00:16:23.060 But we played that game and we removed all of the leverage we have.
00:16:27.180 So it's all about us being American nationalists, or we're just conservatives.
00:16:31.520 Why are they calling us Nazis, whatever?
00:16:33.460 And what we did by that was destroy our leverage.
00:16:36.680 I think we should say we are radical.
00:16:39.440 We are saying things that go against the grain.
00:16:43.140 We aren't just normal, you know, in the worst sense of that word.
00:16:48.180 And but we demand our existence online.
00:16:51.240 And that's how we could play this game.
00:16:53.080 I think it's very hard to play that game at this point.
00:16:55.500 And with the movement such as it is, also, I mean, I think it's clearly an ever-present danger to democracy
00:17:04.380 if certain extra-governmental elements, such as trade unions in the 60s and 70s in the UK, such as the multicultural lobby, whatever, become too powerful.
00:17:17.980 It's clearly a danger to democracy.
00:17:19.420 That's why in England we have the Monopolies and Mergers Commission, because the attitude is that if a company becomes too powerful, has too much of a monopoly on the market, it's a danger to a democratic free society.
00:17:30.580 And so it has to be broken up. For some reason in Finland,
00:17:33.300 they don't seem to have a Monopolies and Mergers Commission.
00:17:35.580 And so a vast percentage of shops in this country are part of a duopoly of two groups, the S group and the K group.
00:17:43.620 It's vast.
00:17:44.440 Almost all supermarkets you're going to go to, apart from Lidl, and that only came in about eight or nine years ago, are part of it.
00:17:52.180 And this is the thing. YouTube and Twitter and Facebook are sufficiently powerful.
00:17:57.680 They are a danger to democracy.
00:17:59.540 Clearly.
00:18:00.160 And so I think there is a very good case for anybody, whether you are left wing and you object to the capitalism, whatever element, or whether you are right wing and you're in favor of free speech.
00:18:10.100 That these things should be broken up, or they should be compelled in America to comply with American law on free speech.
00:18:17.560 I mean, YouTube banned the speech by Rand Paul.
00:18:21.340 I don't know, was it today or yesterday, the one he made in Congress?
00:18:24.400 It's like, this is just absolutely ridiculous.
00:18:26.520 And the people like Elizabeth Warren and a couple others who have demonized Mark Zuckerberg in particular and talked about monopolistic practices and so on have been doing really the worst possible thing. They have the worst take on this.
00:18:42.520 Which is that you're a monopoly, and we demand that you do more censoring.
00:18:50.480 And I actually kind of get it in the sense of like outright fake news.
00:18:56.920 It's just like there are people who, particularly in 2016, who were just putting out clearly incorrect news items and so on.
00:19:06.480 You know, the Pope has endorsed Donald Trump.
00:19:09.420 That was a famous one.
00:19:10.420 Or, you know, there's the Pizzagate thing.
00:19:13.780 There are all these things that, you know, I kind of get where they're coming from.
00:19:18.080 But their, again, their general impulse is you need to censor more people, you know, or else we're going to go after you as a monopoly.
00:19:26.560 Whereas the only way for this to work for us is something like an opposite approach, which is basically saying you are not just some company.
00:19:38.180 You are using the infrastructure of the government, which is the internet, period.
00:19:45.400 And you are more than just some company.
00:19:49.040 You are the lifeblood of free speech.
00:19:52.660 And therefore, you are a public utility.
00:19:55.960 You can still profit off this in the way that, you know, Comcast or an IP provider profits or a telephone company profits.
00:20:05.400 But you actually have to, you know, adhere to these regulations.
00:20:09.820 I would submit, I admit, that people have commented on a socialist side to me.
00:20:15.960 That socialist side to me comes out when I'm in America.
00:20:18.600 And I can't believe the lack of, you know, there's no health service, for example.
00:20:21.860 But you can pay waiters sub-minimum wage and expect that that's dealt with by voluntary tips.
00:20:28.780 I think these things are outrageous.
00:20:30.260 But certainly...
00:20:31.860 Particularly when you're eating with Ed, who, you know, would rather be waterboarded than give someone a tip.
00:20:38.840 I don't, but I made it clear, I don't like tipping.
00:20:41.840 I'm happy for them to just add it to the bill, add it to the costs, just add 30% to the cost of the items on the menu.
00:20:50.560 Yeah, I do basically agree with it, yeah.
00:20:52.900 It's so obvious, it's obvious.
00:20:54.760 Anyway, I don't. And with tipping, the thing is what they want you to tip for as well.
00:20:58.500 It's not just like restaurants, they want you to tip cleaners in hotels.
00:21:02.060 Yeah.
00:21:02.320 Oh my goodness.
00:21:03.220 And then you find in Europe, then, they try and gradually introduce tips so that then they can eventually lower the wages.
00:21:10.280 That's true.
00:21:10.980 No, don't tip.
00:21:12.260 I mean, you don't tip certain people.
00:21:15.480 Oscar Wilde wrote an essay against charity.
00:21:18.020 You know, don't give to charity because then you're just, you're sustaining these people and it's the system that's putting them on the street.
00:21:23.980 Leave them to starve and then, you know, let the system's guilt sort of take care of it systematically.
00:21:28.660 So, you know, don't tip.
00:21:29.880 An excellent idea.
00:21:30.840 There was one person once when I was stopped on the motorway and he was trying to collect for charity.
00:21:34.720 I said, no, I don't want to give to your charity.
00:21:37.160 He said, what charities do you normally give to?
00:21:39.320 I said, I pay income tax.
00:21:41.060 What are you talking about?
00:21:42.920 I assume the government is confidently spending my money.
00:21:45.560 But no, it's totally happening.
00:21:46.880 I think this is literally a scene from Reservoir Dogs that we are reenacting.
00:21:50.160 It was also with the tipping.
00:21:52.820 It was always 10%.
00:21:55.020 The tip was always 10%.
00:21:57.120 And now you have these places in London.
00:21:58.820 About 20 years ago, it was a 12% tip.
00:22:01.920 A 12% voluntary tip will be added to your bill.
00:22:05.440 So you have to tell them to remove it.
00:22:07.380 And then it became a 12.5% tip.
00:22:09.640 And then a 13% tip.
00:22:11.280 In America now, you're expected to pay a 20% tip.
00:22:13.880 Yes.
00:22:14.600 When I did these conferences at the Reagan building, which were really great but very expensive, they would give you the price and then they would tell you the tip.
00:22:25.280 And then it's kind of like, what is that?
00:22:28.780 So that 20% is just built in and it's non-voluntary.
00:22:33.780 It is legally voluntary.
00:22:38.840 It's legally voluntary.
00:22:39.980 But you have to cause a load of fuss.
00:22:42.420 Like if you go to these museums in New York.
00:22:43.780 Hire a lawyer or something to not pay a tip.
00:22:46.280 I mean, you're not going to do that.
00:22:47.680 They're all free, these museums in New York.
00:22:49.200 But they make out you've got to pay.
00:22:50.720 But they're free.
00:22:52.060 But anyway, moving off.
00:22:53.260 I would say certain things should be in a complex society, should be under the ownership of that society.
00:22:59.840 One of them is railways.
00:23:00.960 It seems to be perfectly logical that a railway system that spans the whole society should be run by that society.
00:23:05.940 I think it was crazy that they privatised the railways in the UK.
00:23:08.380 The Post, that should be nationalised.
00:23:10.320 Probably even the telecom system should probably be nationalised.
00:23:13.400 And I think that when these kinds of institutions have such a monopoly, they should be compelled to abide by the laws of the society, i.e.
00:23:24.480 you could only be chucked off YouTube if there's been a police complaint,
00:23:28.460 if you've literally incited murder or engaged in what a court would deem illegal.
00:23:33.580 And Nick did not do that.
00:23:36.060 However kind of edgy, I think he talked about a car accident with someone.
00:23:41.160 I mean, however kind of distasteful it was, it is not something that would be illegal in a public space.
00:23:48.920 So it is illegal in a public space for me to go up to someone and start just yelling at them and harassing them and telling them I'm going to kill them and whatever.
00:23:58.440 That is harassment.
00:23:59.880 That is illegal.
00:24:00.800 If you do that on YouTube, if Nick or anyone does that on YouTube, they should be actually at the very least banned for a time or have that video removed.
00:24:10.300 But what he did was simply not, did not rise to that level.
00:24:16.520 It might have risen perhaps to a TOS level.
00:24:19.780 But the fact is, the terms of service, no one reads them.
00:24:22.780 They're inherently ambiguous.
00:24:24.260 They can be enforced at the whim of a, you know, middle management administrator.
00:24:29.060 It's just, it's not workable.
00:24:31.500 And just the simple solution is to say what you can legally do on a public sidewalk, you should be able to do online.
00:24:40.420 So I had this video, the first one, the first time I got a warning, was I did a video that was a critique of this ludicrous book by Angela Saini, which all I did was start off by doing an impression of her, which I think is perfectly reasonable.
00:24:54.400 She's a public figure.
00:24:55.400 Why the hell shouldn't I do an impression of her?
00:24:56.800 I did an impression of her, and then I just went in detail, in tremendous detail about everything that was wrong with her book, and then I wrapped up in actually quite a nice way.
00:25:05.400 But this groupie of hers called Jess Wade, who's a Jewish physicist or something at Imperial College London, and is really like a lover, really loves this Angela Saini woman, a real kind of worshipper of her, wears a T-shirt on her Twitter picture with the name of Saini's book on it.
00:25:24.300 And when it just got released, she tweeted, this is disgusting.
00:25:28.020 When someone uses the word disgusting, you know there's something going on.
00:25:30.620 This is disgusting.
00:25:32.060 How could this be?
00:25:32.900 She said, this is hate speech.
00:25:34.680 It's literally inciting hate, she said.
00:25:38.680 She said it should be illegal.
00:25:40.680 It should be illegal, this video, and all this, and pressured YouTube to take it down.
00:25:45.080 And they did.
00:25:45.460 And videos that were far more edgy than that, like one where I did an impression of a Somali sort of chic, or far more edgy, stayed up.
00:25:55.260 And that video was monetized by manual review.
00:25:58.460 This Angela Saini video, before they took it down, it was flagged as being a bit dodgy, because it had the word race in it or something.
00:26:04.400 And then it was monetized after manual review, supposedly.
00:26:07.200 And I said to YouTube, I don't understand this.
00:26:10.240 You're saying that it's okay to have advertising.
00:26:13.060 It's suitable for advertising, but it's hate speech.
00:26:17.400 How can it be both?
00:26:20.080 And the bimbo that I was corresponding with, the person that worked there, said, oh, the community guidelines rules are different from the monetization rules.
00:26:28.260 And I was like, yes, but you're saying that under your rules, something can be okay for advertisers to want to advertise on it, but also be hate speech.
00:26:38.100 It's insane.
00:26:39.640 I had a shitposting page on Facebook a few years ago that was shut down four different times and then reinstated four different times on appeals.
00:26:47.780 So, like, it's a really Kafkaesque reality where you don't know who's making the decision.
00:26:53.220 You don't know what the guidelines are.
00:26:55.160 You don't know why it's been reinstated.
00:26:56.820 You don't know if it's going to be gone again.
00:26:58.880 And it's just this constant sort of paranoia and nervousness.
00:27:02.940 I had a video.
00:27:05.280 Sorry.
00:27:05.580 I had a video that they demonetized, that they flagged.
00:27:08.520 I appealed.
00:27:09.620 They said the appeal is rejected.
00:27:12.080 We're going to demonetize your video.
00:27:13.720 And then a month later, it was monetized.
00:27:16.280 Yeah.
00:27:16.580 The other thing is that it's permanent.
00:27:18.460 And I actually think this is so much worse than crimes that you might commit in the physical world.
00:27:25.780 So, if I go steal a car and I get caught for it, I have a trial that is public and that is explicit.
00:27:35.400 I can serve a jail sentence with good behavior.
00:27:38.740 I could lessen that jail sentence.
00:27:40.440 And then afterwards, I might be kind of haunted by this crime for the rest of my life.
00:27:46.100 But I can live as a citizen.
00:27:47.720 I can vote.
00:27:48.520 I can hold a job.
00:27:49.460 I can do whatever.
00:27:50.640 You get banned from social media.
00:27:52.980 I have not... I don't... this is apparently a lifetime ban.
00:27:59.200 I mean, there's nothing you can really do to get back outside of pulling a full Christian Picciolini and, in Christian's case, you know, just lying and kissing ass and so on.
00:28:11.780 Um, and just totally apologizing and going in the total reverse way.
00:28:17.120 But I couldn't do a video—let's say I was a little bit tipsy or angry or whatever and I did something that was stupid.
00:28:25.620 I would simply be banned for life for that one little instance, whereas you are not just removed from society if you steal a car or do drunk driving or whatever.
00:28:38.860 You have to murder someone in cold blood in order to be removed from the world.
00:28:45.280 But that's what hate speech is.
00:28:46.920 It's like murder, effectively.
00:28:48.880 Yeah, but it's true.
00:28:49.580 But that's true.
00:28:50.420 It's a—you've got to think about it in a historical context.
00:28:53.020 What we're dealing with is a replacement religion.
00:28:54.940 What we're dealing with, to a certain extent, is Christianity, the second religiousness that Spengler talks about, which takes aspects of the first religiousness and revives them in some mutated form.
00:29:08.980 And the worst thing is that with this, it takes away the God.
00:29:12.020 It takes away the moral God that's telling you to act in this moral way and not in this moral way.
00:29:17.080 And it creates this kind of moral realism where you worship certain concepts and you're either in the group or you're not in the group.
00:29:23.360 And you will know implicitly that something's hate speech or it's not.
00:29:26.640 That's why they don't need to define what hate speech is, because if you are—it's a marker of group membership.
00:29:31.800 Christian fundamentalists, Pentecostals, will talk about, you know, I think the Holy Spirit is not with him.
00:29:38.660 And it's just—it's a phrase I've heard used in my field work.
00:29:41.520 And it's just something they know.
00:29:42.960 They implicitly know that person's not really a Christian.
00:29:46.200 He's not really one of us.
00:29:47.260 He's just doing it to get status or whatever.
00:29:49.300 Be careful of him.
00:29:50.240 The Holy Spirit is not with him.
00:29:51.400 It's nothing that you can articulate why.
00:29:53.380 It's just an instinct.
00:29:54.880 When I visit fundamentalist Christian sects, they hand me a snake.
00:29:59.860 They're like, this guy is one of us.
00:30:03.340 So I've never experienced what you've experienced.
00:30:07.160 No, I'm not saying they've said that the Holy Spirit is not with me.
00:30:09.200 I've said they've commented—
00:30:09.820 Well, maybe it's because I'm speaking in tongues at the time.
00:30:12.600 Yeah, you're speaking in tongues, and you're handling a rattlesnake.
00:30:14.720 So they kind of—you're kind of indicating you're—
00:30:17.140 I was at a snake-handling church in a place called LaFollette, Tennessee.
00:30:21.080 We went down there.
00:30:22.180 There was a 22-year-old father of five, Andrew Hamblin, who—
00:30:26.020 I think I've told you this story, haven't I?
00:30:27.580 I don't think so.
00:30:29.660 We wanted to go to a place called Jolo, West Virginia, to the snake-handling church there,
00:30:33.460 but we couldn't, because shortly before we were about to go there,
00:30:35.620 the pastor was bitten to death by snakes.
00:30:37.280 And then—
00:30:39.280 That's the way to go, I guess.
00:30:41.760 So I had to kind of make a sudden rearrangement in my itinerary.
00:30:44.020 So we didn't go to West Virginia, really, at all.
00:30:46.600 We went down to this place, LaFollette, Tennessee, middle of the countryside,
00:30:49.700 past trailer parks, Confederate flags in the windows, things like this.
00:30:53.540 And I said to this Andrew Hamblin guy,
00:30:54.800 did you know this guy—I can't remember his name—who died in Jolo?
00:30:58.840 And he goes, yeah, he was my best friend.
00:31:01.560 And then I said, are there any rules here?
00:31:03.780 He said, we only ask you to stay off the hardwood.
00:31:05.660 And my friend, who's from New York, he went and sat at the back, little pussy.
00:31:10.120 And I said, come on, I'm up for doing this.
00:31:12.460 If the Holy Spirit inspires me, I'm up for doing this.
00:31:15.500 And, you know, that would be interesting.
00:31:17.200 But no, the only person that handled the snakes was this Andrew Hamblin guy.
00:31:20.200 He'd been bitten twice.
00:31:21.320 He'd lost the use of a finger on one hand,
00:31:23.700 and he'd also been bitten on the back of the head.
00:31:24.940 And he said it was because when he handled the snakes,
00:31:27.380 he wasn't quite sure that it was God's will.
00:31:30.140 Right.
00:31:30.440 And then, of course, he was bitten.
00:31:32.300 So I'm imagining some fantastical image of like,
00:31:36.720 let the Holy Spirit be with you.
00:31:38.500 We've got to support Israel.
00:31:40.320 And their crusade against the Persian terrorists.
00:31:44.480 Oh, God, Jesus, these snakes!
00:31:47.520 That would be, yes.
00:31:50.820 But maybe both Trump and Bloomberg should have to handle snakes in the debate.
00:31:54.340 We might ironically get banned for this.
00:31:57.440 I'm not wishing the death on anyone.
00:32:00.280 That was a comedic sketch.
00:32:03.720 But anyway, the point is, we're dealing with a religious group.
00:32:07.960 You implicitly know, you just know how to behave or you don't.
00:32:11.980 And so hate speech is a marker that you don't know how to behave.
00:32:14.820 You're not part of their group.
00:32:15.820 You're not part of the religion, and they want you out.
00:32:18.080 It was interesting, this chap Adam Rutherford,
00:32:19.760 who's this woke anti-scientist who does PC science.
00:32:23.600 He's just published a book called How to Argue with a Racist,
00:32:26.960 which is not the kind of book you'd expect scientists to publish, really.
00:32:31.640 It's a science hack, really, which I'm looking forward to reading.
00:32:35.280 But he noted on Twitter that, oh, it's a bit of a red flag
00:32:39.860 when an academic uses a non-university email address on papers,
00:32:43.460 i.e., the person is an independent academic.
00:32:46.180 They're not working at a university.
00:32:47.760 Why is it a red flag?
00:32:48.800 Well, obviously, because these days, working at a university
00:32:51.140 is evidence that you are part of the religion.
00:32:53.880 It's evidence that you either pay lip service to the religion
00:32:55.920 or you genuinely believe in it.
00:32:57.580 So, of course, it's a red flag if you use your own email address.
00:32:59.720 It means you don't work for a university,
00:33:01.000 and you might be advocating truth that is not limited by religious truth.
00:33:05.440 Right.
00:33:05.780 So, by the new religious truth, by the multiculturalist truth,
00:33:09.740 the moral truth.
00:33:12.060 And so that's what we're dealing with here.
00:33:14.180 And so that's why they can do these things.
00:33:16.660 It's a religious cult which is running a church, really.
00:33:20.560 That's what YouTube is, kind of a big cathedral
00:33:22.460 in which the infidels are permitted to enter.
00:33:28.820 But if those infidels break the rules by, you know,
00:33:32.520 besmirching the shrine or whatever,
00:33:34.380 then they will be removed forever.
00:33:36.360 They will be burned, indeed.
00:33:37.820 It's unforgivable.
00:33:39.100 And that's where the comparison you made to murder fits in.
00:33:41.360 Heresy is worse than murder.
00:33:43.220 Heresy was always worse than murder.
00:33:44.660 Murder was a felony.
00:33:46.720 The sentence under English law for murder
00:33:48.120 was the same as the sentence for stealing:
00:33:49.520 you'd be hanged.
00:33:50.740 But the sentence for heresy, you're burned at the stake.
00:33:54.260 Tortured to death, yeah.
00:33:55.700 Tortured to death.
00:33:56.600 And it's to death.
00:33:59.560 So I think it makes sense.
00:34:01.160 Permanent destruction and no going back
00:34:03.820 because you're a heretic.
00:34:06.820 Well.
00:34:11.640 We're the heretics.
00:34:13.800 Well, yeah, I'm the jolly heretic, you know.
00:34:16.160 Various different.
00:34:18.000 We have the Irish heretic and the American heretic,
00:34:22.980 the southern heretic, Texas heretic.
00:34:24.980 So, but yeah, this is it.
00:34:27.940 You know, I think.
00:34:28.600 So it's terrible what's happened to him,
00:34:29.820 but he's been burned.
00:34:31.660 He's been burned.
00:34:33.340 Yeah.
00:34:33.820 And optics won't save him.
00:34:35.360 I mean, that's kind of also the lesson.
00:34:37.240 And his optics were god awful to begin with.
00:34:40.060 But it's always kind of a joke.
00:34:41.480 But anyway, it is definitive proof
00:34:45.300 that even being, ultimately telling your followers
00:34:48.160 to support Republicans won't help you either.
00:34:52.280 Yeah.
00:34:52.720 It's about those times where he has, ironically
00:34:55.560 or unironically, spoken something true.
00:34:58.940 And that is unforgivable.
00:35:01.800 Dude.
00:35:02.840 I mean, when you, when you look at the size
00:35:04.860 of these tech giants and the power to have it,
00:35:06.900 and you hear some of the sort of mainstream discourse
00:35:09.640 around these issues,
00:35:11.560 it kind of shows how outdated our approach
00:35:14.520 to these things is.
00:35:15.580 Like, you know, when you hear people talking about
00:35:17.100 their private property rights or whatever,
00:35:19.520 or the fact that they, you know,
00:35:20.600 they acquired this power fairly or whatever.
00:35:22.880 It's like, you know, so much of these companies
00:35:24.880 was just timing and kind of getting there first.
00:35:27.720 Like, I mean, there's not anything especially superior
00:35:30.720 in, you know, the layout of the Facebook website.
00:35:34.220 I mean, maybe they're...
00:35:34.920 To Twitter's technology or something.
00:35:37.020 It's, it's, it's technology that was around for a decade.
00:35:40.680 It's blogging effectively.
00:35:41.960 It's just kind of done.
00:35:42.460 Yeah.
00:35:42.700 And even then it's like,
00:35:44.240 maybe you have the right to capitalize on that
00:35:46.060 for a few years,
00:35:46.760 but like two decades after to still just be having a monopoly
00:35:50.680 and just endlessly accumulating on the basis of,
00:35:53.500 you know, doing well 20 years ago
00:35:56.480 with that business model.
00:35:57.300 And then, again, the separation as well of public and private.
00:36:01.480 I mean, like the internet basically came out of the Pentagon.
00:36:04.220 And, you know, there's this fine division then,
00:36:06.700 that anyone, you know,
00:36:09.080 any Mark Zuckerberg that comes along
00:36:10.760 and becomes a multi-billionaire out of it,
00:36:14.400 that that's entirely his property rights, whatever.
00:36:17.480 I mean, I just think of the scale,
00:36:19.680 the scale these things are moving up at,
00:36:20.960 even looking at someone like Mike Bloomberg,
00:36:22.860 like the power he has to shape culture,
00:36:24.460 like he's put 300 million into targeting social media
00:36:28.140 and spreading memes and all this stuff.
00:36:30.140 Like, just the whole
00:36:32.860 sort of models we have to discuss this with
00:36:35.120 just seem so outdated now.
00:36:37.160 Like, I kind of,
00:36:38.300 that's why I support a wealth tax,
00:36:39.800 because there's, you know,
00:36:41.840 there's a stage of accumulation now
00:36:44.700 where it's a smaller and smaller elite,
00:36:47.020 but their, you know,
00:36:47.620 their wealth and power is just beyond any scale conceivable.
00:36:51.480 And the power it's given them is just insane.
00:36:54.760 What about a national maximum wage?
00:36:58.060 No.
00:36:59.040 I don't think it's necessarily the wage that's the issue.
00:37:01.940 It's not,
00:37:02.760 it's not even the income,
00:37:04.100 it's the wealth in that sense.
00:37:07.580 Yeah.
00:37:09.380 Just the potential for sort of endless accumulation.
00:37:14.220 Yeah.
00:37:14.740 I just,
00:37:15.680 I think ultimately,
00:37:16.520 I mean,
00:37:16.660 this is a much bigger issue,
00:37:18.080 but I think we need to get away
00:37:21.020 from having a wealth-based,
00:37:22.740 middle-class elite.
00:37:24.380 I mean,
00:37:24.500 I think that is kind of the root of the problem.
00:37:27.740 In the Aryan tripartite society,
00:37:30.900 there are those who fight,
00:37:32.220 those who work,
00:37:33.020 and those who pray.
00:37:34.460 And effectively,
00:37:35.660 we have those who work,
00:37:37.160 a kind of bourgeois elite,
00:37:38.940 and those who pray working together.
00:37:41.480 There does have to be an elite that's outside of that,
00:37:44.720 that's about blood and iron.
00:37:48.080 And we don't have that.
00:37:49.520 I mean,
00:37:49.720 the military people,
00:37:51.140 I mean,
00:37:51.320 they're tools of the system.
00:37:53.660 I mean,
00:37:53.860 they're not guarding anyone.
00:37:57.140 And we need to ultimately cultivate that.
00:37:59.580 But that's a,
00:38:00.420 you know,
00:38:01.460 intergenerational project that will,
00:38:04.860 if it is completed,
00:38:06.180 will be completed after our lifetimes.
00:38:10.880 I mean,
00:38:11.440 even if you look at the university system,
00:38:13.840 you can count on one hand,
00:38:15.520 like,
00:38:15.860 you know,
00:38:16.220 you'd think academics having tenure,
00:38:18.760 that there'd be more people maybe speaking out.
00:38:20.820 But I mean,
00:38:21.200 you know,
00:38:21.440 you can count them on one hand,
00:38:22.620 like Kevin MacDonald and Ricardo Duchesne.
00:38:25.600 Think of the social pressure you're under though.
00:38:28.620 You're working in these departments,
00:38:30.480 and you have to work with these people.
00:38:31.780 Do you want to go into,
00:38:32.540 even if you've got tenure,
00:38:33.620 do you really want to go in every day and be hated?
00:38:36.900 Yeah.
00:38:37.040 And everyone,
00:38:37.500 yes.
00:38:38.000 I mean,
00:38:38.180 that's what's the problem.
00:38:39.560 You'd hope there'd be a few academics that would put,
00:38:42.580 you know,
00:38:43.120 I would kind of love that actually.
00:38:44.200 The integrity of the work ahead of that.
00:38:46.240 Yeah,
00:38:46.580 there are a few.
00:38:47.340 There would be a minority,
00:38:48.620 like J. Philippe Rushton,
00:38:49.620 that would get a rise out of that.
00:38:51.300 Fine.
00:38:51.680 They'd be happy to do that.
00:38:52.860 In fact,
00:38:53.060 they would enjoy doing it.
00:38:54.540 They would double down.
00:38:55.860 They don't have.
00:38:56.560 But most people,
00:38:58.660 they absolutely don't want that.
00:39:00.440 And as someone,
00:39:01.020 I was in graduate school,
00:39:02.320 and I was hiding out.
00:39:05.180 I did have my views while I was at Duke University,
00:39:08.420 and at the University of Chicago,
00:39:09.640 but I kind of wouldn't let it slip too much.
00:39:12.700 But,
00:39:13.320 you know,
00:39:14.000 as Ed knows this as well,
00:39:15.860 the kind of,
00:39:16.880 I don't know the right word,
00:39:18.340 this just mundane,
00:39:20.520 unbearable totalitarianism,
00:39:22.420 of the faculty meeting,
00:39:24.140 all of these just pudgy people
00:39:26.620 with ponytails sitting around
00:39:28.640 talking about minutiae for two hours,
00:39:31.800 it's just unbearable.
00:39:34.500 I mean,
00:39:34.720 how you even get through that is something.
00:39:38.240 And if you're a heretic in that system,
00:39:40.580 they will just,
00:39:41.740 they'll kill you by a thousand cuts.
00:39:43.680 They'll just slowly eat you alive.
00:39:46.800 My PhD supervisor...
00:39:48.520 It's unbearable for someone who's a heretic.
00:39:51.320 My PhD supervisor was a New York rabbi.
00:39:54.260 Mm-hmm.
00:39:54.780 So,
00:39:55.640 you know,
00:39:56.080 and he was very, very,
00:39:57.680 he was a Marxist.
00:39:58.440 He was a New York rabbi
00:39:59.420 and a Marxist.
00:40:00.460 Makes sense.
00:40:01.280 He was also a reasonable guy.
00:40:03.780 He was one of those leftists
00:40:04.920 who he wouldn't agree with you,
00:40:06.420 and he wouldn't go mad and emotional
00:40:07.800 if you criticized the orthodoxy.
00:40:10.920 Right.
00:40:11.020 And the problem is,
00:40:12.100 particularly, I think,
00:40:12.560 with the rise of women in universities,
00:40:14.080 because they are less emotionally stable than men,
00:40:16.880 this is increasingly a problem.
00:40:19.000 And that's why I really am pushed towards the view
00:40:21.940 that something needs to change,
00:40:22.800 their numbers need to be reduced,
00:40:23.540 they're ruining higher education,
00:40:25.200 and something needs to be done about it.
00:40:26.600 Something very serious needs to be done about it.
00:40:27.940 It's just the feminization.
00:40:29.400 I do wonder if it was only men in academia.
00:40:32.340 I think it'd be a lot different in that sense.
00:40:37.820 Men also like the confrontation more.
00:40:41.720 They would like the,
00:40:43.220 oh, let's actually get in a debate with a fascist.
00:40:45.880 This will be really,
00:40:46.620 sparks will fly.
00:40:48.060 Women do not tolerate that.
00:40:50.340 They want to attack you and silence you
00:40:53.700 and be passive aggressive and so on.
00:40:55.480 What I always found-
00:40:56.900 I do think it would be better.
00:40:57.860 I don't think it would be a total solution
00:40:59.280 because this is an ideological thing,
00:41:01.260 but it would certainly be a lot better.
00:41:04.080 I mean, I always found in college
00:41:06.980 doing social sciences,
00:41:09.360 it was like 90% women
00:41:11.520 in that like 110, 120 IQ range
00:41:14.760 that just kind of were capable enough
00:41:17.600 to repeat back what they could figure out
00:41:19.480 their lecturer would want them to say.
00:41:21.580 And they just kind of happily coasted through
00:41:23.660 for four years like that
00:41:24.780 and then went into some sort of
00:41:26.280 mid-level management or graduate job.
00:41:28.520 And then, you know,
00:41:29.140 there are a few men there as well
00:41:30.640 that kind of repeat the same thing
00:41:32.500 to impress the overwhelming majority of women.
00:41:36.180 Yeah.
00:41:36.480 Yeah.
00:41:36.720 I remember the last time-
00:41:38.260 Real quick, I'll let-
00:41:40.160 The last time I was in an academic environment,
00:41:43.260 strictly speaking,
00:41:44.380 was actually 2018
00:41:45.500 and it was at Georgetown.
00:41:47.340 And I was still in the DC area at that time.
00:41:50.840 And I went to hear this lecture
00:41:52.440 at Georgetown on the alt-right.
00:41:54.640 And I guess they were probably
00:41:55.800 a little bit surprised that I arrived there.
00:41:57.800 But there was this pudgy woman
00:42:00.160 who came up and she had a really round head
00:42:03.100 and she just kind of waddled up
00:42:05.460 and she was saying,
00:42:06.760 she was just saying all of this nonsense
00:42:09.260 to introduce everything.
00:42:10.740 Just stuff that just,
00:42:12.180 I can't even repeat it
00:42:13.580 because it doesn't make sense.
00:42:15.080 It's, you know,
00:42:15.780 it's just all of these buzzwords
00:42:17.580 and non-logical connections
00:42:21.080 between concepts and so on.
00:42:23.720 And she looked so much like
00:42:26.860 an American Puritan type.
00:42:29.640 Just this boring,
00:42:31.960 not good-looking,
00:42:33.820 not dynamic person
00:42:35.460 just spouting off dogma.
00:42:37.900 And I was just thinking to myself,
00:42:39.600 she is that type.
00:42:40.940 Her great-great-grandmother
00:42:42.300 was doing the same thing
00:42:43.800 at her Presbyterian church
00:42:45.340 that she's doing now,
00:42:46.940 which is basically maintaining
00:42:49.220 ideological order
00:42:51.380 while being just goofily incoherent
00:42:54.940 while doing it.
00:42:55.840 And just being a total mediocrity
00:42:58.280 and kind of benefiting
00:43:00.160 from your own mediocrity,
00:43:01.600 which is something that I despise.
00:43:04.880 That's the brilliance of Jim Goad's concept
00:43:06.800 of the new church ladies.
00:43:08.720 Yeah.
00:43:09.920 It's exactly true
00:43:11.620 and it can be sort of proven,
00:43:14.120 really,
00:43:14.380 through historical data,
00:43:15.460 that women were more religious
00:43:18.380 than men
00:43:19.100 until the 50s.
00:43:22.220 At that point,
00:43:23.920 or the 60s,
00:43:24.940 at that point,
00:43:26.120 due to the undermining
00:43:28.480 of the traditional culture,
00:43:30.340 the church began to collapse.
00:43:32.980 They were no longer inculcated
00:43:34.500 with religion
00:43:35.420 and then they started
00:43:36.940 to be inculcated
00:43:38.000 with feminist ideas,
00:43:39.220 essentially anti-religious ideas.
00:43:41.000 And now women
00:43:42.100 are less religious,
00:43:44.700 they're more left-wing than men.
00:43:49.540 They used to,
00:43:50.040 I've got this bit mixed up,
00:43:50.920 sorry,
00:43:51.240 women were more right-wing than men
00:43:52.940 until the 50s.
00:43:54.580 And that was because
00:43:55.540 they were more religious than men
00:43:56.760 and the religiousness
00:43:57.680 was a traditional religiousness
00:43:58.800 that was promoting
00:43:59.520 right-wing traditional things.
00:44:01.440 With the collapse
00:44:02.280 of this religiousness,
00:44:03.460 they're still more religious
00:44:04.420 in the sense that
00:44:05.600 they're more likely to believe
00:44:06.300 in God or whatever,
00:44:06.900 but they're now more left-wing
00:44:08.140 than men.
00:44:08.660 So that same religious impulse
00:44:11.180 of sort of certainty
00:44:13.580 and social stuff
00:44:15.120 being more important than truth
00:44:16.520 and the groupishness
00:44:17.860 being more important than truth
00:44:18.980 has been taken
00:44:19.860 from being a Puritan,
00:44:21.260 like this woman's
00:44:21.960 great-great-grandmother,
00:44:23.020 to being a woke person.
00:44:24.620 Yes.
00:44:25.280 And that's the brilliance
00:44:26.640 of the concept
00:44:27.140 of the new church lady
00:44:27.980 because it's exactly
00:44:28.660 what she is,
00:44:29.180 the new church lady.
00:44:30.340 Yes.
00:44:30.860 It shows you how this happens.
00:44:34.260 Let's put a bookmark in it
00:44:35.440 because I think we could
00:44:36.080 go all day on this one.