TRIGGERnometry - June 19, 2024


Andrew Tate, Elon Musk, AI Girlfriends and the Immigration Crisis - Ashley St. Clair


Episode Stats

Length: 1 hour and 2 minutes
Words per Minute: 188.95796
Word Count: 11,875
Sentence Count: 830
Misogynist Sentences: 42
Hate Speech Sentences: 55


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode of TRIGGERnometry, we sit down with writer and podcaster Ashley St. Clair to discuss her new book, "The Dark Side Of: Andrew Tate's New York Times Opinions," and why she thinks the right wing is obsessed with Andrew Tate.
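
The page does not show how this summary was produced beyond naming the checkpoint, so here is a minimal, hypothetical sketch of regenerating it with the Hugging Face transformers library. The transcript path, the truncation, and the generation lengths below are assumptions rather than the site's actual settings.

    # Hypothetical sketch: regenerate the episode summary with the cited checkpoint.
    # Only the model name comes from this page; everything else is an assumption.
    from transformers import pipeline

    summarizer = pipeline(
        "summarization",
        model="gmurro/bart-large-finetuned-filtered-spotify-podcast-summ",
    )

    # Placeholder path. BART-large only sees ~1024 tokens, so a real run would
    # summarize the transcript chunk by chunk instead of truncating like this.
    transcript_text = open("transcript.txt", encoding="utf-8").read()
    result = summarizer(transcript_text, truncation=True, max_length=128, min_length=32)
    print(result[0]["summary_text"])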

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
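
For readers curious how the per-sentence counts in the stats above could be reproduced, here is a minimal, hypothetical sketch using openai-whisper and the Hugging Face transformers library with the two checkpoints named above. The audio file name, the use of Whisper segments as stand-ins for sentences, and the positive label names are assumptions, not documented details of this site's pipeline.

    # Hypothetical sketch: transcribe the audio, then count flagged sentences.
    # Only the model names come from this page; the rest is an assumption.
    import whisper
    from transformers import pipeline

    # 1. Transcribe the episode audio with Whisper's turbo model.
    asr = whisper.load_model("turbo")
    transcription = asr.transcribe("episode.mp3")  # placeholder file name
    # Whisper segments are not exact sentences; they serve as a rough proxy here.
    sentences = [seg["text"].strip() for seg in transcription["segments"]]

    # 2. Score each sentence with the two cited classifiers.
    misogyny_clf = pipeline(
        "text-classification", model="MilaNLProc/bert-base-uncased-ear-misogyny"
    )
    hate_clf = pipeline(
        "text-classification", model="facebook/roberta-hate-speech-dynabench-r4-target"
    )

    misogyny_count = hate_count = 0
    for sent in sentences:
        # Each call returns a list with one dict: {"label": ..., "score": ...}.
        m = misogyny_clf(sent)[0]
        h = hate_clf(sent)[0]
        # Treating "misogynist" and "hate" as the positive labels is an assumption
        # about these particular checkpoints' label names.
        if m["label"].lower() == "misogynist":
            misogyny_count += 1
        if h["label"].lower() == "hate":
            hate_count += 1

    print("Misogynist sentences:", misogyny_count)
    print("Hate speech sentences:", hate_count)
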
00:00:00.700 Broadway's smash hit, The Neil Diamond Musical, A Beautiful Noise, is coming to Toronto.
00:00:06.520 The true story of a kid from Brooklyn destined for something more, featuring all the songs you love,
00:00:11.780 including America, Forever in Blue Jeans, and Sweet Caroline.
00:00:15.780 Like Jersey Boys and Beautiful, the next musical mega hit is here, The Neil Diamond Musical, A Beautiful Noise.
00:00:22.660 April 28th through June 7th, 2026, The Princess of Wales Theatre.
00:00:27.120 Get tickets at Mirvish.com.
00:00:30.000 Andrew Tate's not the problem, because if you take Andrew Tate away, you still have this massive group of young men who are clinging to that message on shows like the Whatever podcast.
00:00:41.960 They're bringing them on for this humiliation porn.
00:00:44.740 I think you're going to be really surprised at the number of people, the number of men who do have options, who are choosing to have AI girlfriends or live in this digital world.
00:00:53.300 I don't think people realize how historic it is.
00:00:56.020 For him not only to spend $44 billion for free speech, but to have someone who goes and tells Disney and their mountains of money, fuck off, I don't need you.
00:01:07.900 I think that's monumental.
00:01:09.980 Illegal immigration, which is something you've been focusing on a lot, I mean, it's insane.
00:01:14.660 So we have more people, almost more people coming here illegally, just under the Biden administration, than the entirety of Ellis Island.
00:01:22.900 Ashley St. Clair, welcome to TRIGGERnometry.
00:01:24.820 Thank you for having me.
00:01:25.700 I'm excited to finally sit down with you guys.
00:01:27.480 I'm a big fan.
00:01:28.320 We're both big fans of all the stuff that you put out.
00:01:30.780 You're very funny.
00:01:31.820 We met a couple of weeks ago, so we thought we'd sit down and have a conversation.
00:01:34.760 For a woman, right?
00:01:35.380 Funny for a woman.
00:01:36.160 For a woman.
00:01:36.700 Now, you know, this is actually one of the things that we wanted to talk a little bit about.
00:01:41.920 Women.
00:01:42.840 Yeah.
00:01:43.820 Because the online conversation, certainly in our space, about women has kind of gone a little bit crazy.
00:01:52.040 I think you said it really succinctly when you described the sphere.
00:01:55.680 You called it the retardo sphere.
00:01:57.120 Yeah.
00:01:57.340 And I think that's exactly what it is right now.
00:01:59.540 And it's, for whatever reason, so much of the right is devolving into these Andrew Tate-isms, which, good or bad, I just think there's a lot more important conversations we could be having than whether or not it's been a good thing that women vote.
00:02:14.280 And that's...
00:02:14.880 Yeah, well, that's obviously not.
00:02:17.940 But I also think a lot of these people who say, and for those unfamiliar, there's a lot of discourse eating up the right on the 19th Amendment, whether or not the 19th Amendment was a good thing.
00:02:28.820 If we repealed the 19th, if women didn't vote, we'd always have a Republican president.
00:02:32.840 Well, you could apply that to other groups as well, besides women, but it's just acceptable to hate women and say these things about women.
00:02:39.180 Yes, if only white men voted, we would always have a Republican president.
00:02:42.620 But that's not the reality.
00:02:44.100 We're not going backwards.
00:02:45.340 Women vote more than men.
00:02:46.780 That's just the reality of the situation.
00:02:48.620 So I think it's such a pointless conversation, but it's taking up so much of the dialogue that they want to blame women for all of the ails in society, which I just think is absurd.
00:02:59.160 Why do you think it's happening?
00:03:00.840 Two questions.
00:03:01.860 A, do you think it reflects in any way like the normal person on the street?
00:03:06.460 Because I'd probably argue it doesn't.
00:03:09.280 In a way, I want to say it doesn't.
00:03:11.020 I don't think most people would say we should repeal the 19th.
00:03:14.300 But I think...
00:03:14.780 Yeah, I think that's a fair comment.
00:03:16.040 I think most, especially young men, I think if you ask them, who's Andrew Tate?
00:03:20.540 They would know that answer.
00:03:21.740 I think they've seen his content.
00:03:23.140 They've digested this content.
00:03:24.800 And so often when the topic of Andrew Tate comes up, I always say, Andrew Tate's not the problem.
00:03:30.480 Because if you take Andrew Tate away, you still have this massive group of young men who are clinging to that message.
00:03:36.520 Because I think for whatever reason, men do not really have role models or anybody to guide them in any way.
00:03:43.240 So Andrew Tate is the loudest in saying these things.
00:03:46.580 And then I think there's these folks in the right wing like Charlie Kirk who are clinging on to it because they see it's popular and they're grifting off of that narrative.
00:03:54.280 So all of a sudden, Charlie Kirk, who runs one of the most important organizations on the right, you know, they're bringing in millions and millions of dollars.
00:04:02.420 All of a sudden, he's repeating these Andrew Tate-isms that, you know, women should just not work.
00:04:07.500 They should stay at home, even though his organization was built on women sacrificing their 20s and working at his organization.
00:04:14.040 And it's fundamentally absurd, you know, that he's spending his time as someone who's that important on the right to say, well, you know what, women, if you don't get married in your 20s, you're less desirable at 30.
00:04:25.540 This is a man who married his wife at 31.
00:04:28.160 Yeah. And the fundamental thing that I think we can all agree with is when people have these conversations, I go, I think to myself, number one, you're a moron.
00:04:38.800 Let's be brutally honest about it.
00:04:40.440 And number two, you're not a serious person because we have got very serious problems within our society that need dealing with.
00:04:46.900 You've just touched on one of them, the cultural element of young men lacking role models, lacking direction, feeling purposeless.
00:04:55.620 Why are we not talking about that?
00:04:57.160 And why are you talking about the fact that if a woman doesn't get married by the age of 30, she's barren, dry and all the rest of it and fundamentally undesirable?
00:05:05.920 Don't clip that. It's not what I think.
00:05:07.440 Because it's attention porn, right? It's more attractive to watch and very much society hasn't changed that it's really fun to watch a witch be burned at the stake, right?
00:05:18.260 It's really fun to watch this humiliation of women on shows like the Whatever podcast.
00:05:24.960 Jesus Christ wants you to do OnlyFans work, just so that we're clear.
00:05:28.580 Yes.
00:05:28.960 If God wanted you to do OnlyFans, then what would Satan want you to do?
00:05:34.240 To stop doing OnlyFans?
00:05:35.920 Yeah. Why do you think he'd want you to stop?
00:05:37.580 Because I love making sexual content.
00:05:40.120 Yeah, that makes sense.
00:05:41.080 So you think that God, his preference would be that you continue to make sexual content on your OnlyFans?
00:05:46.760 Yes. He wants me to do it.
00:05:49.260 You have these women who they know, you know, they're just OnlyFans models.
00:05:53.380 They're not exactly the brightest bulbs in the bunch.
00:05:55.460 They're not bringing them on for intellectual conversation.
00:05:57.700 They're bringing them on for this humiliation porn.
00:06:00.260 And I think that's become really attractive and rewarded in algorithms.
00:06:03.900 This sort of nastiness, this division of the sexes is rewarded a lot more than talking about what you or I or so many people might find more important.
00:06:13.680 You know, for example, what's something you might find more important than the gender divide?
00:06:16.900 You can name a plethora of things.
00:06:19.380 The issue in this country and the world and the West is not the division of the sexes.
00:06:24.100 We're not going to look back and define this era by the division of the sexes or by the female era.
00:06:30.880 We're never going to call it that.
00:06:32.120 Our times are defined by technology and our innovations.
00:06:38.660 So we should be focusing more on that, what's going on in terms of technology as opposed to whether or not this woman's a three or four or whatever it is.
00:06:48.180 I think that's fundamentally absurd.
00:06:49.960 This is the question I was going to ask you.
00:06:52.100 For me, I've said this for a very long time.
00:06:54.620 And I said this about there was a moment probably five or six years ago when sort of like these really dumb feminist narratives were rampaging through the culture.
00:07:03.340 You know, the air conditioning in office buildings is sexist.
00:07:06.560 And likewise, now when you see the opposite thing where it's like anyone who wants to put a wedge between men and women is not a healthy influence on our society.
00:07:22.020 So the question for me is where do you think this male thing is now coming from?
00:07:27.980 I just, again, I think it's really attractive to a really dumb population.
00:07:33.520 I think men and especially those who are, you know, men are having less sex than ever.
00:07:38.620 It's really easy to blame women for things that are going wrong right now.
00:07:43.280 It's really easy to point to that and say it's women's fault that the country is going, you know, down the toilet.
00:07:47.880 It's women's fault that I'm not getting laid.
00:07:50.380 It's women's fault that I'm alone.
00:07:52.340 I think it's a lot easier to say that than to, you know, self-analyze what they're not doing right.
00:07:58.920 What, you know, what society as a whole is not doing right.
00:08:02.320 I think there's a lot.
00:08:03.360 We have a loneliness epidemic.
00:08:05.060 We have, you know, a confidence epidemic.
00:08:07.040 I think there's more people than not on both sides, whether it's men or women, who are just fundamentally crawling out of their own skin.
00:08:13.960 They have no sense of identity.
00:08:15.400 It's not really, I think especially my generation, Gen Z, and even millennials, they're defined by narcissism, which I don't think that's accurate at all.
00:08:27.020 And Bo Burnham, if you're familiar with Bo Burnham, he speaks about this.
00:08:30.920 And he puts it really succinctly.
00:08:33.400 He says, you know, it's not narcissism.
00:08:35.880 It's actually this generation is like a hyper consciousness of self.
00:08:39.260 It's the first generation that has social media and this performative nature where it's like you want to perform.
00:08:46.360 This is the market's answer.
00:08:47.760 Now perform everything all the time for everybody.
00:08:50.520 And they're sitting back and watching their life as a satisfied audience member, which is so dystopian.
00:08:55.900 So it's not like this narcissism.
00:08:57.600 It's hyper consciousness of self.
00:08:59.560 And I think we're more insecure than we've ever been, despite, you know, looking in the mirror more than we ever have.
00:09:06.020 I completely agree with you.
00:09:07.900 But just to touch back on what's going on with the right, part of the problem that I have is the left overuse these terms, racism, misogyny, to the point where they don't mean anything.
00:09:20.420 And actually, people who are genuinely racist and misogynistic go, I'm not a misogynist because you said Exa was a misogynist because he said there's two genders, right?
00:09:30.960 But what I see on the right is actual genuine misogyny.
00:09:34.740 Yes.
00:09:35.100 It's a hatred of women.
00:09:36.880 Yes.
00:09:37.300 And I don't know that that's organic.
00:09:39.100 And I may be stepping into my own conspiracies here, but I don't think it's coincidental that right before the election, the right becomes the exact caricature of what the left had said the right was for years.
00:09:53.040 And it wasn't.
00:09:53.840 You know, many of us on the right knew that the right was not, you know, this anti-Semitic, misogynistic group of folks.
00:10:00.160 And now all of a sudden we're seeing this rise in misogynistic rhetoric and people who fundamentally believe that women shouldn't vote and people who fundamentally are siding with the people taking over our universities right now.
00:10:12.340 I think that's, I don't think that's organic.
00:10:15.140 I think there's a lot more foreign bot farms and outside influences boosting these people in the retardo sphere than we have any idea about right now.
00:10:24.340 And it's also-
00:10:25.060 In order to?
00:10:25.760 In order to cause that division, because if you look at it and whether people agree with it or not, what is the top issue going into this election?
00:10:35.640 Immigration and the economy would be the top two.
00:10:38.220 Maybe for a Republican, but, you know, generally it's the issue of abortion and women's rights.
00:10:43.840 That's what the left is focusing on.
00:10:45.540 That's what they're running ads on.
00:10:47.020 Yes.
00:10:47.500 The left is running ads on abortion.
00:10:49.160 They're hyper fixating on this abortion issue, on women's issues.
00:10:53.560 That's what they're hyper fixating on.
00:10:55.160 I don't think Republicans have really put enough emphasis on how much the Karens in Wisconsin or Pennsylvania are running the election.
00:11:04.180 And whether they like it or not, women are voting and they vote more than men.
00:11:08.260 You know, they're doing more activism than men.
00:11:10.200 They do more volunteer hours than men.
00:11:12.300 They are making more of a political difference than men are, whatever men want to say about that on the left or the right.
00:11:18.780 I think it's fundamentally really damaging to allow this rhetoric to go on, this misogynistic rhetoric.
00:11:26.560 I'm just trying to delve deeper into the conspiracy.
00:11:30.060 Are you saying this is kind of deliberate to get Republicans to look dumb and offensive so that they lose the election?
00:11:36.160 I believe so.
00:11:38.360 I really believe that this is boosted by things that are inorganic.
00:11:42.160 Because when you look at the engagement on some of these accounts, it is bananas.
00:11:48.300 I really, at least I don't want to believe that that many people truly hold those sentiments.
00:11:53.940 And maybe I'm just too optimistic about humanity and people.
00:11:57.080 But I don't think it's organic at all, that all of a sudden this is sprouting up and we have this many people who genuinely believe that women shouldn't vote.
00:12:06.140 I don't believe that's the case.
00:12:08.740 Do you not think that every political, every side of the political spectrum has a shadow?
00:12:15.640 The left has a shadow, which is the anti-Semitism in particular.
00:12:19.840 They've got their own version of misogyny, which we've seen.
00:12:23.580 And the right do as well.
00:12:25.300 Now, normally what you see with people who are far to the right, it's a hatred of, you know, LGBT people and also misogyny as well.
00:12:35.480 Do you not think that this is just an unmasking?
00:12:38.320 I think there is some of that.
00:12:39.700 And I think there's some people who really, truly believe that.
00:12:42.560 And I think there's some people who truly believe women are just objects.
00:12:46.560 They really don't have the same value as men.
00:12:49.820 They don't even have an understanding of women's roles throughout history.
00:12:53.420 They don't understand how impactful women have been throughout all of history.
00:12:57.140 Even when they were at home, they were running, you know, our society.
00:13:00.480 They were running the schools.
00:13:01.460 They were raising the children.
00:13:02.560 They were running all of our philanthropic issues and volunteerism and activism.
00:13:07.800 They were really fundamental throughout society.
00:13:10.220 And those people won't recognize that.
00:13:11.580 But I think that's a small subsection.
00:13:13.600 I really do think there's something worse going on.
00:13:16.020 Because why is all of this content allowed on YouTube?
00:13:20.900 YouTube, who has been, you know, at the forefront of we need to protect those who are, you know, a protected class.
00:13:28.060 And women typically fall into that class.
00:13:30.280 So why all of a sudden is all of this content allowed on YouTube that is terribly misogynistic, that's terribly degrading to women, calling women three or four, saying they shouldn't vote?
00:13:40.560 When I don't think that would have been allowed, you know, years ago.
00:13:44.420 I just don't.
00:13:45.360 I think that would have been removed from the platform swiftly.
00:13:48.920 Do you not think, okay, we're just stress testing this idea because it's interesting, right?
00:13:54.660 So one of the obvious challenges that I think some people would put to this is that we've had, and this isn't even a devil's argument thing.
00:14:05.960 It's genuine, I think.
00:14:09.500 It's fair to say that over the last 20, 30 years, you're too young probably to remember the Gillette ads of the 90s.
00:14:18.180 But, like, it was all about men being good.
00:14:21.680 It was all about men looking after their kids.
00:14:23.800 It was all about men going out to work.
00:14:26.680 It was all about, it was all a celebration of healthy masculinity.
00:14:29.620 And then fast forward a few years and Gillette ads are about, you know, you're a piece of shit, buy a razor.
00:14:37.020 That's kind of the message, right?
00:14:38.580 Yes.
00:14:39.020 So, and then men are increasingly struggling in education comparatively to women.
00:14:45.520 And we know, statistically speaking, that women want to be dating men who are on their level or above in all sorts of different ways, including financially.
00:14:54.620 So the opportunities for men to find a mate are reduced.
00:14:59.340 They're being told that they're toxic and this and that and whatever.
00:15:02.720 You put all that together, I'm not condoning it because I don't think it's the right response, being resentful and bitter about that for a man.
00:15:10.940 But do you not think that some of it is just an organic reaction to the demonization of men over a decade or two or however long it's been going on?
00:15:20.680 Yes, I think that's also part of it.
00:15:21.960 I think men have been demonized.
00:15:23.620 I think there's a war on masculinity because it threatens any system of power.
00:15:27.740 But I think when we dive into, okay, well, women are more likely to graduate.
00:15:31.580 Women do better in school now.
00:15:33.080 Those are all facts.
00:15:33.840 I also think women are just better at being oppressed and being, you know, turning into these because who created the school system, the modern school system, as we know it today?
00:15:45.660 That was the Rockefellers.
00:15:46.680 And why did they do that?
00:15:47.960 That was because they wanted better workers, not better thinkers.
00:15:50.620 And women are really good at that.
00:15:52.020 They're like, oh, okay, I'm going to do this.
00:15:53.820 And boys are not.
00:15:54.900 As soon as they're, you know, eight years old, seven years old, they're like, oh, well, you have an attention disorder.
00:16:00.060 We're going to put you on Ritalin or you're going to, you know, you're in trouble in the classroom for not sitting there and being a good little worker bee.
00:16:06.580 And I think women are just taking better to that.
00:16:09.020 I don't know that that has necessarily, you know, I think in.
00:16:13.160 Women are better at following orders.
00:16:14.380 Yes.
00:16:14.980 They are better at following orders.
00:16:17.660 Naturally, I think they are.
00:16:19.240 And so men have, you know, gotten the short end of the stick when it comes to that.
00:16:23.120 And when it comes to those systems that we've created and normalized, I also think there is a fundamental war on masculinity and just gender as a whole.
00:16:32.460 Yeah.
00:16:32.660 I think what that is, is a war on the natural world.
00:16:36.100 I think we're so completely divorced from the natural world now.
00:16:39.780 And gender is kind of like that last stronghold that there's a war, not only against masculinity, but femininity.
00:16:48.240 I've been told and women in my generation have been told that the only way for us to be equal is to become you guys.
00:16:54.260 Right.
00:16:54.580 Is to become men.
00:16:55.480 We have to work like you.
00:16:56.540 We have to fuck like you guys.
00:16:57.880 We have to do this.
00:16:58.700 It's everything that makes us equal is destroying things that made us different.
00:17:03.480 Well, once you start insisting on this idea that equal means the same, then you're going to erode all the natural differences between men and women, which are the source of the strength of human society, not weakness.
00:17:16.620 Men and women being different and complementary is a source of strength.
00:17:20.920 Yes.
00:17:21.340 Right.
00:17:21.600 Once you start to erode all of that, you get to the position we're now in.
00:17:25.280 Yes.
00:17:25.600 And so my question to you coming back to is, do you not think that rather than this being as much of a conspiracy to undermine the rights electoral chances, it's just a more naturally occurring backlash against that demonization of men and the erosion of femininity and masculinity that's been happening?
00:17:43.360 It's probably a combination of both.
00:17:45.340 And I think there's, again, you know, that has us go into algorithms and big tech and what they allow and don't allow.
00:17:53.140 And they're really deliberate with what they allow and don't allow, as we've seen throughout, you know, since really like 2015, they're really deliberate.
00:18:00.380 So maybe some of that conversation started happening organically because of this tension, because, you know, naturally men and women are both feeling more disconnected than ever from their biology and the world and each other.
00:18:13.660 And then it's pushed into the algorithm at these insane rates and then pushed by what I've seen is looks to be inorganic.
00:18:23.600 And that's really important to consider.
00:18:25.680 What is the algorithm feeding people and why is it feeding people that?
00:18:29.940 And big tech has been really pivotal in what's pushed in these algorithms, especially before elections.
00:18:37.700 How much of this as well, do you think it's pornography as well?
00:18:41.320 Like, you know, a lot of the time, if a man is not having sex, the only access that he has to sex is pornography.
00:18:48.300 And let's be fair, pornography, I think we can all agree the image of women is less than optimal in that type of content.
00:18:57.200 Eventually, you start to see women through that lens.
00:18:59.960 Yes, and I think we're underestimating the number of people and men who would rather that digital world and who would rather watch pornography than have a real connection with people.
00:19:15.700 I think especially as we have, you know, these AI girlfriends and this and that, I think you're going to be really surprised at the number of people, the number of men who do have options, who are choosing to have AI girlfriends or live in this digital world, as opposed to this genuine human connection.
00:19:30.400 I think for whatever reason, it's more of a dopamine rush for them to watch porn as opposed to have a genuine connection and actually have to work on themselves.
00:19:39.340 Because guess what?
00:19:40.280 As you know, you're married, right?
00:19:42.600 It is not an easy thing to be married.
00:19:44.340 You actually have to put in the work.
00:19:46.460 And so often we're watching people replace what is good and better for our souls with what's comfortable and more of a dopamine hit.
00:19:55.600 And I think we're seeing that fundamentally.
00:19:57.500 I don't know that there is this return to tradition, return to traditional marriage and, you know, general monogamy.
00:20:05.200 I don't know that that's possible in the digital world because, again, we've replaced what's good for our souls with this dopamine rush, with this digital era.
00:20:14.340 With this everything's at our fingertips as opposed to, I want to be in a marriage.
00:20:18.400 I want to work on myself.
00:20:19.500 I want to understand another human being at the most intimate level possible.
00:20:24.260 No, it's all very fast and swipe and this dopamine hit.
00:20:27.840 I think it's just we've fundamentally changed the way the sexes interact with the digital age.
00:20:33.420 And I think pornography is a piece of that.
00:20:35.480 But they're also making that choice to do that.
00:20:37.940 This is just more attractive to them.
00:20:39.520 Broadway's smash hit, The Neil Diamond Musical, A Beautiful Noise, is coming to Toronto.
00:20:46.340 The true story of a kid from Brooklyn destined for something more, featuring all the songs you love, including America, Forever in Blue Jeans, and Sweet Caroline.
00:20:55.620 Like Jersey Boys and Beautiful, the next musical mega hit is here, The Neil Diamond Musical, A Beautiful Noise.
00:21:02.320 Now through June 7, 2026 at the Princess of Wales Theatre.
00:21:06.340 Get tickets at mirvish.com.
00:21:10.040 It's also as well, if you think about it, what the internet does in many ways is encourage isolation.
00:21:15.580 We've been fed this narrative that, oh, social media connects people.
00:21:19.460 It doesn't connect people.
00:21:20.500 It connects them in certain ways, of course.
00:21:22.500 But actually, the ultimate form of connection is people sitting down and having a conversation,
00:21:26.880 which is not what the internet is actually about.
00:21:29.680 And I know from being a man, like I'm writing a book at the moment, if I'm isolated writing this book for certain periods of time,
00:21:37.960 I come out of my room really fucking angry.
00:21:40.860 And that anger has to go somewhere.
00:21:43.320 And if you haven't done the work...
00:21:44.820 We need to get you a new girlfriend, right?
00:21:47.660 We need to get him an AI girlfriend.
00:21:49.780 Yeah, exactly.
00:21:51.520 But if you're angry and you're young and maybe you don't understand where it comes from,
00:21:59.660 that has to be directed somewhere.
00:22:02.060 And what it's going to be doing is funneled wherever you spend most of your time.
00:22:06.040 And if most of your time is spent online, that anger is going to be funneled right back there.
00:22:10.460 Yes, absolutely.
00:22:11.520 I would agree 100%.
00:22:13.240 But that's what we have to understand, especially the older generations have to understand.
00:22:17.560 That is the new normal.
00:22:19.040 We are the older generations, by the way, mate, just saying that.
00:22:21.540 We're young.
00:22:22.260 But especially the generation above you and the boomers.
00:22:24.740 And there's a lot of fist shaking at the cloud like that Simpsons meme.
00:22:28.460 You know, old man yells at cloud.
00:22:30.900 And that's very much, that's characterizing a lot of, you know,
00:22:35.340 especially our policies around technology.
00:22:37.620 There's not really an honest discussion about what the hell is happening in this new world that we're leaving our kids.
00:22:43.940 You know, even 16-year-olds now, they are not getting driver's licenses.
00:22:48.140 At an alarming rate.
00:22:49.640 They don't even want to go meet their friends.
00:22:51.320 They don't need to.
00:22:52.260 They're doing it all online.
00:22:53.960 And we're seeing this for, you know, exactly.
00:22:56.160 That's the eye roll.
00:22:57.160 That's the world they're in.
00:22:59.640 And so how do we navigate that?
00:23:01.720 You know, you used to be a teacher, correct?
00:23:03.380 Right.
00:23:03.860 That is my brand.
00:23:04.740 And now whenever I...
00:23:06.380 That's the first episode in which someone else has brought that up.
00:23:08.960 Yeah, exactly.
00:23:09.620 So whenever I talk to school teachers now, especially ones that may be a bit older,
00:23:15.180 they always say, you know, the kids didn't used to be this way.
00:23:17.700 They didn't used to have so much anxiety and depression.
00:23:21.360 And it's like, of course they do.
00:23:23.180 They're living in a whole new world.
00:23:25.620 And they have not a damn person to help them figure out how to navigate it.
00:23:30.180 Instead, we're like, well, you should get off that phone.
00:23:32.600 What?
00:23:33.080 You didn't have that when you were a kid.
00:23:34.540 You'd probably be on it too if you were their age.
00:23:37.000 It is this infinite, awesome world where anything is possible
00:23:41.920 and you have access to everything all the time.
00:23:44.760 Of course, they're going to be on it.
00:23:46.480 So how do we navigate that?
00:23:48.060 And I don't think there's enough valuable conversations around that.
00:23:51.520 We're just like, well, we shouldn't do this.
00:23:53.440 Probably not, but we are.
00:23:55.480 And it's also, to go back to the porn thing,
00:23:57.820 if you're a kid watching hardcore pornography,
00:24:00.400 you think you don't understand that what you're being presented with
00:24:03.840 is fantasy.
00:24:05.020 Yes.
00:24:05.480 You don't understand that what this is,
00:24:07.320 is entertainment for want of a better term.
00:24:09.740 So you think that this is how men and women interact.
00:24:12.520 So then when a woman does not present herself
00:24:15.880 or doesn't interact with you in the way that a porn star does,
00:24:19.520 you feel cheated and angry, which is why you then lash out.
00:24:23.920 Yes, they do.
00:24:24.980 I mean, they want this woman who's like pregnant and barefoot all the time,
00:24:27.420 but also, you know, a porn star in the sheets.
00:24:29.860 It's this really terribly idealistic version of women.
00:24:34.660 But, you know, men have that too.
00:24:36.900 Men have that.
00:24:37.660 Women have that for men.
00:24:39.100 And I think everyone just needs to lower their standards.
00:24:43.240 Well, there's the title of the episode.
00:24:45.220 Lower your standards.
00:24:47.480 Ashley St. Clair, lower your standards.
00:24:49.380 That will be your catchphrase.
00:24:50.360 You're not that great.
00:24:51.600 Thanks for looking at me while she's saying that.
00:24:53.580 I appreciate that.
00:24:54.720 You're not that great.
00:24:55.660 Thanks, love.
00:24:56.080 I'm happily married, so you're the only one with that.
00:24:58.620 And she settled for you.
00:25:00.040 Exactly, exactly.
00:25:02.040 Find the one that will lower herself to your level.
00:25:04.900 Yes, lower your standards.
00:25:06.620 You're not that great.
00:25:08.080 But that's interesting.
00:25:09.400 So what are the unrealistic standards that women have for men?
00:25:12.060 Because we've spoken about the unrealistic standards that women have for women.
00:25:17.980 What's the inverse of that?
00:25:19.000 Men have for women.
00:25:19.340 Men have for women.
00:25:20.000 That's the one.
00:25:20.460 I think the inverse is that, especially for women, there's this expectation that men are
00:25:26.120 kind of going to be like your emotional support dog.
00:25:28.720 And they've kind of, you know, put their girlfriends to the side.
00:25:31.780 And what we used to get from community and women just being in the village and talking all
00:25:35.660 the time, we now need that from men.
00:25:37.580 We need you to be our confidant, our gossip, you know, our girlfriend in a way that lives
00:25:43.820 with us all the time.
00:25:44.880 It's like we have this expectation that men are supposed to be like our emotional support
00:25:49.660 teddy bear as opposed to what they are.
00:25:52.240 And that's, you know, our providers, our soulmates or, you know, whatever it is, and not necessarily
00:25:57.720 on top of each other like we've come to be.
00:26:00.040 I think this cohabitation and this connection to our partner all the time that's really perpetuated
00:26:06.700 by women and some really jealous men.
00:26:08.700 But women really like being connected to their partner all the time, bothering them at work,
00:26:13.000 sending them a million texts.
00:26:15.080 And I think that's probably not healthy.
00:26:18.080 And I think women probably need to examine the behaviors that they have that are maybe
00:26:22.720 a bit clingy or not innate to men's ability.
00:26:27.380 Men don't want to be on top of you like that unless they're controlling or jealous.
00:26:31.560 It's so interesting you say that.
00:26:32.860 My wife and I were just having this conversation because we were on holiday together for a week
00:26:37.300 after I'd been away for a bunch of time.
00:26:39.400 And we really were reconnecting and talking a lot.
00:26:41.480 And at one point I was like, OK, I'm really enjoying this, but I need to go and sit in
00:26:45.820 the bar for an hour.
00:26:46.960 And when I came back, we talked some more and she was like, yeah, I really need to connect
00:26:50.880 with more of my girlfriends because if I'm trying to get all my this particular need
00:26:57.560 satisfied by just you, it's too much for both of us.
00:27:01.840 And it's not... actually, that's the best way to put it.
00:27:04.340 They try to have all of their needs met by one person.
00:27:06.660 And then they're leaving the person that gives them 80 percent for someone that gives them
00:27:10.820 the other 20 percent.
00:27:12.140 And, you know, it's this constant thing.
00:27:13.400 You're never going to get it from everybody.
00:27:15.240 As long as they're not, you know, abusive or controlling, you know, there's a lot that
00:27:19.320 you can look past and just settle for.
00:27:22.540 But, you know, men used to go out and hunt for weeks at a time and we'd be by ourselves
00:27:27.160 or in the village or with our girlfriends.
00:27:29.360 And now it's all the time.
00:27:30.860 It's we're just around each other all the time.
00:27:33.340 And I think it's really rather toxic.
00:27:35.820 It's funny you say that, because whenever I'm going away on a trip or whatever, I say
00:27:39.920 to my wife, like I'm out hunting.
00:27:41.940 I literally say that to make myself feel better.
00:27:45.420 But I was going to ask you, I mean, one of the things that I think you're very good on
00:27:49.660 is tech.
00:27:50.940 And we've alluded to it earlier.
00:27:52.720 When I was in Australia, I did a few talks.
00:27:54.680 And in the Q&A, someone would always ask me if I see any positive signs.
00:27:58.280 And I would always mention that Elon Musk taking over Twitter really changed everything.
00:28:02.980 And this is an audience of people who I wouldn't necessarily have thought of as like these young,
00:28:08.240 hip, tech-aware people necessarily.
00:28:11.220 He would always get a round of applause from people because it's genuinely something that
00:28:16.560 I just think a lot of people genuinely felt this was a big step in the right direction.
00:28:21.900 The single, this cabal, and I use that term very advisedly.
00:28:26.720 I'm not someone who thinks the world's run by a cabal of pedophiles in a pizza parlor or
00:28:30.320 whatever the story is.
00:28:31.660 But you saw the collusion between big government and big tech, particularly during COVID.
00:28:37.720 And it was scary because you go, well, they're all sitting down in a room together going,
00:28:42.540 what's the right censorship policy on this?
00:28:45.180 All getting together.
00:28:46.260 And no one's thinking independently.
00:28:47.880 And no one's thinking from first principles.
00:28:50.020 So what you've had, I certainly would argue, I don't know if you agree with this, is one
00:28:56.080 guy with just too much money to give a shit, come in and just upend the whole system.
00:29:01.280 I've been very excited by that.
00:29:02.760 Not to say that Elon's perfect because he's definitely not because he's human, blah, blah,
00:29:06.220 blah, blah, blah.
00:29:06.740 What do you think the impact of that particular thing is?
00:29:09.760 I don't think people realize how historic it is for him not only to spend $44 billion for
00:29:16.500 free speech, but to have someone who goes and tells Disney and their mountains of money,
00:29:23.380 fuck off.
00:29:24.240 I don't need you.
00:29:25.140 I think that's monumental to encourage courage and bravery.
00:29:30.240 I think we don't have enough people who are brave enough to give a middle finger to the
00:29:33.780 system, especially when starting in like 2015.
00:29:37.940 And Mike Benz, if you're familiar with him, he speaks about this a lot, this essentially
00:29:42.100 internet death star that they built within big tech and all of these companies.
00:29:47.080 And I think everybody gives you the round of applause because everyone's online.
00:29:51.520 We're essentially already cyborgs.
00:29:53.620 We are, you know, like transhumanism is already here.
00:29:56.760 You said the longest you went without your phone was two hours lately.
00:29:59.840 I'm very proud of that.
00:30:01.480 It's like when I sleep.
00:30:02.740 And even then it's like by my head in case anyone needs anything.
00:30:05.760 And so we're already there.
00:30:07.280 And so what Elon did was historic.
00:30:10.460 I think that's something we're going to look back in the history books as historic, but
00:30:13.360 we need more of that.
00:30:14.980 There is such an empty space, especially with people who are even just right of center or
00:30:20.740 hard Republicans in the AI and tech industry, because this is fundamentally going to change
00:30:25.820 our entire world.
00:30:27.120 The value of free speech is not just really important for the constitutional value anymore.
00:30:31.700 It is arguably more important because for the AI aspect, for the technological aspect, because
00:30:38.700 if we're feeding the system a dishonest and filtered and biased version of humanity, you
00:30:44.120 know what's going to happen?
00:30:45.120 It's going to spit out recommendations based on a dishonest and filtered version of humanity.
00:30:49.180 And I've been saying this for ages because if you think about it, the way that the AI is
00:30:53.960 learning is from online.
00:30:55.940 Yes.
00:30:56.240 And we all know that what happens online is not reflective of like going and talking to
00:31:00.600 people on the street.
00:31:01.600 Correct.
00:31:02.220 And they know this.
00:31:03.600 Right.
00:31:03.960 So we're feeding it a false picture of reality.
00:31:06.360 And part of the reason is people can't say what they actually think.
00:31:09.820 And that's very deliberate.
00:31:10.980 And so what was so, and I think this is so illustrative of the right and just how much
00:31:17.040 of a retardosphere they are.
00:31:18.980 There was a clip of Kamala Harris, and I don't know if you remember this, but months ago or
00:31:23.100 last year, and she's talking about AI.
00:31:25.500 She was named the AI czar.
00:31:27.340 And again, you know, it's like humiliation porn for them to make fun of people.
00:31:30.900 So this clip of Kamala comes out and she's like, AI is a fancy word.
00:31:35.080 It means artificial intelligence.
00:31:36.840 And it gets millions of likes and millions of Republicans and people on the right are
00:31:40.340 making fun of her for saying this.
00:31:41.760 She doesn't even know what it means.
00:31:43.400 If you watch the full clip, she says.
00:31:45.460 Ultimately, what it is, is it's about machine learning.
00:31:49.360 And so the machine is taught.
00:31:53.400 And part of the issue here is what information is going into the machine.
00:31:58.760 That will then determine, and we can predict then, if we think about what information is
00:32:06.500 going in, what then will be produced in terms of decisions and opinions that may be made
00:32:13.400 through that process.
00:32:14.980 What you teach the machine then impacts the decisions and opinions.
00:32:20.000 And that, to me, is the scariest thing that possibly could have come out of her mouth, is
00:32:25.020 that this terribly biased administration is saying, we're going to take a focus on this
00:32:29.900 machine that impacts decisions and opinions of and for humanity.
00:32:35.060 And the fact that Republicans aren't making that their focus, and we know more about Charlie
00:32:40.360 Kirk's opinion on women working in their 20s, is a disaster.
00:32:44.700 We are going to look back and they're going to shake their fist and say, the left just control
00:32:49.040 all of this, while Sam Altman, who most of them can't name, most of them could show you
00:32:54.100 47 photos of Hunter Biden's corn dog before they tell you who Sam Altman is.
00:32:59.700 Sam Altman is the guy asking for $7 trillion and giggling online about asking $7 trillion for
00:33:05.980 these AI projects, for shaping all of humanity and the recommendations for this system.
00:33:13.780 And he's in Congress, walking in there every day, talking about the regulations for AI and
00:33:19.680 trying to kick the ladder out from under him.
00:33:21.840 And the fact that we don't understand how much of a focus the left is putting on AI
00:33:26.880 is really terrifying.
00:33:29.240 That's what they're talking about at the World Economic Forum.
00:33:31.580 We can make fun of the World Economic Forum all day.
00:33:33.740 Do they have any idea what they're talking about there?
00:33:36.040 They're talking about AI.
00:33:37.440 They're talking about AI being a human right.
00:33:39.720 This is verbatim.
00:33:40.740 They're talking about the democratization of the access to AI.
00:33:44.480 And they are talking about UN special envoys for AI.
00:33:48.620 And the right has no discussion about this, no presence in the industry at all.
00:33:53.200 And that is more dangerous than anything going on, in my opinion.
00:33:57.620 Do you think part of it as well is that people who are older tend to skew Republican and conservative?
00:34:03.240 And people who are older know less about this stuff.
00:34:06.740 They shake their fist and the old man yells at Cloud.
00:34:09.780 That's exactly right.
00:34:10.860 Because when you have these old boomers in Congress and Senate that are asking the TikTok CEO,
00:34:17.720 if TikTok uses Wi-Fi, I think you should be removed from office.
00:34:22.420 That actually happened.
00:34:24.160 And we can laugh.
00:34:25.260 And we're laughing as it all crumbles.
00:34:27.040 And they grab these institutions by the balls.
00:34:30.660 Because this is what happened.
00:34:33.320 These people should be removed from office.
00:34:35.020 I think if you can't name the top five people in the race to AI, you should be removed from office.
00:34:41.080 Mark Zuckerberg is a supervillain in my eyes, not just because of the censorship, but because the potential for this data and this technology that he has,
00:34:51.900 has the power to absolutely destroy the world, his AI capabilities.
00:34:56.400 And when you look at the technology he has, and when you look at the data he has, and meanwhile, he is in Hawaii raising wagyu beef and picking macadamia nuts.
00:35:06.420 If I had that much data and power and the potential to destroy the world and I'm picking macadamia nuts, that's a supervillain in my eyes.
00:35:15.500 And they just don't care.
00:35:16.960 There are more concerns that, you know, this thing about the COVID jab was censored on Facebook, which is a big deal.
00:35:23.520 But there's a bigger problem.
00:35:25.200 And that this guy is in, and these other people like Bezos and Zuckerberg and Sam Altman,
00:35:30.360 they are building the system that makes recommendations for all of humanity and filters everything.
00:35:36.380 And they know not a damn thing about them.
00:35:39.320 So what would you like to see happen on that?
00:35:41.920 I think we need to, so also at the World Economic Forum, one of the people on stage at one of these panels discussing AI was Governor Jay Inslee.
00:35:52.180 Do you know who that is?
00:35:52.980 Yes.
00:35:53.440 No, I don't.
00:35:54.020 He is, I don't know how much you know about the state of Washington here, but you can't really get further left than the state of Washington and Oregon.
00:36:01.520 This is Antifa leftist central.
00:36:04.920 And Jay Inslee, who's the governor of Washington, is on the stage talking about the University of Washington partnering with Microsoft,
00:36:11.700 Microsoft, and a Chinese university to go into machine learning and all of this stuff.
00:36:17.100 He's also talking about the apprenticeships that they're guaranteeing for coders.
00:36:21.960 The left is doing this, and the right should absolutely be doing this.
00:36:25.060 They should be guaranteeing apprenticeships for coders.
00:36:27.540 They should be guaranteeing that we're in these industries, that we're in AI, that we're in gaming,
00:36:31.580 that we're in all of these tech industries that are absent anybody outside of the Bay Area in Silicon Valley.
00:36:38.480 Like, why are they not doing that?
00:36:40.960 Well, part of the reason, psychologically speaking, is that people on the right tend to have a different psychological profile.
00:36:47.760 They tend to be pretty good at, like, running an office, not so good at being creative.
00:36:53.060 This is generalization, obviously, but that's kind of...
00:36:55.880 So, the reason a lot of professions that involve creativity don't attract as many people on the right is people on the right generally generalization,
00:37:04.420 but are not as creative in that way.
00:37:06.380 Well, so the people running the AI industry, they don't necessarily have to be creative.
00:37:10.760 They can be very logical.
00:37:13.260 They can be very math-centric, whatever it is.
00:37:16.220 It doesn't need creatives at all.
00:37:19.680 You should have some, but I also reject that idea that there's people...
00:37:24.660 We also need to encourage people to be more creative on the right.
00:37:27.340 I think one of the worst things we did was mock the liberal arts and say,
00:37:31.180 well, it's so stupid to get a liberal arts degree.
00:37:33.740 It's so stupid to care about those things.
00:37:35.540 Well, guess who runs it now?
00:37:38.220 Not a damn Republican.
00:37:39.800 And the same thing with AI.
00:37:41.340 We mock it and shake the fist and whatever it is.
00:37:44.360 I just guess what I'm saying is people like you and like us who are different and funny and like...
00:37:50.180 Go on.
00:37:52.320 But look at us like none of us is...
00:37:55.340 I don't know about you, but most people who I find who are like you
00:38:00.540 will tend to find a lot of things that they disagree with on their own side.
00:38:04.920 Whereas what the tribe wants is a good little worker bee who's going to, you know,
00:38:10.160 hold the right flag and say the right things and vote the right way and not...
00:38:14.080 Do you see what I mean?
00:38:14.760 And pop out a few kids.
00:38:16.200 Yes, but fundamentally, it's still...
00:38:19.240 No matter what, I disagree with.
00:38:21.360 Yes, I think it's ridiculous that they're talking about these andretatisms
00:38:24.760 and that's become like a pillar in our attention economy on the right.
00:38:29.660 Fundamentally, we all agree on the same things, right?
00:38:31.720 Mass migration is a really big problem.
00:38:34.220 Free speech is a core fundamental value.
00:38:36.480 We haven't really diverted on those core fundamental values.
00:38:39.600 I think all of this is just noise.
00:38:41.840 Well, if we focus...
00:38:42.700 Sorry to interrupt.
00:38:43.220 But if we focused on those things that you'd find the overwhelming majority of the country
00:38:47.960 actually agreed on...
00:38:49.100 Yes, yes.
00:38:49.840 If we focused on those things.
00:38:51.020 That's a bigger threat.
00:38:52.120 Mass migration is a bigger threat.
00:38:53.360 I mean, historically, throughout history, again, you want to go back to the Roman Empire
00:38:56.580 and the mass migration of the Huns and the Germanic tribes.
00:38:59.400 I mean, it's going to lead to what I believe is the fall of the American Empire is this mass
00:39:04.300 migration and the fall of the West.
00:39:06.180 Yeah, I also see as well, like the right, and you've touched on it, but I sometimes see the stuff the right talks about and go,
00:39:14.860 why are you banging on about this all the time?
00:39:18.200 All the time.
00:39:19.020 Look, the trans thing is important.
00:39:20.820 And of course, protecting kids.
00:39:22.260 Yeah, we've had lots of people on to talk about it.
00:39:25.120 But they constantly talk about it like it's the only thing to talk about.
00:39:31.160 And you go, it is important, but it's not the most important.
00:39:35.260 There are other things that we need to discuss.
00:39:37.360 But they have their very set topics and they kind of don't deviate away from that.
00:39:42.620 And you want to say to them, do you not think that there is other things here to discuss?
00:39:47.300 But they can't seem to do that because it's like they've applied these blinkers and like,
00:39:52.400 no, we are just going to talk about this.
00:39:54.760 Because it's the attention economy that gets eyeballs, that gets this.
00:39:58.820 And we need we need like a Dave Ramsey for the attention economy.
00:40:02.460 You know, like stop spending.
00:40:03.760 You need to budget your attention.
00:40:04.960 You need to have it here or there or whatever it is.
00:40:07.580 And when you turn on Fox News and there was one time I was on Fox and they had me talking
00:40:11.820 about Dolly Parton.
00:40:13.180 I'm like, why are we talking about Dolly Parton right now?
00:40:16.040 We're talking about Dolly Parton's outfit like this is E or this is ridiculous and absurd.
00:40:21.220 But we keep giving these people our attention.
00:40:23.360 We keep giving these people our money, no matter how many times we see what they're doing
00:40:27.540 is fundamentally horrible and bad.
00:40:30.180 Because at the end of the day, as much as we want to say we're principled and we believe
00:40:33.640 these things, I think we're actually lazy and we'll sacrifice all of our morality and
00:40:40.000 our morals for comfort.
00:40:41.760 Absolutely.
00:40:42.560 It's like when you had all of these Republicans who were cheering on the Bud Light boycott
00:40:46.620 as they're having Amazon packages arrive at their door.
00:40:49.460 And it's like, OK, so you boycotted Bud Light because of Dylan Mulvaney.
00:40:53.460 Well, you're giving Jeff Bezos money.
00:40:55.120 This guy who, like, fundamentally is ideologically different to everything that you're doing,
00:41:01.140 who has a monopoly on government contracts, who has a monopoly on web services and AI and
00:41:05.820 owns the Washington Post.
00:41:07.020 You're going to give your money to him day in and day out.
00:41:09.740 All of these people have Amazon packages coming to their door as they're boycotting Bud Light.
00:41:14.560 Because I think that's that is a really big ill of capitalism is the consumerism that follows
00:41:19.900 and the comfort that follows.
00:41:21.240 We've we've essentially put a lot of people in, like, consumerist hospice, like they're
00:41:26.440 paying just to be comfortable.
00:41:28.360 And the the only place typically that you would see that is hospice, like you'll pay to
00:41:32.740 be comfortable until you die.
00:41:34.220 And that's that's kind of what people have chosen willingly just to be comfortable.
00:41:38.680 And they're sacrificing their life and their morality and the life of our children and
00:41:42.840 the future of humanity for this comfort and consumerism that I think is just it's going
00:41:48.260 to be our downfall, I think.
00:41:49.620 Yeah, that craving for comfort is the enemy of growth.
00:41:54.280 Yes.
00:41:54.900 Personally and spiritually.
00:41:56.460 Yes.
00:41:57.140 And it disappoints me that the right have gone this way because the right traditionally have
00:42:04.180 always been about personal development.
00:42:06.360 They've always been about taking responsibility for your actions, you know, self-fulfillment,
00:42:12.560 all of these things.
00:42:13.620 And you look at kind of the way they're going now and you're going, you're just a different
00:42:19.940 form of moron to the left.
00:42:22.380 Yes.
00:42:23.040 That's all.
00:42:23.620 That's all it seems to be.
00:42:24.860 Not all of them, of course.
00:42:26.300 No, it's a it's a fringe.
00:42:27.440 That's why I call them the woke right.
00:42:28.680 It's a fringe on the left and the right.
00:42:30.440 But what they are is they're becoming disproportionately more powerful.
00:42:34.760 Well, louder online, I think.
00:42:37.180 No, I think it's actually just becoming really it's becoming harder to criticize them because
00:42:42.200 they've conflated criticism with them to criticisms of like principles and ideas.
00:42:46.680 So if you criticized, you know, someone on the right in the petite bourgeoisie for the
00:42:51.620 way they spend their money or this or that, well, now you're an enemy of capitalism.
00:42:55.280 Well, you know what?
00:42:56.020 I get to spend my money however I want.
00:42:57.680 And it's like, no, you're actually like a greedy son of a bitch that we should probably
00:43:01.520 shame into oblivion.
00:43:02.920 But they've they've conflated criticisms of themselves with criticisms of a movement
00:43:07.500 or principles that's just not true.
00:43:10.020 So I think it's just become a lot harder to criticize those who really need to be removed.
00:43:16.680 It's so funny you say that.
00:43:17.980 I've told the story of Francis Huda a few times because it just amused me so much.
00:43:21.320 So, you know how I wrote an article about Tucker that was I praise him for some things,
00:43:27.280 but I criticize some of the things that he did in Russia on a factual basis.
00:43:31.380 Like I just said they weren't true.
00:43:33.120 I was uninvited from a party that I'd been invited to for that.
00:43:37.160 And I was just like, you guys don't see like you spent six years talking about the importance
00:43:41.860 of diversity of opinion.
00:43:43.740 How dare you?
00:43:44.440 How dare you?
00:43:45.960 He's our God.
00:43:47.060 He's our, you know, he stuck it to Fox News, right?
00:43:50.780 Like that's no, you're not allowed to do that.
00:43:53.840 Let's talk about one issue that I think Tucker would agree with us on, which it would be
00:43:58.160 an issue you just talked about, which is illegal immigration.
00:44:00.760 Broadway's smash hit, The Neil Diamond Musical, A Beautiful Noise, is coming to Toronto.
00:44:07.680 The true story of a kid from Brooklyn destined for something more, featuring all the songs
00:44:12.460 you love, including America, Forever in Blue Jeans, and Sweet Caroline.
00:44:17.220 Like Jersey Boys and Beautiful, the next musical mega hit is here.
00:44:21.280 The Neil Diamond Musical, A Beautiful Noise.
00:44:23.940 April 28th through June 7th, 2026, The Princess of Wales Theatre.
00:44:29.020 Get tickets at mirvish.com.
00:44:30.860 Because mass immigration is a somewhat separate conversation for me.
00:44:37.080 Like, I think Elon is right that having lots of talented and driven people come into a country,
00:44:41.920 particularly like the U.S. where there is space and there is land and there's the ability
00:44:45.640 to build infrastructure quickly if you put your mind to it, et cetera, et cetera.
00:44:50.140 Separate conversation.
00:44:51.020 But illegal immigration, which is something you've been focusing on a lot, I mean, it's
00:44:56.520 insane.
00:44:57.280 It's out of control.
00:44:58.500 And I don't think most people understand the scale of it.
00:45:01.100 For example, I think it was like 12 million immigrants who came through Ellis Island
00:45:05.240 over the 60 years that Ellis Island was operating.
00:45:09.560 And just under the Biden administration, they're estimating between 7 and 10 million illegal immigrants
00:45:14.960 have come in.
00:45:15.660 So we have more people, almost more people coming here illegally just under the Biden administration
00:45:20.240 than the entirety of Ellis Island.
00:45:22.300 And, you know, just from an infrastructure standpoint, we can't sustain that.
00:45:27.340 New York City alone, we're going to spend $12 billion, with a B, $12 billion into fiscal
00:45:32.720 year 2025 dealing with this migrant crisis.
00:45:36.160 And most people, they don't understand just the volume of people coming in and where they're
00:45:43.140 coming from.
00:45:43.720 There's this misconception that they're all coming from Mexico or wherever.
00:45:47.400 And it's not.
00:45:48.240 Our southern border is an open border to the world.
00:45:51.600 We've seen a 900% increase in Chinese nationals.
00:45:54.080 We're seeing immigrants from Africa, from the Middle East, from all over.
00:45:57.260 They're flying down to these South American countries and being aided by a lot of American
00:46:01.540 NGOs to come here.
00:46:04.340 But again, it's not just unique to America.
00:46:07.160 They had this playbook in Europe.
00:46:08.700 It happened in Europe.
00:46:09.860 Every time I go to Europe and I talk to people, like in Italy, it's the same problem.
00:46:13.700 It's these NGOs that are funding this mass immigration of illegals into the countries and the NGOs
00:46:20.640 are helping them.
00:46:21.360 And they're all doing it under the claim of asylum.
00:46:23.880 But see, the NGOs are going to...
00:46:26.120 There's a bunch of dumb people who are really motivated by compassion and blah, blah, blah.
00:46:30.720 And by the way, some people are refugees and do need help.
00:46:33.820 The problem is the government's supposed to be in charge of this.
00:46:38.420 But I think that's dangerous too.
00:46:40.180 Tell me why.
00:46:40.540 I have to stop you on...
00:46:41.960 Well, you know, there are some people who need help.
00:46:44.200 Well, where do you draw that line of what help we should give?
00:46:47.720 Because what the left will do is they'll say, okay, well, that comes from a point of privilege.
00:46:52.040 Basically, the whole world needs help if you compare it to America.
00:46:55.340 The whole world is hungry.
00:46:56.700 The whole world is this.
00:46:57.420 No, I'm not talking about economic migration.
00:46:59.300 I'm talking about, generally speaking, when people are fleeing persecution, war.
00:47:04.060 But what does that mean?
00:47:05.800 Persecution.
00:47:06.440 It means when someone is trying to kill them.
00:47:08.540 Yes.
00:47:09.140 But what does that mean?
00:47:10.300 Because they've blurred that line, which is why I want to get an exact definition.
00:47:13.960 Fine.
00:47:14.260 Well, what I'm saying is, for example, people fleeing Hong Kong because they're pro-democracy protesters.
00:47:19.080 People fleeing Ukraine because they're fleeing war, right?
00:47:21.700 I don't have a problem with people who are aligned to our culture, who don't hate our societies,
00:47:26.860 who are going to come and integrate, learn the language, in limited numbers that we voted
00:47:30.760 to have, right?
00:47:32.800 There's a democratic mandate.
00:47:34.140 There's a lot of history in accepting refugees in those small numbers.
00:47:37.540 A lot of the Jewish community here in New York are a product of that sort of system, right?
00:47:42.520 So I think a small number of people who are carefully selected because they genuinely are,
00:47:49.940 A, fearful for their lives, and B, beneficial to the societies that they're going to come to,
00:47:54.280 I don't have a problem with that.
00:47:55.240 What I have a problem with is you open the border and you don't check who's coming.
00:47:59.260 As you say, most of those people are not fleeing genuine persecution at all.
00:48:04.240 And we don't know who they are.
00:48:05.660 And as you say, in Europe, we have exactly the same problem in the UK, in France, in Italy,
00:48:11.400 and other places.
00:48:14.200 But anyway, we got sidetracked because you were saying what I said is dangerous, which
00:48:18.380 I don't think is true.
00:48:19.760 I think eroding that boundary is dangerous.
00:48:22.700 I think allowing a small number of people in because they're genuinely in fear of their lives.
00:48:27.520 Well, we have to be really precise and say, in fear for their lives.
00:48:30.500 Or, you know, they're fleeing violence or whatever it is.
00:48:33.140 Yeah, they didn't convert to Christianity two days ago suddenly.
00:48:35.800 Because when we say they need help, we want to help people, that is really dangerous and
00:48:39.880 has been weaponized by the left, this help.
00:48:42.320 We need to help people.
00:48:43.580 Yeah.
00:48:43.720 What does that mean?
00:48:44.880 What people do we help and where do we draw that line?
00:48:47.800 What I'm saying is I don't think that line is at zero.
00:48:50.200 I think there are some people that we might choose democratically to say, you know, we
00:48:55.660 as a rich country will happily welcome, I don't know, 30,000 Christians fleeing persecution
00:49:00.860 in the Middle East a year.
00:49:02.580 Well, I would agree.
00:49:03.740 And again, I'm also not, I'm pro-legal immigration as well.
00:49:06.900 I've always said this for years.
00:49:08.940 But again, the issue is this asylum.
00:49:11.760 We're not vetting them.
00:49:12.620 We don't actually care if there are genuine claims for asylum.
00:49:16.600 And because there's international treaties for these refugees and whatever, it's really
00:49:20.200 hard to stop it.
00:49:21.280 Right.
00:49:21.560 So that was my point, right?
00:49:22.960 NGOs are going to NGO.
00:49:25.060 It's the government's job to set the rules and then to enforce the rules.
00:49:29.460 Why do you think that isn't happening?
00:49:30.640 But who funds the NGOs?
00:49:32.280 You tell me.
00:49:33.260 The government.
00:49:34.060 I mean, most of these NGOs are heavily funded by the United States government as well.
00:49:38.080 Right.
00:49:38.260 Or, you know, international agencies, international government agencies.
00:49:41.640 So I don't even know why most of them are called NGOs because they're really, they don't
00:49:46.500 really exist.
00:49:47.160 They're GOs.
00:49:47.640 Yes, they're GOs.
00:49:48.940 Because they don't exist without the funding of the government.
00:49:52.460 And it's become, especially NGOs and nonprofits have really become a vector for the government
00:49:56.760 to do unspeakable and horrible things with a lack of transparency.
00:50:00.440 Because guess what?
00:50:01.940 I get to hide so much with a 501c3 or an NGO.
00:50:05.720 That is the best way to hide whatever the hell I'm doing.
00:50:08.840 And they know this.
00:50:09.960 And so they do this.
00:50:10.720 And we don't really have some like nationwide audit of, you know, hey, are you actually
00:50:14.880 charitable?
00:50:15.440 Hey, where is all your money going?
00:50:17.020 Why do we allow people who are at charitable 501c3s to be paid $2 million a year?
00:50:22.820 That's fundamentally insane when you think of the purpose of a 501c3 or charity or anything
00:50:28.880 like that.
00:50:29.860 So I think the NGOs are a really big problem because it's essentially money washing.
00:50:34.820 They're money laundering a lot through these NGOs.
00:50:37.220 There is so much money in NGOs.
00:50:39.960 That's why every billionaire, you know, has a 501c3 or a nonprofit or whatever it is.
00:50:45.840 There's so much money washing and laundering that goes through, goes on through these NGOs.
00:50:50.240 Yeah. I would also say there's another point, which is that I think America does have
00:50:56.380 a moral obligation to take in certain types of refugees.
00:50:58.920 For instance, let's look at Afghanistan.
00:51:00.720 There are a lot of people in Afghanistan who actually risked their lives to help America
00:51:06.980 because they wanted to see a positive change in Afghanistan.
00:51:11.260 They wanted to see the Taliban eradicated and taken completely away from power.
00:51:18.060 However, America withdrew.
00:51:20.100 Some of these people were actually murdered by the Taliban, because they saw them
00:51:25.760 as traitors.
00:51:26.960 And I really think America should have done more to help those people.
00:51:32.020 Yes, I agree.
00:51:33.640 However, when you think from a technical standpoint, right, you're like, well, we should
00:51:36.660 let in all these people who were actually really good to America and they actually believe
00:51:40.240 these things.
00:51:40.720 Well, how do you vet that?
00:51:43.700 Well, if they were translators for the U.S.
00:51:46.080 forces.
00:51:46.600 Technically, what do you do?
00:51:47.700 At what scale do you have to increase our legal immigration facilities and bureaucracy and
00:51:53.060 whatever?
00:51:53.820 And how do they vet that?
00:51:55.000 Is it because someone told a good story?
00:51:57.040 Do we need an investigator to go down and see if they're actually telling the truth?
00:52:00.180 No, but if you were a translator that was attached to U.S.
00:52:02.860 forces, you probably shouldn't be left to your own devices once the Taliban take over.
00:52:07.260 But I just think it's at scale really hard to vet a lot of these claims.
00:52:12.540 It's almost impossible to vet the sincerity of a lot of these claims.
00:52:17.640 So I don't know that even that I would agree with you.
00:52:20.800 Yes, that ideally sounds awesome.
00:52:23.440 But practically, that's kind of how we're in the situation that we're in now, because
00:52:29.600 there's just not a way to vet that at scale.
00:52:32.420 There's not a way to vet millions of people at scale, which it ends up becoming.
00:52:36.120 If you say there aren't millions of people in the world who are fleeing violence, there
00:52:41.300 definitely are.
00:52:42.440 There's millions of people.
00:52:44.060 So at scale, if we're actually thinking practically, how do we implement a system to vet these people?
00:52:49.940 You don't.
00:52:51.060 It just doesn't happen.
00:52:52.740 So again, where do you draw the line?
00:52:55.740 I think what you need to do is have a full, frank and honest conversation where there are
00:53:01.140 people who decide, who go, look, we can't take everybody in.
00:53:04.400 Looking at our infrastructure, looking at our economy, we can take X amount.
00:53:09.580 Certain percentage of them are going to be refugees.
00:53:12.420 We're going to prioritise people who've actually worked with, helped and supported the US in
00:53:17.960 places like Afghanistan.
00:53:19.680 They're going to take priority.
00:53:20.980 And you just have to accept that you wanted to be in government.
00:53:26.260 And part of being in government, in fact, a large part of it is actually making unpleasant,
00:53:30.760 unpopular decisions.
00:53:32.360 And if you want the part where people know you and you're the one making those decisions,
00:53:38.060 then you've got to accept that part of what you do is going to be unpleasant and you're
00:53:43.320 going to be hated.
00:53:44.020 And the fact that people are going to call you racist and white supremacist, I'm sorry,
00:53:48.780 that's just an excuse.
00:53:50.080 Get over it.
00:53:50.680 That would be really easy if we didn't make careers out of being politicians and bureaucrats.
00:53:55.240 They're not incentivised to do that.
00:53:56.660 They're actually incentivised to do the opposite.
00:53:59.040 Politicians and bureaucrats, they need us to like them for as long as possible.
00:54:03.220 Or they're thrown to the dogs.
00:54:05.360 They need us to like them.
00:54:07.240 Being likeable is actually a fundamental part of their entire job.
00:54:12.300 We don't encourage courage or bravery, like I said, because they have to be liked.
00:54:19.760 Their entire livelihood depends on being liked.
00:54:22.340 Do you know what, Ashley?
00:54:23.300 I think more and more people are starting to move away from that.
00:54:26.800 I think, you know, the number of people that I spoke to, you know, in places like Texas
00:54:31.820 and even in New York, where people are going, look, I don't like Trump.
00:54:37.500 I'll be honest with you, I think he's an arsehole.
00:54:39.360 But he was better than what we've got now.
00:54:43.840 And that's why I'm going to vote for him.
00:54:45.660 I think we're getting to that point.
00:54:47.160 I think people are sick of mealy-mouthed politicians who essentially have no backbone
00:54:54.060 and are not prepared to make difficult decisions.
00:54:56.840 I think people are starting to get to the point, maybe I'm wrong, where they're going,
00:55:01.180 you know what, I just want someone to be a leader.
00:55:03.500 They don't have to be the nicest person in the world.
00:55:06.120 I just don't want 100,000 people dying from fentanyl.
00:55:09.340 You disagree with us, though.
00:55:10.460 You think Republicans are going to lose the election.
00:55:13.380 I'm not as optimistic as some people that Trump is going to win.
00:55:18.740 And part of that is the way that the media is talking about Trump,
00:55:22.480 is that, you know, you had these people who were fudging poll numbers all throughout 2016,
00:55:27.840 2020, and now all of a sudden, and I think I spoke briefly about this recently,
00:55:34.200 but especially with the Iowa caucus,
00:55:36.260 they're calling the caucus for Trump within a minute of voting.
00:55:39.820 By all professional standards, we've never seen that before.
00:55:42.820 So when you have the New York Times, MSNBC, CNN,
00:55:46.940 all of these people calling it for Trump and showing polls where Trump is leading by double digits,
00:55:51.600 I think this is another form of voter suppression
00:55:53.980 because they know that if Republicans think they're leading by double digits,
00:55:57.840 they're going to be fat and lazy and they're not going to vote.
00:56:00.980 I think they know that if Republicans don't feel some sort of exigency
00:56:04.680 and like there's some imminent threat, they're not going to vote.
00:56:08.340 They're like, we got it in the bag.
00:56:09.460 Of course, nobody wants four more years of Biden.
00:56:12.480 I think that's my concern going into 2024.
00:56:16.860 I would say maybe that's true,
00:56:19.640 but then how much power do these organizations have now?
00:56:23.600 Is it that they're still ultra powerful or is their power radically diminished?
00:56:29.860 Which organizations?
00:56:31.040 Like the New York Times, MSNBC.
00:56:33.320 I mean, they're nowhere near as powerful as they used to be, is my point.
00:56:35.920 So they're not going to have a generational stronghold like they may have before.
00:56:39.800 So, you know, the boomers' kids are not going to find as much sincerity in the New York Times and CNN
00:56:44.820 as maybe previously before Trump, you know, blew the lid off of it
00:56:48.740 and we're like, oh, you guys don't tell the truth all the time.
00:56:50.820 But they do have a stronghold on the older voting generation.
00:56:54.500 What they're watching and reading every single day is these mainstream media outlets.
00:56:58.200 And these are the people who vote because as politically charged as Gen Z is in activism,
00:57:03.740 historically, the younger generations just don't vote.
00:57:06.780 They'll vote some, but not in the way that the older generations and the boomers do.
00:57:10.360 They've got better things to do.
00:57:11.520 Yeah, they have the world at their fingertips.
00:57:14.540 They're busy out there not having sex.
00:57:17.260 Not having sex, playing video games, you know, posting on Instagram, making TikToks.
00:57:21.380 And watching other people have sex.
00:57:22.860 Yes.
00:57:24.120 So I just, yeah, they still do have a really strong hold on the boomer mindset.
00:57:31.920 Like, especially when I talk to my grandmother or, you know, my parents, they'll say, oh,
00:57:37.360 I saw this on the news.
00:57:38.900 And we might scoff at that and say, well, why are you reading that?
00:57:41.640 They still very much find these, you know, that's where they get their news.
00:57:45.660 Where else are they going to go to?
00:57:49.740 They are going to have a brain aneurysm opening the app.
00:57:49.740 You know, they're like, what is happening?
00:57:51.700 It's like they're walking into a casino and they're on drugs.
00:57:55.300 Being on LSD in a casino is like boomers opening social media.
00:57:59.940 They're like, what's going on?
00:58:01.300 So they're still very much, oh, look, my top five headlines for the day.
00:58:04.940 They love that.
00:58:05.460 There's going to be a lot of angry boomers in the comments.
00:58:07.360 The boomers are going to be so mad.
00:58:09.060 They're like, I'm on X just fine.
00:58:11.220 But, you know, they're not on TikTok.
00:58:13.200 They're not on Instagram.
00:58:14.100 They're not on these.
00:58:14.920 I'm not on TikTok.
00:58:16.720 Exactly.
00:58:17.420 Because it's too much for so many people.
00:58:22.440 So, yeah, they still have to go back.
00:58:24.860 Yes, they still have an instrumental amount of power in terms of voting.
00:58:28.680 You know, when I have my grandmother asking me if Elon Musk is really a Nazi, yes, I think
00:58:34.940 they have an instrumental amount of power in terms of voting.
00:58:39.120 That's really interesting.
00:58:40.980 And one of the things I actually really wanted to talk to you about, actually, is, you know,
00:58:45.540 I kind of thought 10 years ago, we're kind of done with the conversation about anti-Semitism.
00:58:50.780 And now it seems on the left, we've just seen it, you know, absolutely come back in full force.
00:58:59.960 But I'm starting to see it on the right as well, where I'm going, I didn't expect it from you guys.
00:59:06.680 I've always said anti-Semitism is a bipartisan position.
00:59:10.400 Yes, and especially what's characterizing it now.
00:59:15.820 First of all, we just can't catch a break, can we?
00:59:17.840 But what's characterizing it now is really Marxism.
00:59:21.940 And it's hard not to laugh at the institutions like Columbia being taken over by these people.
00:59:29.400 And you have the Jewish professors.
00:59:30.800 I think it was David Shea or one of them who, you know, he had his card turned off at Columbia.
00:59:37.820 And if you go back and you see his tweets and his posts from during the height of the BLM riots,
00:59:43.200 he's saying, well, we don't need police and we should defund them.
00:59:45.720 And all of a sudden he's like, where are the police?
00:59:48.540 The Marxists have taken over the institutions that I raised them in, you know?
00:59:52.540 And it's fundamentally to me, yes, there are some people who think it's actually the Jews
00:59:57.740 that like run the world and this and that.
00:59:59.180 But I think it is inherently anti-white and anti-West rhetoric.
01:00:04.040 That's really, it's Marxism at its core.
01:00:07.180 It really doesn't, the Jews are just really easy to say it about because, you know, they do run so much.
01:00:12.560 It's like, oh, look, it must be them.
01:00:14.240 They're at the top.
01:00:15.240 In the same way, it mimics a lot of the DEI conversations.
01:00:19.440 It's like, well, the problem must be white men since they're a majority of CEOs.
01:00:23.700 It mimics that in a lot of ways.
01:00:25.280 But I don't think the issue is fundamentally this circa 1930s issue.
01:00:30.080 I think it really is Marxism.
01:00:32.020 I think it really is this anti-white rhetoric, this anti-oppressor rhetoric that we need to redistribute everything.
01:00:38.280 Because again, at all of these institutions, it's happening at all the universities,
01:00:42.000 that they're being the perfect little Marxist soldiers that you train them to be.
01:00:46.800 Of course they are.
01:00:48.480 The woman who, if you saw the video, there was the video of the woman who's like,
01:00:52.420 we need humanitarian aid after they took over Columbia.
01:00:56.440 We need water.
01:00:57.140 We need food.
01:00:58.100 This woman, I shit you not, she is, her focus, she was an assistant teacher, something like that.
01:01:05.740 PhD student.
01:01:06.560 PhD student with an emphasis in poetry through a Marxian lens.
01:01:12.360 That's not a joke.
01:01:13.680 That's not a Babylon Bee headline.
01:01:15.040 That is reality.
01:01:16.160 This woman had a, you know, she was a PhD student for poetry and imagination through a Marxian lens.
01:01:21.180 That is exactly what's happening.
01:01:22.620 Because I'm watching the same people who are rioting and looting through BLM and destroying my city.
01:01:28.400 Now they're sitting here and they're saying, you know, from the river to the sea.
01:01:31.440 It has nothing to do with Palestine.
01:01:33.180 Just like it had nothing to do with the black community and what they were going through either.
01:01:37.540 And I actually think the actual issue is being completely obfuscated.
01:01:42.060 The real issues in the black community were completely ignored and obfuscated.
01:01:45.920 The real issues for the Palestinians are completely obfuscated and ignored.
01:01:49.960 And you have these spoiled little rich kid brats asking for humanitarian aid at Columbia University where they're paying, you know, an ungodly amount of money.
01:01:58.800 Well, all this humanitarian aid talk is making me hungry.
01:02:02.240 Eat some humanitarian aid.
01:02:03.840 Let's order some from Uber Eats.
01:02:06.700 You've been in America too long, my friend.
01:02:08.720 I'm very hungry.
01:02:09.360 So, Ashley, it's been great having you on.
01:02:11.720 We will ask you a bunch of questions from our supporters that they've already submitted that only they will get to see on Locals in a second.
01:02:18.380 But we always end our conversations with the same question, which is what's the one thing we're not talking about that we really should be?
01:02:24.200 AI.
01:02:25.120 Absolutely.
01:02:25.700 AI and how we can have our principles, especially as it relates to free speech and liberty, how we can apply that to this new world that's run by AI and robots.
01:02:36.340 So, I think that's what we absolutely need to focus on.
01:02:40.160 Awesome.
01:02:40.520 Well, head on over to Locals where we continue the conversation.
01:02:44.520 How does fear of losing subscribers handicap people at the Babylon Bee when they're satirizing people on the right?