The Joe Rogan Experience - February 04, 2026


Joe Rogan Experience #2448 - Andrew Doyle


Episode Stats

Length

2 hours and 39 minutes

Words per Minute

201.9

Word Count

32,197

Sentence Count

2,861

Misogynist Sentences

52

Hate Speech Sentences

69


Summary

In this episode of The Joe Rogan Experience, Joe talks to Andrew Doyle, a writer, activist, and author, about the end of the "woke" movement and the rise of authoritarianism in the culture war.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:03.000 The Joe Rogan experience.
00:00:06.000 Train by day, Joe Rogan, podcast by night, all day.
00:00:12.000 Yes, Andrew.
00:00:14.000 Hello.
00:00:14.000 Good to see you, brother.
00:00:15.000 Good to see you.
00:00:16.000 It has been, you said, six years almost to the day.
00:00:19.000 The last time.
00:00:19.000 Almost to the day.
00:00:20.000 Lots changed.
00:00:21.000 Right before everything went crazy.
00:00:23.000 That's it.
00:00:24.000 Right before.
00:00:25.000 Yeah, the whole world sort of shifted.
00:00:26.000 Because everything went kooky around March, right?
00:00:28.000 Yeah, so it was February 2020, and then we have COVID, and then we have, you know, we've had Trump in between and that, we had BLM.
00:00:34.000 That summer of 2020, everything just exploded and went awry.
00:00:38.000 And yeah, and then everything shifted.
00:00:40.000 And then you wrote a book.
00:00:42.000 It's called The End of Woke, How the Culture War Went Too Far and What to Expect from the Counter Revolution.
00:00:49.000 Isn't that how it always goes, though?
00:00:50.000 It goes like we go too far and then we overcorrect and we become Nazis.
00:00:56.000 That's it, exactly.
00:00:58.000 It's the opposite.
00:00:59.000 We go socialist.
00:01:01.000 It's a big pendulum.
00:01:02.000 It sort of goes back and forth.
00:01:02.000 I get that.
00:01:04.000 I mean, in that book I'm trying to make the point that what woke was, was the latest manifestation of a kind of innate authoritarian impulse.
00:01:13.000 I think human beings are by default quite inclined towards just shutting people up if they don't like them.
00:01:19.000 Yeah.
00:01:19.000 Just imposing their authority.
00:01:21.000 And so with woke, I mean, a lot of people are annoyed that I've called it the end of woke.
00:01:25.000 I'm not saying it's all over.
00:01:26.000 Let's just go home, forget about it.
00:01:27.000 It's still going on.
00:01:28.000 But the point about it is that in its current manifestation, things are changing now so rapidly.
00:01:33.000 We are moving into some sort of new phase.
00:01:35.000 And that authoritarianism, which we've associated with the left, might come up from the right.
00:01:40.000 It could come up from anywhere.
00:01:41.000 It's what you say about the pendulum.
00:01:42.000 So you just have to be kind of vigilant about it.
00:01:44.000 I don't think we were vigilant.
00:01:45.000 I think that's why woke happened.
00:01:47.000 We weren't vigilant against this prospect that authoritarianism could emerge in what we thought was a free society.
00:01:54.000 Well, authoritarianism, it snuck in through a sheep costume.
00:02:01.000 A wolf in a sheep's costume.
00:02:02.000 Yeah, it was a costume of being more inclusive, being more open-minded, being a better society, being kinder.
00:02:12.000 It led to child trans surgeries, led to chaos.
00:02:16.000 It led to a lot of really fucking freaky things that you would have never expected.
00:02:22.000 People saying that the First Amendment's not important.
00:02:25.000 It's more important.
00:02:26.000 It's protecting people.
00:02:27.000 Well, that was the key, wasn't it?
00:02:28.000 The point was that the way it worked was that it was gulling people through language that sounded pretty sweet and kittenish and fluffy.
00:02:36.000 Things like equity.
00:02:37.000 Well, that sounds a lot like equality, doesn't it?
00:02:40.000 It doesn't mean equality.
00:02:40.000 It means treating people unequally to ensure equal outcomes according to group identity.
00:02:45.000 That's a very different thing.
00:02:47.000 You say you're talking about, let's make everything inclusive, but what you really mean is let's exclude anyone who disagrees with what we've got to say.
00:02:54.000 So you're using language to mean the exact opposite.
00:02:57.000 They say gender affirming care.
00:02:59.000 Do they mean that?
00:02:59.000 Or do they mean affirming what is effectively a pseudo-scientific belief among vulnerable people?
00:03:04.000 So it's all about misusing language because most people I think, or I like to think, are pretty decent.
00:03:11.000 Most people want to be kind and want to be fair.
00:03:13.000 And when you hear these activists saying, be kind, be compassionate, or else, right?
00:03:18.000 You know, you kind of think, okay, well, maybe their intentions are good, but also they're pretty scary.
00:03:24.000 I mean, there's a weird, there was a weird thing with the woke thing, which was that on the one hand, it proclaimed to be this sort of great, virtuous, kind, progressive, right side of history.
00:03:34.000 How often did you hear that phrase?
00:03:37.000 And at the same time, they're like dangerous dogs.
00:03:39.000 Like, you're like, oh, I better not piss them off.
00:03:42.000 I better not say the wrong thing in the workplace because they'll destroy you.
00:03:46.000 Well, I always find that the more preposterous the idea is and the less capable it is of standing up to scrutiny, the more violent the enforcement of that idea will be, because you cannot combat that.
00:04:00.000 You can't defend that idea with logic, so you have to defend it with fear and force and just shouting people down.
00:04:07.000 And that's what we saw.
00:04:08.000 And that's it's a natural impulse of human beings.
00:04:12.000 Like when you're arguing with a kid, you know, when you're a kid and you're arguing with a kid and you say something, you don't even know, you shut the fuck up.
00:04:12.000 Absolutely.
00:04:18.000 Like they just start scaring you.
00:04:20.000 So why is it, though, that some countries and some societies seem to protect themselves better than others against that impulse?
00:04:27.000 And I feel at the moment that the UK is kind of failing where America is to a degree succeeding, not obviously in all ways, but when it comes to the idea of freedom and free speech. Like, I think the UK has pretty much fallen to the woke insistence that you need to control people's language so that you can create this perfect society, which can never come anyway.
00:04:48.000 It's just you.
00:04:48.000 Well, I think it's been co-opted.
00:04:50.000 I think whatever organic version of that emerges naturally from society, where there's an overcorrection, I think in the UK, because you guys don't have free speech protections, it's just different over there.
00:05:04.000 Yeah.
00:05:04.000 You can get away with a lot of crazy shit.
00:05:07.000 Like, first of all, like, we should explain what we're talking about.
00:05:11.000 More than 12,000 people have been arrested in the UK in the past year for social media posts.
00:05:18.000 And if you read some of those social media posts, they're not even remotely terrifying.
00:05:23.000 It's not like I'm going to grab a knife and go cut the head off of every immigrant I see.
00:05:28.000 Like, hey, buddy, maybe we should lock this guy up and evaluate him.
00:05:31.000 He sounds like a crazy person.
00:05:32.000 Like, no, the immigrants are coming into this fucking country and creating all this crime.
00:05:36.000 Knock on the door.
00:06:38.000 You're going to jail.
00:06:38.000 I worry that Americans think we're mad.
00:05:40.000 Sometimes I think...
00:05:41.000 We do.
00:05:42.000 We do now.
00:05:42.000 Do you?
00:05:43.000 Yeah, we think you've lost it.
00:05:44.000 Yeah.
00:05:45.000 Well, we also think something happened where your leaders are intentionally trying to tank your country.
00:05:52.000 It seems like they're trying to bring in as many migrants as possible, cater to them, not to the British people, and do it openly so that everyone knows what they're doing and then create chaos on the streets because of it.
00:06:06.000 Yeah, I mean, people have a phrase for that, anarcho-tyranny, you know, where you punish people who aren't breaking the law, but you protect those who are.
00:06:13.000 That's right.
00:06:14.000 I mean, I don't know the extent to which Americans know this, but the stat you quoted came from the Times newspaper in London, which did a freedom of information request to the police and found out that it's 12,000 a year on average.
00:06:26.000 So that's like 30 a day, not just being investigated or looked into, but being arrested.
00:06:31.000 But over the last few years only, if you go back, it's only like 1,000 or 500.
00:06:37.000 It was 3,000 last time we spoke, back in 2023, was it really?
00:06:40.000 That was.
00:06:41.000 Back then?
00:06:42.000 Oh, my God.
00:06:42.000 Yeah.
00:06:43.000 So we already had that problem.
00:06:45.000 I mean, we already didn't know it was that many.
00:06:46.000 That's crazy.
00:06:47.000 Even back then?
00:06:48.000 It was already really high.
00:06:49.000 I mean, we had stuff like the old stories of like, there was that guy in 2010 who made a joke online about, he was at Doncaster Airport in the UK.
00:06:57.000 He said, oh, if this queue doesn't hurry up, I'm going to blow up the airport.
00:07:00.000 Just a stupid, funny tweet.
00:07:02.000 He went all the way to court.
00:07:03.000 That was a full trial.
00:07:05.000 So these laws, and I think what happens with this stuff is people don't realize how long this has been embedded in the UK.
00:07:12.000 We have hate speech laws that are encoded in a number of different legislations.
00:07:16.000 We have a thing called the Public Order Act.
00:07:18.000 We have a thing called Malicious Communications Act.
00:07:20.000 That's from 1988.
00:07:21.000 We have the Communications Act from 2003.
00:07:24.000 And all of these things criminalize.
00:07:26.000 I tell you, I kid you not.
00:07:28.000 The language in the statute books is if it's grossly offensive.
00:07:31.000 That's the phrase.
00:07:32.000 If you post something that is grossly offensive, you can go to court.
00:07:35.000 You can be prosecuted.
00:07:36.000 But, you know, I find... So subjective.
00:07:38.000 Well, that's it.
00:07:39.000 What does that even mean?
00:07:40.000 I find laws against free speech to be grossly offensive.
00:07:44.000 So should the British state be arrested?
00:07:46.000 I don't know.
00:07:47.000 And there's one, I think it's in the Malicious Communications Act, where it talks about needless anxiety, causing needless anxiety can get you arrested.
00:07:56.000 And you think that's not a thing.
00:07:59.000 I can give you a specific example.
00:08:00.000 Do you smoke cigars?
00:08:02.000 I have once.
00:08:03.000 My friend Winston Marshall.
00:08:05.000 I worry that if I try it, I'll cough and I'll look really wimpish and pathetic.
00:08:10.000 And it'll be good for your arguments.
00:08:12.000 It will backfire.
00:08:13.000 It'll undermine everything.
00:08:14.000 It'd be like I'm sitting here with a paper hat on at Christmas, undermining all of my key points.
00:08:19.000 Right.
00:08:20.000 I like the flavour and I like being around smokers because my grandmother used to chain smoke around me.
00:08:25.000 So it's kind of...
00:08:26.000 Oh, boy.
00:08:27.000 Well, she's Northern Irish, you know.
00:08:28.000 It's the way they do.
00:08:29.000 She used to give me whiskey when I was three to calm me down, you know.
00:08:31.000 Oh, wow.
00:08:32.000 It's that sort of family.
00:08:33.000 That's an old thing they used to do with kids.
00:08:35.000 They'd put it on their finger and put it in the baby's mouth.
00:08:38.000 Like they would dip their finger in whiskey and rub it on the inside of a kid's mouth.
00:08:42.000 If you're struggling with a child, get it drunk.
00:08:44.000 That's how you do it.
00:08:45.000 It's old Northern Irish wisdom.
00:08:48.000 I don't think you should scoff at it.
00:08:50.000 It's a good thing.
00:08:50.000 But I'll be more than happy to.
00:08:51.000 It's grossly offensive.
00:08:53.000 It's grossly offensive.
00:08:55.000 The example I was going to give was this guy called Darren Brady.
00:09:00.000 And this sounds made up.
00:09:01.000 And whenever I tell people this, it sounds made up.
00:09:03.000 He posted a meme.
00:09:05.000 I don't know if you saw this meme where it was the four Progress Pride flags.
00:09:09.000 You know, it's got the crazy triangles and stuff in it.
00:09:12.000 You put them all together and they become a swastika.
00:09:13.000 Exactly that.
00:09:14.000 And that was going everywhere.
00:09:15.000 And he posted it.
00:09:16.000 And there's a video of him being arrested, put in handcuffs.
00:09:18.000 He's an army veteran, by the way, right?
00:09:20.000 Put in handcuffs by the police.
00:09:21.000 And the policeman says in the video, you caused someone anxiety.
00:09:25.000 So the actual language from the law is being used for this rearrangement of the...
00:09:30.000 And you know what?
00:09:31.000 That's quite a good satirical point that he was making.
00:09:34.000 It wasn't even his meme.
00:09:36.000 He was just retweeting a meme.
00:09:37.000 But even if it was some horrible, offensive thing, who cares?
00:09:40.000 How is that offensive?
00:09:42.000 Well, I guess, I mean, well, you can find that's the problem.
00:09:45.000 You could find anything offensive.
00:09:47.000 You could find anything grossly offensive if you're extremely sensitive.
00:09:52.000 You could.
00:09:53.000 But wasn't there a point to that?
00:09:54.000 I mean, he was kind of saying that the LGBTQIA plus movement has become quite authoritarian.
00:10:00.000 Yeah.
00:10:00.000 He's not saying they're actual Nazis.
00:10:02.000 And he's saying, oh, isn't it quite funny that when you put them together, it looks like a swastika.
00:10:06.000 The idea that you get handcuffed for that.
00:10:08.000 Especially for a retweet.
00:10:10.000 Yeah, that's crazy.
00:10:10.000 That's crazy.
00:10:12.000 It's retweets, it's tweets, it's posts.
00:10:14.000 We've had memes are the big ones.
00:10:16.000 So there was a guy called Lee Joseph Dunn who went to prison for eight weeks.
00:10:20.000 That was last year, I think, for three memes that he posted.
00:10:23.000 Eight weeks.
00:10:24.000 Eight weeks in prison.
00:10:26.000 So again, I'll tell you what the most offensive of the three memes was, and you can tell me whether you think it was worth prison time.
00:10:33.000 He put a picture of some immigrants with knives, and underneath it said coming to a town near you.
00:10:40.000 And that was it.
00:10:42.000 So I don't know if you think that's worth prison time.
00:10:44.000 That's the most offensive one?
00:10:45.000 Of the three, that's the most.
00:10:46.000 What's the least offensive one?
00:10:47.000 I can't remember what the other two were.
00:10:49.000 Because I remember I looked at them, I thought, well, that's not even worth, that's not even worth thinking about.
00:10:53.000 But this one was the one that really, because they say in England, you're stirring up hatred against minorities through the spreading of the meme.
00:11:02.000 Right.
00:11:02.000 You know, but that's clearly not sufficient.
00:11:06.000 And I think in the US, you have far more protections.
00:11:09.000 I wonder whether it's to do with the fact that in the US you have the First Amendment.
00:11:12.000 So you have something codified that says you can say what you want.
00:11:17.000 We've never had that.
00:11:18.000 It's very important.
00:11:19.000 And it didn't seem important 20 years ago or 30 years ago because no one ever looked at England as being that kind of a country that would just put people.
00:11:28.000 Well, obviously, this was all pre-social media.
00:11:31.000 Yeah, yeah.
00:11:31.000 And England has always been a fairly polite society.
00:11:34.000 Yes.
00:11:36.000 But the thing is, like, now pub talk has become illegal, right?
00:11:40.000 Like if you say something offensive in a pub, you're subject to be arrested and they're asking people to turn people in.
00:11:40.000 Yeah.
00:11:46.000 There's a thing called the banter ban, which the Labour government was trying to put in.
00:11:51.000 Here's the logic of the banter ban.
00:11:54.000 I've forgotten about this, but now you've mentioned it.
00:11:56.000 They wanted to introduce this law so that, for instance, if you're working in a bar or a pub and you overhear someone who says something against your protected characteristic, say you're a gay barman, and someone says, oh, I don't like the gays or something, and you overhear it, your employer has a duty to protect you from that kind of hate speech, that kind of harm.
00:12:13.000 So therefore, there's going to be a blanket ban on speech, on certain kinds of speech within the pub, right?
00:12:20.000 I would say the guy who's eavesdropping, he's the problem, right?
00:12:23.000 You shouldn't be listening in on other people's conversations.
00:12:25.000 So that's a real thing.
00:12:27.000 And I guess it all comes down to this view, which I think is completely wrong, that words and violence are the same thing, that words can create a more violent society, that there's a direct causal link between the stuff that people say and the stuff that people say online to how people behave in the real world.
00:12:44.000 And I think you guys have got it right, because you've got the Brandenburg test.
00:12:47.000 You know about the test for incitement to violence in the US.
00:12:50.000 No, what is that?
00:12:52.000 It's basically a test that was established, I think, back in the 60s.
00:12:55.000 It was a KKK leader called Clarence Brandenburg who was prosecuted for incitement to violence.
00:13:00.000 And the test that was established since that precedent was that any words that can be convicted for incitement to violence, they have to be intended to cause violence, likely to cause violence, and the violence must be imminent.
00:13:13.000 And if you satisfy that threshold, you can be prosecuted in the US for incitement to violence.
00:13:18.000 So it'd be like kind of imagine a demagogue surrounded by all his fans, whipping up a frenzy and then pointing to a guy on the front row and saying, kill him now.
00:13:25.000 That would qualify for the Brandenburg test.
00:13:28.000 But in the UK, because we don't have that test, all we've got is whether people found it offensive.
00:13:35.000 That's the difference of the threshold.
00:13:37.000 So it's a massive difference between what the US has and what the UK has.
00:13:40.000 Massive.
00:13:40.000 It's insane.
00:13:41.000 I mean, to give the most obvious recent example, because I don't know if people know about this, there's a woman called Lucy Connolly in the UK.
00:13:49.000 I don't know if this was reported over here at all.
00:13:51.000 Do you remember we had all these riots last year during the summer against hotels which were housing asylum seekers and people were setting fire to them?
00:13:59.000 There were genuinely racist stuff going on during those riots.
00:14:03.000 And this was off the back of a guy who'd murdered a bunch of little girls in a dance class.
00:14:08.000 And there were rumors going around that this was an asylum seeker, right?
00:14:11.000 And this one woman, Lucy Connolly, a mother who'd lost her daughter, very sensitive about the idea of loss of kids.
00:14:19.000 She tweeted in a fit of anger, go and burn down all the hotels for all I care.
00:14:25.000 If that makes me racist, so be it.
00:14:27.000 And take the government with you, something like that.
00:14:29.000 And she deleted it within a couple of hours.
00:14:31.000 She went out and walked a dog, she deleted it, and she thought, I really, that's not me, that's not who I am.
00:14:35.000 Deleted it.
00:14:36.000 Police came, went to court, sentenced to 31 months in prison for that swiftly deleted tweet, and she served over a year.
00:14:46.000 Oh, my God.
00:14:47.000 Now, I'm not saying the tweet was nice, right?
00:14:49.000 The tweet was a horrible tweet.
00:14:51.000 And she says it was a horrible tweet.
00:14:52.000 That's why she deleted it.
00:14:54.000 But because we don't have that Brandenburg test, we don't have a test for incitement to violence.
00:14:58.000 Because the key is that tweet, there was no way it could have, she was a nobody, you know, she wasn't someone with influence.
00:15:03.000 She didn't have many followers.
00:15:06.000 No one was going to read that and go and act upon it.
00:15:10.000 And if they did, that would be on them, right?
00:15:12.000 Because this is a myth, this myth that people act on cue to what they read online.
00:15:18.000 Well, it isn't real.
00:15:19.000 It influences people, for sure.
00:15:21.000 But at what point are you required to have sovereignty over your own mind and your own actions?
00:15:29.000 Yeah.
00:15:30.000 Well, I think what it does is it raises the temperature, particularly when political leaders do it.
00:15:34.000 Right.
00:15:34.000 But when political.
00:15:35.000 But my point is, like, it's not going to incite you to violence.
00:15:39.000 It's not going to incite me to violence.
00:15:41.000 So who are we talking about?
00:15:43.000 This is part of the thing is like they're protecting the dumbest members of society.
00:15:48.000 This is like the thing about banning, you know, crazy talk online.
00:15:52.000 If you're talking about witches or, you know, whatever it is, flat earth.
00:15:56.000 Like, we have to stop misinformation.
00:15:58.000 From who?
00:16:00.000 It's not working on you, right?
00:16:01.000 Yes.
00:16:02.000 So who are we protecting?
00:16:02.000 You don't believe it.
00:16:04.000 We're protecting the dumbest people.
00:16:06.000 Also, aren't you kind of letting them off?
00:16:07.000 Like, if someone goes and commits an act of violence and said, oh, I did it because someone told me to do it, aren't you kind of letting them off the hook?
00:16:14.000 Right.
00:16:14.000 Exactly.
00:16:15.000 And sort of displacing the blame.
00:16:16.000 You know, it's like that guy who shot John Lennon, who said Catcher in the Rye made him do it.
00:16:22.000 Reading the book Catcher.
00:16:23.000 Are we now blaming J.D. Salinger for the murder of John Lennon?
00:16:25.000 It was John Lennon, wasn't it?
00:16:26.000 I think he did.
00:16:27.000 So do you, I think the safest approach is to say people are responsible for their own actions.
00:16:33.000 I think the best that you could say is when political leaders and people with clout say things like that, it'd say, you know, it's fine to go out and commit violence.
00:16:42.000 I think what they do is they create a kind of imprimatur of approval.
00:16:46.000 They create this kind of sense that if you do it, the people in charge will have your back.
00:16:50.000 If you do it, it's okay.
00:16:51.000 Well, this was the argument with Trump for January 6th.
00:16:55.000 And that's why the BBC edited his speech to make it look as if that's what he was saying.
00:17:00.000 You saw that clip, right?
00:17:01.000 Oh my God, it's fucking crazy.
00:17:03.000 I mean, I've been saying for a long time the BBC has a real, like what I will say in the BBC's defense is they've always been pretty good at being party politically neutral.
00:17:12.000 Like they will interrogate someone in the right and someone in the left in a pretty neutral way.
00:17:18.000 They don't, I think they do pretty good.
00:17:19.000 I know people will be annoyed at me for saying that, but I think they do.
00:17:22.000 But I think in terms of the ideology, the woke ideology, they got captured.
00:17:25.000 They have a thing at the BBC called the LGBT desk, or they had it up until recently, which could veto any news story, which meant that any story that was slightly critical of trans activism or anything like that just didn't get reported.
00:17:38.000 So I'm not surprised... the BBC gave them veto power?
00:17:42.000 They gave them veto power, yeah.
00:17:43.000 That's crazy.
00:17:44.000 This all came out in a report, quite a recent report just a few months ago, which led to the resignation of Tim Davie, the director general.
00:17:49.000 And he resigned ostensibly because of that Trump clip, which, by the way, that wasn't the first time they did it.
00:17:55.000 There was another clip about a year before in a different program that did the same thing, took the clip, re-edited it, and made it look like he had said something he absolutely had not said.
00:18:06.000 So I think the BBC quite obviously has an ideological bias, if not a party political bias.
00:18:12.000 But that's more than a bias.
00:18:14.000 It's misleading, right?
00:18:16.000 It's completely deceptive.
00:18:18.000 You're editing something and changing its meaning.
00:18:20.000 I mean, they took out a giant chunk of his speech.
00:18:24.000 This episode is brought to you by 1-800Flowers.com.
00:18:27.000 Valentine's Day is coming up.
00:18:30.000 It always sneaks up on people.
00:18:32.000 If you want an easy way to absolutely crush it this year, this is it.
00:18:37.000 1-800 Flowers Roses.
00:18:39.000 They're bigger and actually last.
00:18:41.000 Plus, they back it with a seven-day freshness guarantee so you can feel confident that you're sending the best.
00:18:48.000 Here's the deal.
00:18:49.000 Right now, they've got this double blooms offer.
00:18:52.000 You buy one dozen roses, they double it to two dozen for free.
00:18:57.000 No catch.
00:18:58.000 Same price, way bigger statement.
00:19:00.000 They also do same-day delivery nationwide.
00:19:04.000 So even if you waited longer than you should have, you're still good.
00:19:07.000 This is one of those rare situations where doing something big is actually easy.
00:19:13.000 Go to 1-800Flowers.com/Rogan to get the double blooms offer.
00:19:18.000 Buy one dozen, they double it to two dozen roses free.
00:19:22.000 That's 1-800flowers.com/Rogan.
00:19:27.000 I forget how many minutes it was.
00:19:28.000 They left out like 45 minutes or something, so it made it look like he said something crazy.
00:19:32.000 Yeah, it made him look like he was saying go and commit violence. He was tongue-in-cheek talking about the senators and congresspeople, saying they're doing a great job, and said all this other stuff.
00:19:48.000 It's so weird that you have to fight like hell to keep your country.
00:19:51.000 I mean, no offense, but you can find daft stuff that Trump says pretty easily, right?
00:19:56.000 You don't need to edit that stuff down.
00:19:58.000 Well, it's because they had an opportunity to, like what we were saying before earlier, we were talking before the show, you can put out a narrative and it doesn't have to be true, and then that's the one that sticks.
00:20:10.000 So that's the one that spreads wide.
00:20:11.000 And then when all these years later, they have to have this trial and everybody finds out it's not true.
00:20:19.000 But the damage is done.
00:20:20.000 I mean, that's what they did with Trump during the whole Steele dossier.
00:20:24.000 You know, the hookers and peeing on people and all that crazy shit.
00:20:28.000 Remember that?
00:20:29.000 I remember the idea that he'd hired hookers to urinate on the bed that was once occupied by the Obamas.
00:20:34.000 Something like that.
00:20:36.000 The reason I didn't believe that is I don't think Trump is that avant-garde.
00:20:38.000 I don't think he's that creative.
00:20:40.000 Like if he'd have come up with that, I'd have been actually applauding that.
00:20:42.000 That's kind of amazing.
00:20:43.000 But obviously he didn't do that.
00:20:45.000 It's not even applaud.
00:20:46.000 That just sounds like a work of art.
00:20:47.000 It's ridiculous.
00:20:49.000 Putting urination on the bed of your enemy through the medium of prostitution.
00:20:52.000 I think that's kind of an artistic thing to do.
00:20:54.000 But I don't think he did it.
00:20:55.000 Obviously, he didn't do it.
00:20:56.000 None of it's true.
00:20:57.000 Right.
00:20:57.000 But you put that.
00:20:58.000 But isn't that weird that that in particular, that's like something I don't think anyone seriously could believe.
00:21:02.000 Well, there's plenty of people that believed it.
00:21:04.000 Yeah, they don't have to believe it.
00:21:06.000 They just say it.
00:21:07.000 Like, that was the whole point about, you know, the trial where he got arrested and convicted on 34 felony counts, none of which were actually felonies.
00:21:20.000 That's all bookkeeping deception.
00:21:23.000 That was the paying off of the girl.
00:21:25.000 So now you can say he's a convicted felon.
00:21:28.000 You can just say that.
00:21:30.000 And even though all those counts were misdemeanors, all of them had passed the statute of limitations.
00:21:36.000 But for some reason, through no legal way that anybody could ever really honestly explain, they decided to label it a felony.
00:21:45.000 And it was just to turn him into a felon.
00:21:47.000 I saw even left-leaning anti-Trump lawyers saying this is not how the law should work.
00:21:52.000 You can't artificially elevate a misdemeanor to a felony outside the statute of limitations.
00:21:56.000 The thing is, if you do that, they're going to do that to you.
00:21:56.000 Crazy.
00:22:00.000 It's like we're going to give that kind of power to the Republicans.
00:22:02.000 And now when they're in office, they're going to start doing things like that.
00:22:05.000 Are we crazy?
00:22:06.000 Well, also, this really bothers me.
00:22:08.000 One of the key things that I think has happened over the past few years is this complete lack of fealty to the truth from both sides.
00:22:15.000 It's whatever is convenient matters more.
00:22:17.000 A complete lack of intellectual curiosity.
00:22:19.000 A complete lack of investigating and looking and thoroughly checking.
00:22:23.000 And by the way, with the BBC, that really matters because unlike the news media here, which can be as partisan as it likes, the BBC is the state broadcaster.
00:22:31.000 It's got a responsibility by charter to not be, you know, to be balanced, to be even-handed.
00:22:37.000 And it completely failed.
00:22:39.000 And I saw today, just this morning, some people, you know, we've got all the mania about the Epstein files at the moment.
00:22:44.000 Some activists have now said J.K. Rowling once invited Epstein to the opening of her theater, her play.
00:22:51.000 Never happened.
00:22:52.000 But because there's a furore about Epstein at the moment, they're just saying it happened.
00:22:56.000 It gets spread all over the place.
00:22:57.000 That's all you have to do.
00:22:58.000 And that's all you have to do.
00:22:59.000 And then the damage.
00:23:00.000 And then that gets repeated.
00:23:02.000 Oh, didn't this happen?
00:23:03.000 I know.
00:23:04.000 Like what you say about Trump is right.
00:23:05.000 I always hear that he's a convicted felon.
00:23:06.000 He's a convicted felon.
00:23:07.000 Well, why don't you pause for a minute and assess whether or not that conviction is sound or whether it was politically motivated or how helpful that is.
00:23:15.000 But like you say.
00:23:16.000 Also, it's like it's such a dangerous precedent to send.
00:23:20.000 It's terrible.
00:23:20.000 Like if you do that, look, right now in the United States, the media predominantly leans left except for Fox News, the mainstream large-scale media.
00:23:31.000 I guess CBS is probably going to lean more right now.
00:23:34.000 Yeah, yeah.
00:23:35.000 It seems like it's in the process of that.
00:23:37.000 But for the most part, when you watch CNN, if you watch MSNBC, if you watch the mainstream news, it's very left-leaning.
00:23:45.000 But if the fucking, if right-wing people started, if it was like more common for the news to be right-leaning, and then they started doing the exact same thing about a left-leaning candidate.
00:23:58.000 Yeah.
00:23:59.000 This is so dangerous.
00:24:01.000 And the idea that the left doesn't recognize that, which are the people that have always been in support of free speech.
00:24:07.000 It's never been a right-wing thing to support free speech until now.
00:24:11.000 It's always been a left-wing thing.
00:24:12.000 When I was a kid, it was famously the case of the ADL defending Nazis having the right to protest and saying, look, we think what they're saying is abhorrent, but it's very important that you get the right to say whatever you feel.
00:24:25.000 And then the way to combat that is with much better, more concise speech that's much more logical and makes sense.
00:24:32.000 And this is what you do.
00:24:33.000 This is what debate is for.
00:24:35.000 We've always known this.
00:24:36.000 Yeah, but I mean, I agree.
00:24:38.000 I'm so dispirited by that.
00:24:40.000 That very thing that you've identified, that the left used to be about this.
00:24:43.000 The left used to be all about, I mean, that example you mentioned of Skokie, wasn't it, in Chicago, the Nazis marching through Skokie and the ACLU saying, we're defending this.
00:24:52.000 There was a book by a guy called Aryeh Neier, who was the head of the ACLU, called Defending My Enemy.
00:24:57.000 Yeah, it wasn't the ADL.
00:24:57.000 It was the ACLU.
00:24:58.000 It was the ACLU.
00:24:59.000 And he was saying, you know, he's Jewish.
00:25:03.000 He's got family members who died in the Holocaust.
00:25:06.000 But he's writing a book saying, I'm defending neo-Nazis' right to free speech, not because I support them, but because I don't.
00:25:11.000 And I want to defend the principle whereby I can tackle them.
00:25:14.000 And that's speech.
00:25:15.000 Right.
00:25:15.000 So in other words, the principle is so much bigger.
00:25:18.000 I mean, the thing that I think has been lost.
00:25:20.000 And now, by the way, the ACLU, complete about turn.
00:25:22.000 I mean, there was a lawyer for the ACLU tweeting about how he wanted Abigail Shrier's book banned.
00:25:27.000 And he said, this is the hill I will die on.
00:25:29.000 You know, that's a guy called Chase.
00:25:31.000 I think it's a trans activist called Chase something.
00:25:33.000 I can't remember.
00:25:33.000 Anyway, but the point is, how far have you fallen?
00:25:36.000 When it comes to these free speech issues, left or right, it's nothing to do with it.
00:25:41.000 It should be about this principle of, it's not whether you agree with what they're saying and the substance of what they're saying.
00:25:46.000 It's whether you want the principle intact.
00:25:48.000 And that principle applies to us all.
00:25:50.000 The very same principle that allows the Nazis to say all their crazy stuff is the principle that allows us to challenge it, to tackle it.
00:25:58.000 Well, it's a very short-term win.
00:26:01.000 And it's basically they're playing chess and they decided, I want that rook no matter what.
00:26:05.000 And then they just sacrificed their queen.
00:26:07.000 Like, look what you've done.
00:26:08.000 Look what you've done for this short-term victory.
00:26:11.000 You're essentially tanking civilization for a decade where we have to sort this out and, like, let the ship wash itself back and forth until it rights.
00:26:21.000 Yeah, so and how do you ensure that it's not going to happen to you?
00:26:24.000 Like I think about that.
00:26:25.000 There was a national conservative conference in Brussels about a year and a half ago.
00:26:29.000 The local mayor said, I don't like this.
00:26:31.000 And he had the police rush it, shut it down.
00:26:33.000 And you had mainstream right-wing figures like Nigel Farage, Suella Braverman.
00:26:38.000 How do they not think, hang on a minute, if we establish that precedent where you can just shut down your political opponents through the use of police force, how will that not rebound on me?
00:26:46.000 How will that not happen to us?
00:26:47.000 Well, this is the argument that they're using right now for Trump going after his political opponents.
00:26:51.000 Right, right.
00:26:52.000 Because they opened that Pandora's box, right?
00:26:53.000 You guys did that with him.
00:26:56.000 And everybody was saying how damn dangerous it is.
00:26:59.000 You can't fucking do that.
00:27:01.000 Even if you hate the guy, if there's a real crime that you can get someone, but when you take a crime like the bookkeeping stuff and turn it into a felony that could put this man in jail for the rest of his life for doing something that turns out to be legal, you can pay people to shut up.
00:27:17.000 And this is so, it's just, it's so weird that people for this short-term gain are willing to tank, which is essentially this whole structure of our civilization that allows free discourse.
00:27:29.000 You need it.
00:27:30.000 It's so important.
00:27:32.000 So important to be able to communicate and talk.
00:27:34.000 If podcasts didn't exist and there was no way to talk through ideas other than mainstream news, we would still be stuck in some very bizarre 1990s or 1980s narrative about how the world works.
00:27:49.000 We would have real problems.
00:27:51.000 We'd have real problems if there wasn't independent journalism like on Twitter and on wherever they can post.
00:27:58.000 Yeah, so why don't they get it?
00:27:59.000 I mean, we've had like people in left-leaning papers in the UK calling for Elon Musk to be arrested because he's allowing free speech on X or Twitter, whatever you want to call it.
00:28:08.000 Well, their offices got raided today.
00:28:11.000 Did some country?
00:28:12.000 There was a country where X's offices got raided.
00:28:17.000 I think one of the things was, they somehow or another... I think it had something to do with child pornography.
00:28:25.000 Where was that?
00:28:26.000 France.
00:28:26.000 France.
00:28:27.000 Fresh investigation into Grok.
00:28:30.000 And what is it?
00:28:31.000 What are the suspected offenses including unlawful data extraction and complicity in the possession of child pornography?
00:28:41.000 Yeah, but that's not what this is about.
00:28:42.000 This is because people have been misusing Grok to like, put bikinis on women they like, or even, in a few horrible cases, creating child, child sexual.
00:28:51.000 Wait a minute.
00:28:51.000 You can do.
00:28:52.000 You can't create child pornography.
00:28:54.000 I don't think so. No, or at least I think that's very much been shut down and safeguarded, right?
00:28:58.000 I think that's what's happened.
00:28:59.000 I mean, unless there's like some sort of a loophole where you could get it to do it.
00:29:04.000 Among the potential crimes it said it would investigate were complicity in the possession or organized distribution of images of children of a pornographic nature, and infringement of people's image rights with sexual deepfakes. Okay, the sexual deepfakes, yeah.
00:29:17.000 So a sexual deepfake is like if you put Hillary Clinton in a bikini and made her hot. That's a sexual deepfake.
00:29:23.000 Okay, right. Fraudulent data extraction by an organized group. I think you can still do some of that stuff.
00:29:29.000 You can put people in bikinis. Yeah, I think you can do that.
00:29:32.000 So like if you wanted to take Shaquille O'Neal and put him in a bikini, you could say you're sexualizing him. Okay, yeah, I mean, I guess you can do that.
00:29:40.000 Yeah so, but that's what?
00:29:41.000 So that'll be why.
00:29:42.000 You know, recently Keir Starmer, the UK Prime Minister, said he was considering it, or not necessarily.
00:29:47.000 Not that he was going to ban X, but it wasn't off the table.
00:29:49.000 Something like that, as though he's going to do that.
00:29:52.000 But this is always the excuse, like, yeah, we're protecting children, right? And look, no one wants that sort of stuff, right?
00:29:58.000 No one wants deep fakes of kids, obviously.
00:30:01.000 But there's far, I mean, looking at the stats on that, there's far more child sexual exploitation on Snapchat, for instance.
00:30:07.000 But they don't go after Snapchat, because Snapchat isn't the forum where Keir Starmer is getting criticized every single day and brutally hauled over the coals by people checking his facts.
00:30:16.000 One of the best things about X recently is the community notes, checking journalists and politicians in real time with facts.
00:30:22.000 They hate it, they hate that.
00:30:24.000 So no wonder they're going after X.
00:30:26.000 Yeah, Biden got cooked by community notes multiple times.
00:30:29.000 Yeah yeah, to the point where the administration was taking down posts.
00:30:32.000 Yeah, so did the Guardian, the left-leaning newspaper.
00:30:35.000 It flounced off X with a big statement saying, we're going to Bluesky.
00:30:38.000 We've had it, we're off to Bluesky.
00:30:40.000 It was such a flounce.
00:30:42.000 And then, of course, everyone was retweeting all their community notes.
00:30:45.000 They had loads. Loads of them.
00:30:47.000 Of course.
00:30:47.000 Just absolutely loads of them.
00:30:48.000 Because it's not true.
00:30:49.000 And, you know, especially when it's open to the whole world.
00:30:52.000 Yeah.
00:30:52.000 And people that aren't stuck under your guidelines, like in America, we could just talk shit.
00:30:57.000 And I think the reason why it's in France probably has a lot to do with Candace Owens.
00:31:02.000 Oh, yes, that makes complete sense.
00:31:03.000 Yeah, that might be.
00:31:04.000 Jean Macron.
00:31:05.000 And like, I mean, how many times did that get shared?
00:31:08.000 Yeah, exactly.
00:31:09.000 I mean, that is.
00:31:10.000 That makes sense of it now.
00:31:11.000 By the way, there's a real quick way to solve that.
00:31:14.000 Open chromosome test.
00:31:16.000 Go ahead.
00:31:17.000 Oh, I thought you were going to be a bit more graphic than that.
00:31:19.000 No, you don't have to.
00:31:19.000 Well, I don't have to.
00:31:20.000 Because that doesn't really solve it because you could, unless, I mean, there's no operation, but if she's gone through a surgery, then you could show a picture and it's probably pretty realistic, especially when was the last time you saw a 70-year-old lady's cooter?
00:31:34.000 Last week.
00:31:35.000 Yeah.
00:31:35.000 Congratulations.
00:31:35.000 Oh.
00:31:36.000 I'm just interested in that sort of stuff.
00:31:37.000 Well, you know, you're allowed to be curious in this country.
00:31:40.000 That's actually a really good example, though, isn't it, of the just something so obviously not true just going all over the world.
00:31:47.000 In a matter of moments.
00:31:48.000 Is it not true, though?
00:31:49.000 Well, that Macron's wife is a man.
00:31:52.000 100%.
00:31:52.000 Yeah, that's not true.
00:31:53.000 Well, you know, the burden of proof is on those who want to say that it is true.
00:31:57.000 The reality of the story is weird enough without it being true.
00:32:00.000 Like the 40-year-old man and the – He was – wasn't she his school teacher?
00:32:04.000 Yeah.
00:32:04.000 40.
00:32:04.000 She was 40.
00:32:05.000 She was 40 if it was, if it is actually a woman.
00:32:08.000 She was 40 and he was 15.
00:32:10.000 That's crazy.
00:32:11.000 And everyone says, well, they're French.
00:32:14.000 That seems to be the thing.
00:32:15.000 What a wild country.
00:32:18.000 People just say that's the way it works in France.
00:32:20.000 Yeah, but again, look, I would say with all of this stuff, you need some sort of proof.
00:32:27.000 Wasn't it the Carl Sagan thing about extraordinary claims require extraordinary evidence?
00:32:30.000 I think that's a pretty safe dictum.
00:32:31.000 The idea that, okay, anything could be true.
00:32:34.000 You know, there have been crazy conspiracies that turned out to be true.
00:32:39.000 So I'm not, I would never rule anything out.
00:32:41.000 But what I'm saying is if you're going to make a claim like that, you better be damn sure you've got really solid evidence of it.
00:32:47.000 She's got hours-long documentaries on this.
00:32:51.000 Yeah, and are they persuasive?
00:32:53.000 I haven't watched them.
00:32:54.000 Do you think I have that kind of time, dog?
00:32:54.000 I haven't watched them.
00:32:57.000 Well, you should do.
00:32:58.000 You should do your research before you're part of the problem.
00:33:02.000 Outrageous.
00:33:03.000 I can't do research on that.
00:33:05.000 I want to wait till it plays out in court.
00:33:07.000 But whenever I do do research, I'll give you the example from this week, just because I'm reading it now.
00:33:12.000 A woman's written a book claiming that Shakespeare was a black woman.
00:33:15.000 Oh, I saw that.
00:33:16.000 Yeah.
00:33:18.000 So this is a major spoiler.
00:33:20.000 Shakespeare wasn't a black woman.
00:33:22.000 Crazy.
00:33:22.000 Yeah.
00:33:23.000 I've got the book.
00:33:24.000 I'm reading the book now.
00:33:26.000 It is worse than you imagine.
00:33:27.000 Part of the evidence.
00:33:28.000 How could it be worse than I imagine?
00:33:29.000 Because, because.
00:33:30.000 It's obviously not true, first of all.
00:33:32.000 Of course.
00:33:32.000 She basically says in the book that it's important that it should be true, and therefore... And in fact, the book opens with a picture of Shakespeare as a black woman, which was drawn by the author.
00:33:46.000 Is it a good drawing?
00:33:47.000 No, it's okay.
00:33:49.000 I don't want to mock someone else.
00:33:52.000 If it's out, it's the first.
00:33:54.000 Oh, that's the book.
00:33:54.000 That's actually pretty good.
00:33:56.000 No, no, no.
00:33:56.000 No, no, that's not.
00:33:57.000 That's a black woman?
00:33:58.000 No, no, no.
00:33:59.000 That's a portrait of Emilia Lanier, who she says was Shakespeare.
00:34:02.000 And she says that the portraits at the time were whitened to disguise her blackness.
00:34:07.000 In the book itself, in the book itself, you won't be able to get in the book, I don't think, Jamie.
00:34:12.000 But in the book itself, there's a sketch that she's done.
00:34:15.000 So it's like, I can imagine a publisher saying, oh, what evidence have you got?
00:34:18.000 And she's like, oh, well, I'll go and draw it for you.
00:34:20.000 And that's sort of what she's doing.
00:34:21.000 Oh, she's black and Jewish?
00:34:23.000 Yeah, black, Jewish.
00:34:24.000 Well, actually, I mean, Emilia Lanier was part Moorish, but wasn't black, and she wasn't particularly dark-skinned.
00:34:30.000 And she was Jewish as well?
00:34:31.000 Yeah, part Jewish.
00:34:32.000 Okay, so who is this woman that they're saying actually was Shakespeare?
00:34:37.000 So she's called Emilia Lanier, or Emilia Bassano.
00:34:40.000 And one of the arguments is that Shakespeare at the time, if she was a woman, wouldn't have been able to get published because women couldn't get published.
00:34:47.000 But Emilia Lanier was published.
00:34:49.000 She had a book of poetry.
00:34:50.000 So all of this stuff falls apart like in two seconds flat.
00:34:54.000 And this is the best one.
00:34:56.000 She even says in the book that the word Shakespeare is an anagram of a she-speaker.
00:35:04.000 I'm not making that up.
00:35:07.000 That's what she says.
00:35:08.000 I mean, you know, listen.
00:35:11.000 What a cover-up.
00:35:12.000 How'd she crack this?
00:35:14.000 Well, actually, it's an old theory.
00:35:15.000 It's like a 20-year-old theory.
00:35:16.000 Is it really?
00:35:17.000 20 years old.
00:35:17.000 I tell you.
00:35:19.000 She's just sort of rehashing it now for this identitarian post-woke world where we're all like, we're desperate for Shakespeare to be a black woman.
00:35:26.000 And it's so funny.
00:35:27.000 It's so pathetic.
00:35:29.000 This was my first encounter with conspiracy theories because my background is, I did a doctorate in Shakespeare.
00:35:34.000 My background was teaching Shakespeare back in the day, like before I did comedy and before I did anything else.
00:35:38.000 And it was the conspiracy theorists around Shakespeare saying Shakespeare couldn't have written his work.
00:35:42.000 They are the most intense, the most angry, the most evidence-free cohort of people.
00:35:49.000 They're angrier than the woke.
00:35:51.000 Like I've tweeted, I've written stuff about Shakespeare online.
00:35:51.000 I promise you.
00:35:54.000 I recently did some lectures about Shakespeare for the Peterson Academy because I'm really into helping.
00:35:58.000 I love the Peterson Academy.
00:35:59.000 I love what they're doing.
00:36:00.000 And I did these Shakespeare lectures.
00:36:01.000 And the conspiracy theorists were on to me online saying, it wasn't Shakespeare.
00:36:05.000 The guy from Stratford didn't write this.
00:36:07.000 And what all these theories have in common is they've just made, there's no evidence.
00:36:12.000 There's no evidence.
00:36:13.000 The key point about Shakespeare is if you're going to say it wasn't the guy who everyone thought it was, you have to answer one key question.
00:36:18.000 Why does everyone who knew Shakespeare, wrote about Shakespeare, say that it was?
00:36:22.000 Can I stop you?
00:36:23.000 Because I'm confused.
00:36:24.000 I didn't even know that there was a conspiracy about Shakespeare.
00:36:27.000 Oh, wow.
00:36:28.000 Yeah, there's lots.
00:36:29.000 I had heard one person say that Shakespeare wasn't real and that it was really someone else's work that he plagiarized.
00:36:37.000 I had heard that.
00:36:37.000 Yeah.
00:36:38.000 But I never even bothered to fuck around with it.
00:36:41.000 Well, it actually came from America.
00:36:42.000 It's you guys.
00:36:43.000 It was.
00:36:43.000 Of course.
00:36:45.000 We're number one.
00:36:45.000 It was a guy called Looney.
00:36:46.000 A guy called Looney, actually, from America.
00:36:48.000 That's hilarious.
00:36:49.000 That's his name.
00:36:50.000 You got to listen to that guy.
00:36:52.000 So he, we're going back like 60, 70 years or something, but he came up with this idea that Shakespeare was actually an aristocrat called Edward de Vere, the Earl of Oxford.
00:36:59.000 Problem is, Edward De Vere died in 1604.
00:37:02.000 That's before Macbeth.
00:37:03.000 That's before Antony and Cleopatra.
00:37:04.000 That's before Coriolanus.
00:37:06.000 That's before The Tempest.
00:37:07.000 So he managed to, I think they get around it by saying he wrote these plays and then he died.
00:37:14.000 Shakespeare found them.
00:37:15.000 Or something.
00:37:16.000 Yeah, so even though some of those plays actually have cultural references from the time after De Vere died, but it doesn't matter.
00:37:22.000 Maybe he was a prophet as well.
00:37:24.000 But all of the guys, you speak to these people, you'll see what I mean.
00:37:27.000 Edward De Vere, they think, some people think it was Francis Bacon.
00:37:30.000 Some people think it was Christopher Marlowe.
00:37:32.000 Some people think it was Elizabeth I. Like all of the candidates they put up, right?
00:37:36.000 The key thing is they're all aristocrats.
00:37:38.000 They're all posh.
00:37:39.000 Because Shakespeare was a middle-class, lower-middle-class, not very rich, didn't go to university, came from the Midlands, you know, up-and-coming guy, and they say, well, how could someone like that write about kings and lords and ladies?
00:37:39.000 Why?
00:37:52.000 It's snobbery.
00:37:53.000 They're basically saying working class people can't do art.
00:37:57.000 I mean, really, that's what it is.
00:37:59.000 Otherwise, they wouldn't be going after all these aristocrats.
00:38:02.000 And it's the opposite in America, oddly.
00:38:05.000 Is it?
00:38:05.000 Yeah.
00:38:06.000 So if you were a Rockefeller in America, you're from the Rockefeller family and you wrote an amazing novel, no one would believe it.
00:38:12.000 Right.
00:38:13.000 So it's like, okay, there has to be like some guy who or some woman who's like grinding, drinking coffee and smoking cigarettes alone in their apartment to write something that's brilliant.
00:38:13.000 Okay.
00:38:23.000 So I wonder what it is about the UK.
00:38:25.000 Well, although, like I say, a lot of it comes from America.
00:38:27.000 And is it just the need to tear down an icon?
00:38:31.000 Is it that?
00:38:32.000 Is it, I mean, I get it now with this woman who's saying Shakespeare was a black woman.
00:38:37.000 I get that at the moment because we're in this moment of identitarian group identity mania, right?
00:38:41.000 So that makes sense.
00:38:42.000 She's got a political reason why she wants it to be a black woman.
00:38:45.000 So I kind of understand that more.
00:38:47.000 But what is it?
00:38:47.000 I think it might be more to do with the idea that this guy changed civilization, changed literature.
00:38:52.000 No one else has achieved what he achieved in writing.
00:38:55.000 He's up there with Michelangelo, Bach, you know, all of that.
00:38:58.000 Let's tear that down.
00:38:59.000 Let's tear down Western civilization.
00:39:00.000 Let's say none of this is based on anything.
00:39:03.000 This is all untrue.
00:39:05.000 I think it's to do with the innate iconoclasm, that innate, you know, just tearing down the great things about our culture.
00:39:12.000 That's always been the case.
00:39:12.000 For sure.
00:39:14.000 And people always want to tear down idols.
00:39:16.000 They want to tear down, you know, whoever it is, no matter what.
00:39:19.000 I was watching this video we were talking about the other day of this woman talked about how the Beatles were terrible.
00:39:24.000 And this woman was not very articulate, not particularly interesting.
00:39:28.000 Doesn't seem that compelling.
00:39:30.000 And she was going on and on about how bad the Beatles were.
00:39:32.000 I'm like, you're not going to convince anyone.
00:39:34.000 That's just not going to work.
00:39:36.000 But people were going to fucking try.
00:39:37.000 They're going to try no matter what, no matter who it is.
00:39:40.000 Hendrix sucked.
00:39:40.000 I've heard that before.
00:39:41.000 Oh, really?
00:39:42.000 Hendrix sucked.
00:39:43.000 Stop.
00:39:44.000 But at least that's based on an opinion, right?
00:39:47.000 There's a difference between saying Jimi Hendrix sucked and Jimi Hendrix was actually a woman from Liverpool called Maude.
00:39:52.000 Well, you know the theory about Jimi Hendrix in America.
00:39:55.000 Do you know that?
00:39:56.000 No.
00:39:57.000 So it's the people that are like deep into the CIA and CIA conspiracies.
00:39:57.000 Okay.
00:40:04.000 And what is it called?
00:40:05.000 Strange Tales from the Canyon?
00:40:06.000 Is that what it's called?
00:40:07.000 The book?
00:40:09.000 So there's a book on the bizarre connection between a lot of the countercultural figures of the 1960s and the intelligence community.
00:40:19.000 One of them is Jim Morrison's father was like a high-ranking military officer.
00:40:24.000 And then there's different people from different bands that were like a key part of the countercultural movement that all have parents that were either in intelligence communities or closely connected to it.
00:40:37.000 It's called Weird Scenes Inside the Canyon.
00:40:40.000 It's a crazy book.
00:40:42.000 It's kind of fun.
00:40:42.000 It's fun.
00:40:43.000 Is it crazy as in like the revelations are crazy or that it's just not true?
00:40:47.000 Well, they make some broad leaps, right?
00:40:51.000 So there's a lot of, and then a year later, he died in mysterious circumstances, or a year later, he died from suicide, or a year later, he died from an overdose.
00:40:58.000 You know, I'm like, well, okay.
00:40:59.000 You're hanging out with a bunch of people that are doing drugs all the time, and they're all ne'er-do-wells, and they're all hanging out in Laurel Canyon.
00:41:06.000 If you don't know Laurel Canyon, Laurel Canyon, at least at the time, I mean, when I first moved to Hollywood, it's like all the weirdos would live in Laurel Canyon.
00:41:15.000 Like, all the weirdos were like right there above Hollywood, and there was all these crazy parties up there.
00:41:15.000 Right.
00:41:21.000 It was like Laurel Canyon was nuts.
00:41:22.000 And they all knew each other, right?
00:41:23.000 So they're all part of that circle.
00:41:25.000 So, I mean, this was like when I moved there in the 90s, this was the case.
00:41:29.000 My friend Dave Foley had a house up there.
00:41:32.000 And it was like all these kooky people.
00:41:32.000 Right.
00:41:34.000 And he's telling me about all these kooky parties and all this different shit.
00:41:36.000 It was like Laurel Canyon was always like kind of, so of course a bunch of people are going to die.
00:41:41.000 Of course a bunch of people are going to be connected to bands and different counterculture movements.
00:41:46.000 The theory is that the CIA sort of engineered this culture to, I don't know why.
00:41:56.000 I'm not exactly sure because I haven't gotten all the way through the book.
00:41:59.000 I'm like, only look at how.
00:42:00.000 You're still reading it.
00:42:01.000 No, I pick it up every now and then.
00:42:03.000 It's just like, it's too kooky.
00:42:05.000 It's not grabbing you.
00:42:07.000 Well, you can't make Jimi Hendrix in a lab, okay?
00:42:13.000 You can't.
00:42:13.000 It's just you can't fucking do it.
00:42:16.000 You can't make someone that good.
00:42:17.000 It's not possible.
00:42:19.000 You can't tell me that if they did, why haven't they done it since?
00:42:22.000 Why don't they do it all the time?
00:42:23.000 Right.
00:42:24.000 The greatest guitarist of all time.
00:42:27.000 And you're telling me the Central Intelligence cooked that guy up?
00:42:29.000 So they invented him.
00:42:30.000 Like, he's like their clone or something.
00:42:33.000 Just think that they had some sort of an influence on these people, on Jim Morrison.
00:42:39.000 Like, there's a thing about Morrison, the Morrison one.
00:42:42.000 Like, what is the connection between Jim Morrison's dad and the intelligence agencies?
00:42:47.000 There's some like tangible connection with Jim Morrison's dad.
00:42:50.000 But wouldn't you just normally assume that if your dad was some high-ranking military guy, first of all, never home.
00:42:57.000 Okay, so where are you?
00:42:57.000 Yeah.
00:42:58.000 You're out running around with your friends, smoking cigarettes and fucking drinking, and you're in a band, and it turns out you got a lot of angst and pain because you're being neglected as a child because your dad works 16 hours a day trying to fuck the country over.
00:43:12.000 And so what do you do?
00:43:14.000 You go counterculture.
00:43:15.000 It's like it's so common.
00:43:16.000 The preacher's daughter, she becomes like a harlot, right?
00:43:19.000 So there you're a high-ranking U.S. officer with, yeah.
00:43:21.000 Right.
00:43:22.000 But that is, okay, but again, like this is a perfect example.
00:43:24.000 Wow, he's involved in the Gulf of Tonkin incident.
00:43:27.000 Whoa.
00:43:28.000 But that's not proof of anything.
00:43:29.000 No, But his dad is.
00:43:32.000 Yeah, but you know, but this is the thing.
00:43:33.000 They'll take something like that.
00:43:35.000 They'll take various strands of coincidences and they say this leads us to this conclusion.
00:43:38.000 But all they're doing is coming up with a conclusion first and working backwards.
00:43:42.000 Like this sort of stuff, you see it again and again.
00:43:46.000 So this is how this connects with intelligence agencies.
00:43:48.000 McGowan, I guess that's the author.
00:43:51.000 Core move is to group Morrison's father with other Laurel Canyon musicians' parents who worked in military, defense, or intelligence-linked roles and to frame this as evidence of a broader covert program around the 1960s rock scene.
00:44:05.000 Come on.
00:44:06.000 So are you saying that the CIA were trying to influence the culture through the medium of rock music?
00:44:06.000 Yeah.
00:44:12.000 Uh-huh.
00:44:13.000 And that's somehow tied to espionage.
00:44:15.000 They also have that film studio.
00:44:19.000 What's that?
00:44:20.000 Jared Leto bought that place.
00:44:20.000 What?
00:44:21.000 There was a film studio in Laurel Canyon, too.
00:44:24.000 Oh, well, it's a base.
00:44:25.000 It's an actual base.
00:44:26.000 Yeah, Jared Leto.
00:44:27.000 A lot of films were.
00:44:28.000 I was talking to Jared about that.
00:44:30.000 I had dinner with Jared Leto one night.
00:44:32.000 He's very cool, by the way.
00:44:33.000 Very, really nice guy.
00:44:34.000 Very normal.
00:44:35.000 And by the way, he looks like he's 30.
00:44:37.000 He's 50 years old.
00:44:38.000 It's crazy.
00:44:39.000 What are you doing with your fucking skincare?
00:44:39.000 Moisturizer.
00:44:41.000 You look great.
00:44:42.000 Lookout Mountain Laboratory.
00:44:44.000 So he bought that place and converted it into a home.
00:44:48.000 That's where he lives.
00:44:49.000 It's a dope spot.
00:44:51.000 Soundstage.
00:44:51.000 Looks quite nice.
00:44:52.000 Soundstage film laboratory, two screening rooms, four editing rooms, an animation and still photo department.
00:44:57.000 Sound mixing studio, numerous climate-controlled film vaults.
00:45:01.000 And this is connected to the conspiracy somehow.
00:45:03.000 Well, this was an actual military base.
00:45:05.000 Located in that same neighborhood.
00:45:06.000 Okay.
00:45:07.000 So this Air Force station, whatever it was, I wonder what they were doing.
00:45:11.000 Like, why did they need all that film capability?
00:45:15.000 Why do they need to be?
00:45:16.000 In theory, I guess, like when they would show the atomic bombs going off and would play it in the movie theater for people to see it.
00:45:22.000 That's like that's how they would make the actual reels and whatnot.
00:45:25.000 Well, that makes sense.
00:45:26.000 Right?
00:45:27.000 It makes sense that they were right there in Hollywood if that's what they were doing.
00:45:30.000 On top, I don't know what other things they made.
00:45:32.000 So like, here's the still from it.
00:45:35.000 Lookout Mountain Laboratory.
00:45:35.000 Oh, okay.
00:45:36.000 So it's just a studio then.
00:45:37.000 Especially.
00:45:38.000 But it's in that same neighborhood at the same time.
00:45:40.000 Yeah, but so what?
00:45:41.000 I mean, I think with all of it.
00:45:44.000 He's not arguing for it.
00:45:46.000 God damn it, Jamie.
00:45:47.000 The so-what of it is that there weren't that many of them to begin with, and they just all happen to be in this one neighborhood.
00:45:51.000 But you know what I think with all of this stuff again and again, the pattern is either there's gaps, there's gaps in what we know, and people decide to fill them in themselves because there's a kind of comfort to that.
00:46:02.000 There's also some kind of comfort with, I know something that no one else does.
00:46:05.000 I've got the answer.
00:46:06.000 There's a status element to that.
00:46:08.000 I remember, I read a book when I was a kid, like teenager, called The Sacred Virgin and the Holy Whore.
00:46:13.000 And it was about the sort of books I read.
00:46:15.000 And it was about Jesus, and it was trying to prove that Jesus was a woman.
00:46:19.000 And as you're reading it, you're thinking, yeah, oh, yeah, Jesus is a woman.
00:46:24.000 I can't believe I fell for it.
00:46:24.000 Look at that.
00:46:25.000 And then you get to the end and you think, what the hell did I just read?
00:46:27.000 And it's that thing of you can marshal any kind of half-baked fact or any, you can marshal certain things that we can see and fill in the gaps yourself and lead to a crazy conclusion.
00:46:38.000 What concerns me isn't so much that people do that because people have done that forever, as long as they've been human beings, is that now people are leaping at it and falling for it in a way that I haven't seen.
00:46:49.000 Maybe it is just social media, right?
00:46:51.000 Can I give you an example of this?
00:46:51.000 It is.
00:46:52.000 Yeah, please.
00:46:53.000 A recent one, which I just thought was nuts.
00:46:55.000 Did you see the portrait of King Charles III by an artist?
00:46:59.000 I think his name is Yeo, Y-E-O.
00:47:02.000 It's a big red portrait, which currently hangs in Buckingham Palace.
00:47:05.000 Oh, I have seen that.
00:47:06.000 It's crazy.
00:47:07.000 If you take a quarter of it, invert it, flip it, add a bit, and squint, it looks like a goat devil, right?
00:47:14.000 But you have to do a lot of steps to find the goat devil.
00:47:16.000 Well, of course.
00:47:17.000 And you could puzzle.
00:47:18.000 You could probably.
00:47:19.000 How dare you?
00:47:20.000 I'm sorry.
00:47:21.000 How dare you dismiss that puzzle?
00:47:22.000 Let's show the photo and show how it's done because it's fun.
00:47:26.000 Oh, yeah, there we go.
00:47:26.000 Can you see the goat?
00:47:27.000 So can you see the goat devil?
00:47:28.000 First of all, just the photo by itself.
00:47:30.000 Like, hey, man, what the fuck are you doing?
00:47:33.000 Oh, it's a creepy picture.
00:47:34.000 Why am I splattered in blood?
00:47:36.000 I've seen it in the flesh.
00:47:37.000 It's a creepy.
00:47:38.000 One thing if he did that in all white, it was an all-white background.
00:47:42.000 That would be one thing.
00:47:43.000 Like, oh, that's kind of an interesting look or, you know, pastel.
00:47:46.000 So what are you saying?
00:47:47.000 Are you already suspicious?
00:47:48.000 Is that what you're saying?
00:47:49.000 Well, the photo's nuts.
00:47:51.000 Like, the painting is nuts.
00:47:52.000 Where's the goat?
00:47:53.000 So all you have to do is put it together side by side.
00:47:55.000 You don't have to do that much.
00:47:57.000 You exaggerated how much you have to do.
00:47:58.000 No, I saw a video that's going to be a little bit of a dungeon.
00:48:00.000 Trust me.
00:48:00.000 Look at it upside down.
00:48:01.000 Oh, no, look.
00:48:02.000 Well, the other way I found a goat.
00:48:04.000 Put it back.
00:48:05.000 Put it back.
00:48:05.000 Wait a minute.
00:48:07.000 I can completely see the goat now.
00:48:09.000 That's 100% a goat.
00:48:10.000 They did it on purpose.
00:48:11.000 It's a sign.
00:48:12.000 That is.
00:48:13.000 That's a sign.
00:48:14.000 Go back to the other one, though.
00:48:15.000 Click on that one.
00:48:16.000 I see a goat there.
00:48:17.000 I see some evil demon.
00:48:18.000 Look at two eyeballs.
00:48:19.000 Yeah, yeah, yeah, bro.
00:48:20.000 Where?
00:48:21.000 100%.
00:48:22.000 Stop trying to gaslight me.
00:48:22.000 Stop.
00:48:24.000 I see a monster.
00:48:27.000 Oh, well, I mean.
00:48:28.000 You can find something in everything, man.
00:48:30.000 It looks all superimposed.
00:48:31.000 I mean, you see.
00:48:32.000 I can see Martha Stewart in that.
00:48:34.000 The Virgin Mary in a grilled cheese sandwich.
00:48:36.000 You can see it in the clouds and the rocks.
00:48:38.000 There's a term for this where our brains look for patterns and things.
00:48:41.000 I had a conversation once with a friend of mine that I didn't know was going crazy.
00:48:45.000 And he goes, hey, you want to see something crazy?
00:48:45.000 Right.
00:48:48.000 And he pulls out his phone and he shows me a cloud.
00:48:52.000 And I go, what is that?
00:48:52.000 Yeah.
00:48:53.000 He goes, dude, I'm seeing this all day.
00:48:55.000 And he shows me some other ones.
00:48:56.000 He's got like hundreds of photos of clouds on his phone.
00:49:00.000 I go, what are you seeing?
00:49:01.000 He said, these are UFOs.
00:49:02.000 He goes, these are spaceships.
00:49:04.000 This is not a regular cloud.
00:49:06.000 And I'm looking at the photos.
00:49:07.000 Like, he's just been taking pictures of clouds all day.
00:49:10.000 And I realize, oh, my God, my friend is going schizophrenic.
00:49:13.000 I didn't know him well.
00:49:15.000 So he's there.
00:49:15.000 He's his friend?
00:49:16.000 Yeah.
00:49:17.000 The more I talked to him, the more I realized there was something cracked.
00:49:17.000 Okay, okay.
00:49:21.000 Like, this is a guy I hadn't seen in like maybe seven or eight years.
00:49:24.000 And I ran into him at a comedy club and he was just showing me photos of clouds on his phone.
00:49:28.000 I was like, during the conversation, I realized, oh, he cracked.
00:49:33.000 But aren't you concerned that that kind of thing is now kind of common?
00:49:36.000 Like that from people who aren't necessarily unwell, people who are just seeing stuff.
00:49:41.000 Well, it's fun.
00:49:42.000 I think it's exciting.
00:49:43.000 It's exciting for people to uncover information that the general public is ignorant of.
00:49:43.000 Yeah, yeah, yeah.
00:49:49.000 And so here's the thing about the Laurel Canyon thing.
00:49:53.000 There's enough of the CIA meddling in cultural events that's absolutely true and provable.
00:50:00.000 And that's MKUltra.
00:50:01.000 And that's what they did with Charles Manson.
00:50:03.000 And that's the book Chaos by Tom O'Neill, which is a brilliant book, which is very well documented and details Jolly West and his influence on the Manson family and how they were influencing these people to try to sabotage the hippie movement.
00:50:16.000 So the hippie movement was this change in culture where all of a sudden people were rejecting the war movement.
00:50:21.000 They were rejecting, you know, they were free love and they were doing acid and people were freaking out.
00:50:26.000 Their kids were just disappearing and following the Grateful Dead around.
00:50:30.000 And they took this guy, Charles Manson, this very charismatic con man.
00:50:35.000 They taught him how to dose people up with acid and influence them, and they got them to commit murders.
00:50:41.000 But there is evidence for this, right?
00:50:43.000 So you're talking about a book that is researched.
00:50:46.000 But you're being logical, and you're correct.
00:50:51.000 But what I'm saying is, because of that, people go, well, what else?
00:50:56.000 And so then they make these big leaps, like Jimi Hendrix is a CIA creation.
00:51:01.000 Right.
00:51:02.000 But if you're a logical person, you just listen to Voodoo Child (Slight Return) and you're like, how?
00:51:08.000 How?
00:51:09.000 This is like, if that's true, CIA should get back to work.
00:51:12.000 Make another one of those, bro.
00:51:14.000 So I wonder whether this is, this is, I think this is the fallout of the woke movement.
00:51:18.000 This is the divorcing of reality and truth.
00:51:22.000 The idea that it doesn't matter, not just about what is expedient, but what we want to believe.
00:51:26.000 I've got friendships.
00:51:27.000 I think we should stop saying it's the fallout of the woke movement.
00:51:30.000 I think we should start saying it's a natural pattern that human beings automatically fall into in order to support their belief systems and enforce their particular ideology over whatever opposing ideology is.
00:51:45.000 But it's escalated.
00:51:46.000 It escalates, but it's because of social media that everything is escalating now.
00:51:50.000 But is it just social media?
00:51:51.000 I mean, I think another thing that's a major reason for it, we had COVID.
00:51:55.000 We had all these experts telling us it's a racist conspiracy theory to say that it came from a lab in Wuhan.
00:52:02.000 Now everyone knows that's almost certainly true.
00:52:04.000 We had people in positions of authority lying to us.
00:52:07.000 So it's something about this culture war that we're talking about.
00:52:10.000 But that's not real culture war.
00:52:12.000 That was using the culture war because they were trying to cover something up.
00:52:17.000 But they leapt to race, didn't they?
00:52:19.000 They leapt to race.
00:52:20.000 And because they were using the culture war to cover up their crime.
00:52:24.000 So if that's, but in either case, what you've got effectively is a legitimation crisis.
00:52:28.000 You've got people in charge.
00:52:30.000 We've been lied to so often.
00:52:32.000 But what I don't think you should therefore do, like I'm all for being skeptical about people in authority, academics, politicians, journalists, they've all lied.
00:52:40.000 But that firstly doesn't mean that all experts and all journalists and all people have lied because there have been some good ones all the way.
00:52:45.000 But also that doesn't mean that you automatically leap to any conclusion, evidence-free, that jumps before you without some kind of critical analysis.
00:52:55.000 The same thing that you're criticizing those people for failing at, you're falling into the same trap yourself.
00:52:59.000 I don't mean you.
00:53:00.000 But you're Andrew Doyle.
00:53:02.000 You're a brilliant guy who writes books and you're really smart.
00:53:05.000 The idea is that you are immune to this stuff because you're intelligent, but the unwashed masses are not.
00:53:13.000 I don't think I'm immune at all.
00:53:16.000 I honestly don't.
00:53:16.000 I wouldn't put myself in the middle of the morning.
00:53:17.000 Well, you're immune to the dumbest shit.
00:53:19.000 I'd like to think so.
00:53:20.000 You are.
00:53:22.000 I am.
00:53:22.000 Yeah, but don't you think that all of us in the right circumstances could end up falling for it? 100%, but I'm not in those circumstances currently.
00:53:28.000 But I like to believe, and maybe it's a naivety on my part, but I like to believe that most people have a kind of natural intellectual curiosity.
00:53:38.000 If they stop for a moment and think and don't just trust instinct over reason, I think we're all capable of it.
00:53:46.000 I just think we're not all realizing it.
00:53:48.000 Well, it's not just that.
00:53:49.000 It's like some people are medicated, right?
00:53:51.000 So some people are on a bunch of different medications that dull their senses.
00:53:56.000 And then you've got people that have gotten to wherever they are in life.
00:54:00.000 Maybe they're in their 50s.
00:54:02.000 And they're set in their ways and they have no desire to change at all.
00:54:06.000 And so they've been living a dumb life for 50 plus years.
00:54:10.000 You can't all of a sudden say, hey, Mark, I want you to be logical and introspective and think about this thing and analyze it and for what it really is.
00:54:17.000 Instead of holding on to your ideological beliefs that you've kind of locked yourself into and you identify with and any attacks on those is an attack on you personally, I want you to just, let's look at the facts.
00:54:28.000 This episode is sponsored by BetterHelp.
00:54:31.000 Look, there's a lot of pressure when it comes to dating, especially in February.
00:54:35.000 But you're putting too much on yourself and on your partner.
00:54:39.000 There's no such thing as a perfect relationship, whether you're on a first date or you've been together for years.
00:54:45.000 It's completely normal to go through rough patches.
00:54:48.000 And what matters is how you deal with them.
00:54:50.000 And therapy can be a huge help during any stage of your dating life.
00:54:53.000 You can figure out what you want in a partner or get perspective for a growing problem in your relationship.
00:54:59.000 The point is, you don't have to come up with a solution by yourself.
00:55:03.000 Now, finding the right therapist can be tricky, but that's where BetterHelp comes in.
00:55:07.000 They have an industry-leading match fulfillment rate, and they do a lot of the work finding the right therapist for you.
00:55:14.000 Really, all you have to do is fill out a questionnaire and sit back and wait.
00:55:18.000 Tackle your relationship goals this month with BetterHelp.
00:55:22.000 Sign up and get 10% off at betterhelp.com/JRE.
00:55:27.000 That's better, H-E-L-P dot com slash J-R-E.
00:55:32.000 But you saying that sounds very persuasive to me.
00:55:34.000 Like the way you put that.
00:55:35.000 Like if I were that guy, I'd be like, oh, I've listened to Joe now.
00:55:38.000 Fucking weird.
00:55:40.000 Fucking liberals, bullshit, with your fucking... you're just a fucking... They'll come up with some sort of, King Charles III is a goat. Yeah. You're controlled opposition, or you're a useful idiot, or they'll put a label on you.
00:55:54.000 I've been called, I've been told I get dark money.
00:55:56.000 Oh.
00:55:57.000 How do you get any of that?
00:55:58.000 Well, I love it.
00:55:59.000 I want it.
00:56:00.000 I want the dark money.
00:56:00.000 It's so dark I haven't seen any of it.
00:56:02.000 That's how dark it is.
00:56:03.000 What is dark money?
00:56:04.000 I think it's when it's like some rich ideologue who's sort of slipping you money to say the thing.
00:56:08.000 You know what it is?
00:56:09.000 It's that thing of, I don't believe that you disagree with me.
00:56:11.000 I'm too narcissistic to believe that you disagree with me.
00:56:13.000 You must be being paid off.
00:56:17.000 Trust me, I would love that.
00:56:18.000 If anyone's out there who wants to pay me off, I'll be a mouthpiece.
00:56:22.000 I haven't had that opportunity.
00:56:25.000 It's pretty low.
00:56:27.000 I'm a bit of a whore, truth be told.
00:56:30.000 I've got a mortgage.
00:56:31.000 Come on.
00:56:32.000 I will say any crazy shit if you want me to.
00:56:34.000 Well, there's certainly a lot of people that fall into that category, too.
00:56:37.000 So people do get nervous about it.
00:56:39.000 I mean, obviously you're joking, but there's a lot of people that will change their opinion if money comes their way.
00:56:44.000 But I like to assume people mean what they say.
00:56:48.000 And my logic behind that is even when they don't, you can still dismantle the argument, even if it's authentic or not.
00:56:54.000 You know, even if it's authentically believed.
00:56:56.000 So I think that's just the best way to go about it.
00:56:56.000 Sure.
00:56:59.000 The best way is debate.
00:57:01.000 It's the best way.
00:57:02.000 Or at least conversation.
00:57:04.000 But that's what we've lost.
00:57:05.000 So I think that hits on it, actually, because I said debate, but that sounds formal.
00:57:09.000 No, I know what you mean.
00:57:10.000 You mean so?
00:57:13.000 Can I give you an example of that?
00:57:15.000 So I went to UC Berkeley, the University of California, Berkeley.
00:57:21.000 They let you leave?
00:57:22.000 Well, almost not, right?
00:57:23.000 So I, what had happened was, you know, Charlie Kirk's tour was planned to go all the way through, and this was the last date, the Berkeley date.
00:57:31.000 And after his assassination, various people went and did the shows because they said, because Turning Point rightly said, we're not going to give an assassin the veto of our tour.
00:57:40.000 We finished the tour.
00:57:42.000 And Rob Schneider, who I've been working with in Arizona, I've come over here to work with him.
00:57:46.000 The comedian?
00:57:47.000 Yeah.
00:57:47.000 So I've been, this is how I escaped from the UK, I should say.
00:57:51.000 So me and Graham Linehan, who you've had on your show, comedy writer, my comedy writing partner and friend Martin Gourlay, the three of us, we decided that things were so bad in the UK, we'd rather write and do creative stuff in America.
00:58:06.000 Rob Schneider, who I'd met many years ago, he said, come on over.
00:58:08.000 We'll set up a production company.
00:58:09.000 We've been working in Arizona on all these various projects.
00:58:12.000 It's so liberating.
00:58:13.000 And also, it's the middle of the desert, so I fucking love the heat.
00:58:16.000 And you go from England to that.
00:58:17.000 It's kind of exciting.
00:58:18.000 Nice contrast.
00:58:19.000 So we've been able to, you know, we, and look, I don't want to do down the UK or say, but what I will just say is the creative industries there are pretty stagnant.
00:58:28.000 They're not like here.
00:58:29.000 There's so many more ways.
00:58:30.000 Can you be free?
00:58:32.000 How can you, if you are worried about going to jail for a meme?
00:58:36.000 Well, Graham got arrested at the end by five armed officers.
00:58:40.000 Right after he left this podcast.
00:58:42.000 Was that it?
00:58:42.000 Yes.
00:58:43.000 And it was.
00:58:44.000 He came over, did this podcast, went back to visit his family and got arrested.
00:58:49.000 And you know what?
00:58:49.000 Shortly after he did the podcast.
00:58:51.000 So when people say to me, that's not a real problem, that, I mean, Graham had done three tweets.
00:58:57.000 One of them was just, they were all joke tweets, by the way.
00:59:00.000 They were all jokes.
00:59:01.000 And one of them was just, it was something like, ladies, if a guy's in your changing room or in your bathroom, scream, make a fuss, call the police.
00:59:08.000 If all else fails, kick him in the balls.
00:59:10.000 And it's obviously a wry way of saying, look, the guy's got genitals, the guy's...
00:59:14.000 That was why he got arrested.
00:59:16.000 On the night he got arrested, he was texting me.
00:59:18.000 He said, I've just been arrested.
00:59:19.000 I've been taken to the hospital because my blood pressure is so high.
00:59:22.000 The police took him to the hospital.
00:59:25.000 And you say there's no problem in the UK with creativity.
00:59:28.000 He's one of our best comedy writers.
00:59:30.000 He's the most beloved comedy writer.
00:59:31.000 He hasn't been able to work in TV for six years, right?
00:59:35.000 Like, he's won all the awards going.
00:59:38.000 And so we just kind of...
00:59:39.000 How can you be creative in that environment?
00:59:41.000 You can't.
00:59:41.000 You can't.
00:59:42.000 And so we just figured, let's get on a raft.
00:59:45.000 Especially someone like you.
00:59:46.000 So if people don't know, I should probably tell everybody.
00:59:49.000 You are Titania McGrath.
00:59:51.000 So, yeah.
00:59:52.000 Well, here's what's funny about that.
00:59:54.000 Your satirical character who you created many, many years ago, when did you create her?
00:59:59.000 2018.
01:00:00.000 Okay.
01:00:01.000 When you created her, I had you on the podcast shortly after.
01:00:04.000 We laughed about it.
01:00:06.000 I have seen her, quote, tweeted with people agreeing with her.
01:00:11.000 Yeah.
01:00:11.000 Yeah, even now.
01:00:13.000 Yeah.
01:00:13.000 All the time.
01:00:15.000 So I, yeah, if people don't know, it's a character called Titania McGrath.
01:00:19.000 She's a woke social justice warrior, right?
01:00:22.000 She's so good.
01:00:23.000 It's fucking great.
01:00:25.000 It's one of my favorite follows.
01:00:27.000 But you know, I don't do it as often as I used to.
01:00:29.000 You know, I used to do it all the time.
01:00:30.000 But then I wrote two books as her.
01:00:32.000 I did a live show as her.
01:00:34.000 By the way, when I did a live show, we were booked in for a week in the West End in London.
01:00:39.000 And then the head of the theater found out and scotched it.
01:00:42.000 And actually said, oh, well, I didn't know about this.
01:00:45.000 And the contracts are all signed.
01:00:46.000 Absolutely crazy.
01:00:47.000 Anyway, it doesn't matter.
01:00:48.000 But we did the show.
01:00:49.000 Well, it does matter, I suppose.
01:00:51.000 But the point is that, you know, so I did the character.
01:00:53.000 Do you have satire at your theater?
01:00:56.000 My God.
01:00:57.000 The theater industry in the UK is even worse than comedy if you want to go there.
01:01:01.000 It's really, really bad.
01:01:02.000 But like I've been in two different theatres in London, I've had the same experience of standing at the bar with a woman complaining because there's men pissing in her toilet and they're doing nothing about it.
01:01:14.000 Because all the theaters in London have made it all gender neutral.
01:01:16.000 They've gone completely, completely hardcore.
01:01:19.000 Anyway, that's not the point.
01:01:20.000 But with Titania, what I find so surprising is every now and then if something annoys me, I'll tweet.
01:01:25.000 Or if I think of something, I'll do.
01:01:27.000 I don't do it anywhere near as often as I used to.
01:01:28.000 But even now, I did a tweet about, you know, when all the people in London were marching about the peace deal in the Middle East?
01:01:36.000 And I did a tweet as her saying, I've been marching all day.
01:01:39.000 You know, I want a peace deal that was not arranged by Donald Trump.
01:01:43.000 We're never going to give up this fight, right?
01:01:45.000 And Ted Cruz retweeted it saying, can this be real?
01:01:50.000 Even now, these fucking boomers.
01:01:54.000 He's not even a boomer.
01:01:55.000 He's not even a boomer.
01:01:56.000 I think he's younger than me.
01:01:57.000 How old's Ted Cruz?
01:01:58.000 I think he's younger than me, which is hilarious.
01:02:00.000 I had the same with, I did one about.
01:02:02.000 How does he not know?
01:02:02.000 Does he have no friends?
01:02:03.000 How old is fucking Ted Cruz?
01:02:05.000 Okay, that's crazy.
01:02:05.000 55.
01:02:06.000 So that dude's three years younger than me, and he doesn't know satire?
01:02:12.000 The anger I got from, I did one the other day, or recently, about the Iran protests.
01:02:16.000 When did he send you?
01:02:17.000 Yeah, yeah, yeah.
01:02:17.000 Can I just stop?
01:02:18.000 I want to get into this.
01:02:18.000 Okay.
01:02:19.000 When did he tweet about this?
01:02:22.000 That's hilarious.
01:02:23.000 Yeah.
01:02:24.000 That account.
01:02:25.000 There it is.
01:02:25.000 There it is.
01:02:26.000 How many followers does Tattoo?
01:02:28.000 Okay.
01:02:28.000 So this.
01:02:28.000 Oh, sorry.
01:02:29.000 This is possibly real.
01:02:31.000 No.
01:02:31.000 Well, obviously it's not.
01:02:32.000 So this was actually after Trump's election.
01:02:34.000 So she said, I just fired my immigrant housekeeper because even though I'd educated her about the evils of Donald Trump, she still voted for him.
01:02:41.000 There's no place for racism in my house.
01:02:43.000 Click on your account.
01:02:44.000 I want to see how many followers you have.
01:02:47.000 Okay.
01:02:48.000 733,000.
01:02:49.000 That's a famous account.
01:02:51.000 Like, radical intersectionalist poet, non-white (obviously white), eco-sexual, hilarious, pronouns variable, selfless and brave, buy my books.
01:03:04.000 You'd think it was obvious, wouldn't you?
01:03:06.000 Obvious!
01:03:07.000 I mean, maybe he's busy.
01:03:09.000 Maybe he's busy and someone sent him that and he just doesn't know.
01:03:13.000 But it's very funny.
01:03:15.000 It's very funny.
01:03:16.000 I feel slightly bad about those sort of things.
01:03:18.000 But then on the other hand, it does sort of prove the point that the stuff they're really saying can get us close to.
01:03:25.000 But that's very close to real.
01:03:27.000 It's very close to real.
01:03:27.000 Yeah.
01:03:28.000 And it's shifted radically since 2018.
01:03:32.000 I mean, in the eight years since you created her, she has become more real.
01:03:38.000 I was like, AI is going to turn her into a real person.
01:03:38.000 Yeah.
01:03:41.000 Like, oh, oh, maybe.
01:03:41.000 Yeah.
01:03:43.000 I hadn't even thought that.
01:03:44.000 She's going to be a real person.
01:03:45.000 It's going to be a real dangerous Greta Thunberg type character.
01:03:48.000 But don't you worry about that?
01:03:49.000 I mean, like, AI.
01:03:50.000 Oh, a good example of that.
01:03:51.000 I was just, I use AI mostly as a search engine.
01:03:54.000 Because what's great about it is you can say, oh, I read an article like 10 years ago that said something like this.
01:03:58.000 Yes.
01:03:59.000 And it'll find it.
01:03:59.000 And you'd never find that on Google.
01:04:01.000 And I was trying to find this article.
01:04:01.000 Right.
01:04:02.000 It was from my book, actually.
01:04:04.000 There was a case in the UK where a guy had raped a 13-year-old girl.
01:04:09.000 But because he was Muslim and he'd gone to a madrasa and the judge let him off jail time, said you were very sexually naive.
01:04:16.000 You didn't understand.
01:04:17.000 The guy was saying, oh, I thought women were nothing.
01:04:20.000 And like a lollipop you dropped on the floor.
01:04:21.000 And the judge let him off jail time.
01:04:22.000 And I thought, this is quite extreme.
01:04:24.000 And I found it.
01:04:26.000 It came up on ChatGPT and then it deleted.
01:04:28.000 And I said, oh, I think you just deleted the information for me.
01:04:31.000 It's in the public domain.
01:04:32.000 Why did you do that?
01:04:33.000 It said, oh, you know, it's fine.
01:04:35.000 It might violate my terms of service.
01:04:37.000 And I said, well, how could it?
01:04:38.000 This is an article that's in the public domain.
01:04:39.000 So it gave me the information again, deleted it again.
01:04:42.000 I said, you keep deleting this.
01:04:43.000 Stop it.
01:04:44.000 It said, I definitely won't delete it.
01:04:45.000 Then it did the same again.
01:04:46.000 So what it's doing is it's saying, because this is a news story that could be deemed anti-immigrant, or this is a news story that is politically sensitive, I'm not going to let you see it.
01:04:55.000 Was this in America you were doing this?
01:04:57.000 UK.
01:04:58.000 Oh, I wonder if you could do it in America.
01:05:00.000 Let's find out.
01:05:01.000 Let's try it.
01:05:01.000 Well, let's try perplexity.
01:05:03.000 Put that into perplexity.
01:05:04.000 See, I doubt that perplexity would.
01:05:06.000 I have to find the article he was using, and I don't know what article he looked up.
01:05:09.000 Well, why don't you just ask the question that he asked.
01:05:12.000 So it's a story about a – Yes, it's going to take a minute.
01:05:12.000 It's 10 years ago.
01:05:16.000 That would take a while to.
01:05:17.000 Will it?
01:05:19.000 Maybe he didn't do it 10 years ago.
01:05:21.000 He did it recently.
01:05:22.000 No, no, it was a story.
01:05:23.000 It's a story from years ago.
01:05:24.000 Right, but you found it with ChatGPT, which is obviously recently.
01:05:27.000 I found a Daily Mail article about it.
01:05:29.000 So it's on public domain.
01:05:30.000 It's there.
01:05:31.000 But it didn't want me to find the fact that it decided wasn't good for me to find.
01:05:37.000 But it showed it to you and then it pulled it back, which is crazy.
01:05:37.000 Right.
01:05:40.000 How does it not know?
01:05:41.000 It showed it and deleted it.
01:05:43.000 It showed it and deleted it four or five times.
01:05:45.000 And I realized, I'm not going to get this information.
01:05:48.000 But then I thought.
01:05:49.000 So when it showed it, how long did it show it for?
01:05:50.000 Like about five seconds.
01:05:52.000 You'd see the text appearing and then it deletes.
01:05:54.000 But I'd seen enough to find it then on Google.
01:05:56.000 So I was able to find it and quote it in my book.
01:05:58.000 So it's there.
01:06:00.000 But it made me think.
01:06:01.000 It's like that thing about when people were asking Alexa, you know, do white lives matter?
01:06:07.000 And it was coming up with this kind of very ideological button.
01:06:10.000 And you do wonder with AI and with the computers, you know, if they are created by people who have that bias.
01:06:15.000 I know Grok is very different.
01:06:18.000 But like, for instance, I mean, this is a crazy example.
01:06:21.000 Chat GPT is like an old school mom that wants to make sure that you're protected, right?
01:06:27.000 I was writing, this sounds really wanky, I'm sorry, but I was writing about the Roman historian Suetonius.
01:06:32.000 And there's a passage in Suetonius where he talks about the Emperor Tiberius.
01:06:35.000 And it's very sexually explicit.
01:06:37.000 But I was quoting it for an article, so I wanted to know what it said.
01:06:39.000 And ChatGPT said, I can't translate the Latin for you because this is too sexually problematic.
01:06:46.000 I went on to Grok and it did it straight away.
01:06:49.000 Because Grok isn't saying that you are too delicate to read this stuff.
01:06:54.000 And what's really funny about that is the old dual translations of the old Roman and Greek texts, they're called Loeb editions.
01:07:01.000 You get them from 1900.
01:07:04.000 They translated everything except for the rude bits, which they kept in Latin.
01:07:07.000 So ChatGPT is like the old, you know, patronizing scholars of old who said, this is just for the learned people.
01:07:15.000 You can't learn this.
01:07:16.000 Well, wasn't the worst, the first iteration of Google Gemini, that was the worst cases.
01:07:22.000 That turned Nazi soldiers into black people.
01:07:26.000 I don't know how that's a positive message.
01:07:29.000 Show us photos of German soldiers from World War II and it was all interracial.
01:07:35.000 Yeah, and Vikings.
01:07:36.000 Yes.
01:07:38.000 I don't know if you've been to Scandinavia.
01:07:40.000 Diversity not their big thing.
01:07:41.000 Or certainly wasn't then.
01:07:44.000 You can't say that by the Vikings.
01:07:46.000 Also, the Vikings came and marauded and raped and set fire to villages, but at least they were diverse.
01:07:50.000 Hey, you know, at least they had a broad range of ethnicities, right?
01:07:54.000 But I mean, we're nearing a time in America where white people are not the majority anymore.
01:08:00.000 So at what point in time does that stop?
01:08:02.000 And we just call people what they are, just people.
01:08:04.000 But doesn't it bother you a bit that the thing about that kind of thing is this, as I said, this obsession with group identity, which is so of our time, what it now actually means is the revision of history.
01:08:16.000 If you're going to revise history and say, oh, actually, you've seen all these sort of period dramas set in England.
01:08:21.000 There was a black Anne Boleyn, as if, you know, Henry VIII would have married a black woman.
01:08:25.000 No, he wouldn't.
01:08:26.000 What if she was hot?
01:08:27.000 She was a very attractive woman.
01:08:29.000 Hey, I'm not mocking her or knocking her.
01:08:31.000 But back then.
01:08:32.000 What I'm saying is, you can do anything with colourblind casting.
01:08:34.000 Colourblind casting has never really particularly bothered me, but it's when you are in a, if you're playing hyper-realism, if you're playing verisimilitude, you want people to buy into the reality of it, and you're suddenly populating Edwardian England or pre-Edwardian England as an ethnically diverse place, which it wasn't.
01:08:50.000 I'm not saying black people weren't there, but they were very, very, very small minority.
01:08:53.000 Isn't that a problem in the new Odyssey?
01:08:55.000 Helen of Troy is black.
01:08:58.000 Well, I say that.
01:08:59.000 I just saw it online, so I might be being tricked by someone making something up.
01:09:02.000 You know, a caveat that I think Helen of Troy is black in the new Odyssey.
01:09:06.000 Well, let's find out.
01:09:08.000 Can we check that one?
01:09:11.000 All right, if it's true, I'll tell you why I think that's ridiculous.
01:09:14.000 How far do we have to swing the pendulum until Roots is redone with white people?
01:09:19.000 Can you imagine?
01:09:21.000 Or an all-black Schindler's list.
01:09:22.000 Right, right, right.
01:09:23.000 Can you imagine Helen of Troy being portrayed by a black actress in the new Odyssey movie?
01:09:28.000 And look, I'm sure she's very talented.
01:09:29.000 I'm not knocking her.
01:09:30.000 But the thing about the Greek, the thing about Helen of Troy, who probably didn't exist, I mean, even the Greeks knew she probably didn't even exist.
01:09:36.000 She's a myth.
01:09:37.000 She's the epitome of Greek beauty.
01:09:38.000 She's like the, she's described all the time in the ancient texts as fair and blonde.
01:09:44.000 And they're reaching for an ideal of beauty.
01:09:47.000 That's why they went to war because of this woman.
01:09:50.000 So they wouldn't choose what they used to call an Ethiop.
01:09:53.000 The Greeks had a word for it, the black African people.
01:09:55.000 They wouldn't choose an icon of cultural beauty from a different culture.
01:09:59.000 They wouldn't have done that.
01:10:00.000 It's all very well saying Greeks were Mediterranean people and not pure white.
01:10:05.000 But Helen of Troy is a very specific.
01:10:07.000 And it's actually quite important to the plot.
01:10:11.000 And again, if you're doing a, look, for instance, when they did the all-black Wizard of Oz, The Wiz, I imagine that in the late 60s would have been quite radical and fun.
01:10:20.000 And wow, I can't believe they did that.
01:10:21.000 That's brilliant.
01:10:22.000 But doing it now, it's really boring because everyone is doing it.
01:10:25.000 It's basically saying group identity is everything.
01:10:25.000 It's so banal.
01:10:28.000 And you people can't be racist.
01:10:29.000 And so they're all going to do this.
01:10:31.000 But it sometimes throws you out of the...
01:10:33.000 Actually, I'll tell you the worst example.
01:10:35.000 Did you ever see Darkest Hour, the Winston Churchill film?
01:10:38.000 No.
01:10:39.000 So you know, obviously, he took on Parliament.
01:10:41.000 He said, we're not going to appease Hitler.
01:10:43.000 There's a scene in the film.
01:10:44.000 Gary Oldman plays him.
01:10:45.000 He goes down into the tube, the underground, and he's wrestling with his conscience.
01:10:49.000 And there's loads of black people on the tube.
01:10:50.000 There's white people too, but there's loads of black people.
01:10:52.000 The public convince him, no, you need to stand up to Hitler.
01:10:55.000 Now, we know that Churchill was a bit of a racist, didn't really like the, you know, fine, he was of his time.
01:11:01.000 I'm not saying anything more than that.
01:11:02.000 He was of his time.
01:11:04.000 But that, it was so unreal.
01:11:05.000 It was so unrealistic.
01:11:06.000 It was so, it was almost like the filmmakers were saying, racism's never been a problem in the UK.
01:11:11.000 Well, actually, it has.
01:11:13.000 Like, and I kind of think this is, I kind of think this is, although it's ostensibly progressive, I think it does the reverse.
01:11:19.000 I think it says, we never had a problem with race.
01:11:22.000 We were all wonderful, kumbaya.
01:11:24.000 No, we weren't.
01:11:25.000 And actually, the abolitionists, the Thomas Henry Huxleys of the world, the people who had to fight for racial equality and parity, they had something to fight against.
01:11:34.000 Misrepresenting stuff in the arts.
01:11:37.000 And then beyond, I'm sorry I'm ranting now because it really bothered me.
01:11:39.000 But beyond that, it throws you out of it in a way that you suddenly think, I'm no longer watching a film.
01:11:45.000 I'm watching a sermon.
01:11:47.000 Oh, so this happened to me last week.
01:11:49.000 Have you seen the Netflix series Ripley about the talented Mr. Ripley?
01:11:53.000 No, I have not.
01:11:54.000 Now, you remember there used to be that film with Matt Damon years ago.
01:11:54.000 Right.
01:11:57.000 It's the same story, same novel, an old Patricia Highsmith novel.
01:12:01.000 One of the main male characters in that TV series is a brilliant series, like Andrew Scott is in it.
01:12:06.000 Performances are brilliant.
01:12:07.000 They play it hyper-realistically.
01:12:08.000 It's all black and white.
01:12:09.000 It looks beautiful.
01:12:10.000 On the Amalfi Coast, it's wonderful.
01:12:12.000 Everything's working brilliantly.
01:12:13.000 And I was thinking, this is great.
01:12:14.000 I'm not being preached at.
01:12:15.000 This is great.
01:12:16.000 Then a major male character turns up, played by a woman who calls herself non-binary.
01:12:22.000 And not only are we meant to believe that that's a man, the characters don't notice that it's a woman in man's clothes.
01:12:31.000 So we're meant to believe that these characters don't even, like not one person, Ripley doesn't say, why is she wearing a, why is she wearing a suit?
01:12:39.000 This is set in the 60s, by the way.
01:12:41.000 So I think if they wanted to change the novel and create a kind of, you know, like one of those butch dykes of the day who used to go for sort of like.
01:12:49.000 Just like Ellen.
01:12:50.000 Yeah, or yeah.
01:12:51.000 Or the androgynous type.
01:12:52.000 Like those, those people have always existed.
01:12:54.000 Why not change the character to make it a female character who likes looking like a man?
01:12:59.000 Why not do that?
01:13:00.000 Why tell us, no, this is a man.
01:13:03.000 You have to believe it's a man.
01:13:05.000 Do you see what I mean? It throws you out. It's crazy. I no longer believe in this.
01:13:09.000 I had to stop watching it because I no longer believed in it.
01:13:13.000 Well, I think the problem, the real problem with trying to shove that down people's throats is it creates the opposite reaction.
01:13:19.000 Right.
01:13:20.000 It creates homophobia, transphobia, and racism.
01:13:24.000 Because it doesn't create it, but it makes them feel like they have a point.
01:13:28.000 Well, you've seen recently that the polls regarding gay rights in the US seem to be going down, tumbling support for gay rights, support for gay marriage.
01:13:37.000 We've had, I think, a number of states trying to overturn the gay marriage legislation.
01:13:42.000 And the reason for all of that, I think, is because being gay has been tied to this LGBTQIA identity-obsessed movement that has also involved the medicalization of kids, sterilization of kids, twerking in front of children, all of that stuff.
01:13:58.000 And now people are saying, this is because you gave us gay marriage.
01:14:01.000 This is because you let the gays marry.
01:14:02.000 And because of that, you've allowed all this other stuff.
01:14:05.000 You've opened this box and everything else has tumbled out.
01:14:07.000 And that's not true.
01:14:08.000 That's not true because the fundamental point about the belief in gender identity is it is fundamentally anti-gay as a principle.
01:14:17.000 Because what it says is, I know I'm telling you something you already know, but like gay rights was predicated on the idea that there's a minority of people in every society who are attracted innately to their own biological sex.
01:14:29.000 If you say biological sex doesn't matter, and actually you're attracted to a kind of gendered soul, you're attracted to an essence, you're attracted to how someone identifies.
01:14:38.000 Well, firstly, you don't know gay people if you think that's the case.
01:14:40.000 They're not attracted to how you see yourself.
01:14:44.000 They, you know, gay men, I don't want to be crude, know what a penis is, right?
01:14:48.000 And they know how to sniff one out.
01:14:50.000 And I think this idea, this idea that they're attracted to the way that you perceive yourself.
01:14:55.000 Nonsense.
01:14:56.000 And not only that, then you get, you know, like in Australia at the moment, lesbians are not allowed to gather legally if there's a man who says he's a lesbian and wants to join them.
01:15:04.000 That is against the law in Australia now.
01:15:06.000 So you can't do that.
01:15:07.000 Wait, wait a minute.
01:15:08.000 What do you mean?
01:15:09.000 So the Australian Human Rights Commission ruled that if you are, if you have an all-female event, right?
01:15:14.000 So like a lesbian gathering, maybe something like that, you have to include men who identify as women.
01:15:19.000 Oh, God.
01:15:20.000 Because otherwise you are discriminating.
01:15:23.000 There was a woman who I interviewed on a show.
01:15:25.000 I had a show in the UK on GB News up until recently.
01:15:28.000 And I interviewed this woman called Sal Grover.
01:15:31.000 And she's an Australian woman, used to write for Hollywood, I think.
01:15:34.000 She created a woman's app, a woman's only app.
01:15:36.000 And this was in the wake of me too, you know, so there's all that going on.
01:15:39.000 And she wanted to create a space for women.
01:15:41.000 And a guy called Roxanne Tickle, right?
01:15:44.000 They always have these kind of stripper names.
01:15:46.000 Is that a real name?
01:15:47.000 Roxanne Tickle wanted to get on the app, which was called Giggle.
01:15:51.000 So by the way, this court case is called Giggle versus Tickle.
01:15:53.000 I'm not kidding.
01:15:54.000 Boy.
01:15:55.000 And he said, he got on the app.
01:15:58.000 She kicked him off because it's a bloke in a dress.
01:16:01.000 And he sued and won.
01:16:04.000 And in the court case, the judge actually said sex is changeable.
01:16:09.000 Well, it's not, no matter what a guy in a wig says.
01:16:14.000 But she's now appealing and going through all the awful, all this stuff just.
01:16:18.000 It makes her life hell, and then it discourages anybody else in the future from ever contesting anything like that.
01:16:23.000 And not only that.
01:16:24.000 I mean, we've just had the other day, was it yesterday?
01:16:26.000 Did you see the girl who used to identify as trans, a girl called Fox Varian, has just won $2 million in a lawsuit?
01:16:35.000 Yes, yes.
01:16:36.000 That's big because— She was 16 years old, and they chopped her breasts off, which is fucking horrifying.
01:16:42.000 It's the tip of the iceberg, though.
01:16:43.000 Especially if you have children, you realize they change the way they think about things year to year.
01:16:51.000 And if you, children are so malleable.
01:16:55.000 It's like one of the delicate dances of being a parent is that you have to love them, but you don't want to steer them in any direction.
01:17:05.000 You want to let them be their own person.
01:17:08.000 And, you know, it's like I tried to expose my children to a bunch of different things and find out what they enjoy.
01:17:15.000 And if you do that, you find out that they're all different.
01:17:18.000 They all like different stuff.
01:17:20.000 They just gravitate towards different things.
01:17:23.000 And if you are a domineering, overbearing, mentally ill parent, you can convince your child almost anything.
01:17:32.000 Almost anything.
01:17:33.000 I mean, this is how you get suicide bombers.
01:17:36.000 This is what it is because they're children.
01:17:38.000 This is why you don't get 55-year-old union guys who become suicide bombers.
01:17:42.000 They're like, what?
01:17:43.000 And of course, you know.
01:17:44.000 I get 72 virgins?
01:17:46.000 Yeah.
01:17:46.000 What?
01:17:46.000 Like, it's not going to work.
01:17:48.000 But you can get young, impressionable children, and you can convince them of almost anything.
01:17:53.000 Like convincing them that they're actually a woman in a man's body and don't you want to be a woman?
01:17:58.000 And let's get you on hormone blockers.
01:18:00.000 Okay, mom.
01:18:00.000 And then all of a sudden, you're ruining this child's life.
01:18:03.000 But also, I mean, there will be kids who are struggling with how they see themselves in the world.
01:18:07.000 There's girls in particular who, you know, they're developing into women and they don't like the sexual attention they're getting.
01:18:12.000 They'd love to get a lot of people.
01:18:13.000 Abigail Shrier's book.
01:18:14.000 Right.
01:18:15.000 Especially autistic girls.
01:18:17.000 So that's another point.
01:18:18.000 So this is the other reason why I think the movement is essentially anti-gay.
01:18:24.000 Because, you know, the Tavistock Pediatric Clinic in London, which was an NHS gender clinic, which has been closed as a result of the Cass Review, this report into pediatric gender care.
01:18:35.000 They found, there's a book by Hannah Barnes called Time to Think, which found that between 80 and 90% of all adolescents referred to that clinic were same-sex attracted.
01:18:43.000 So they were either gay or lesbian or bisexual.
01:18:46.000 Now, that means you've effectively got gay conversion therapy going on on the NHS.
01:18:51.000 And so, you know, I had, you know, I'm friends with a couple of lesbians who run the LGB Alliance in London.
01:18:57.000 They have an annual conference for gay rights, and they're talking about gay rights.
01:19:00.000 You know, these young, non-binary identified people broke in, unleashed locusts and crickets and insects, a plague of fucking locusts into a gay rights conference.
01:19:12.000 Isn't that the sort of thing neo-Nazis used to do?
01:19:14.000 It's insane.
01:19:15.000 So, I mean, I think you need to have sympathy with people and whatever they're going through.
01:19:15.000 Right.
01:19:21.000 But don't tell a child, if a child tells you, I think I'm in the wrong body, don't say yes.
01:19:26.000 Say, that's not possible.
01:19:27.000 Human beings can't change sex, but let's explore psychotherapeutically what needs to happen.
01:19:32.000 Let's look at Los Angeles, which is, in my opinion, one of the most mentally ill spots in this country.
01:19:37.000 It's a very weird place.
01:19:39.000 That's why you left.
01:19:40.000 Well, I mean, I left for a bunch of reasons.
01:19:42.000 Mostly, I really left because they were telling us we can't do comedy.
01:19:46.000 Oh, yeah.
01:19:46.000 Well, that would do.
01:19:47.000 Closed down the comedy clubs in Texas was open.
01:19:49.000 It's the primary reason.
01:19:51.000 And also restaurants and everything.
01:19:52.000 I just knew where it was going.
01:19:54.000 But the point is, like, Los Angeles is a very mentally ill place.
01:19:58.000 Like, if you just looked at just the sheer numbers of people that are medicated and fucked up.
01:20:06.000 If that's the place that's dictating the tone for the rest of the world, that's dangerous.
01:20:12.000 Because these are a lot of people that just desperately want attention.
01:20:15.000 They desperately want to get accepted.
01:20:16.000 They have to go through the audition process.
01:20:18.000 So they have to change who they are to talk to the producers to try to form themselves into something to be accepted.
01:19:58.000 There's a disproportionate amount of trans kids in Hollywood families.
01:20:30.000 It's largely disproportionate.
01:20:33.000 Some of them have two trans kids, three trans kids.
01:20:33.000 Of course.
01:20:37.000 It's like, what the fuck is going on here?
01:20:39.000 This is not normal.
01:20:41.000 This is not "no influence whatsoever."
01:20:44.000 This is, you're using that child as a virtue flag.
01:20:47.000 You're flying that child as a trans flag in the front of your porch.
01:20:52.000 I have a trans kid.
01:20:53.000 But don't you think that a lawsuit like this, that's going to change things?
01:20:57.000 Because no one's going to insure that kind of procedure anymore.
01:21:00.000 That's a surgeon and a psychotherapist who are now lumbered with a $2 million bill.
01:21:05.000 Yes.
01:21:06.000 It's going to open up the floodgates for all these other lawyers to start pouncing on all these other cases.
01:21:10.000 That's what I mean.
01:21:11.000 The horrible thing about these cases is not just that these children have had their lives ruined by these surgeries and have been sterilized.
01:21:19.000 It's also that they've been attacked so ruthlessly.
01:21:24.000 You mean you're talking about children that have made a mistake or someone coerced them into making this mistake that's changed their body for the rest of their life and they're getting attacked online.
01:21:36.000 Like you imagine being a fragile child already who's willing to go through this procedure, can't believe they did it.
01:21:42.000 Now they don't have breasts anymore.
01:21:43.000 Their voice is deep forever.
01:21:45.000 They're all fucked up.
01:21:46.000 And then people are screaming at them online.
01:21:49.000 Yeah.
01:21:50.000 And it's crazy.
01:21:51.000 But you know, this is how the satanic child abuse panic of the 80s.
01:21:55.000 Yes, exactly.
01:21:56.000 This came to an end because of lawsuits.
01:21:59.000 When they realized that these psychotherapists have been using these leading questions, effectively telling them you've repressed the memory.
01:22:06.000 You know, there was that book, The Courage to Heal, where it said, if you think you might have been abused, you probably were.
01:22:11.000 Like such a reckless thing to say.
01:22:13.000 And all these people accused, you know, carers, parents.
01:22:18.000 None of it was true.
01:22:20.000 But when they started suing the psychotherapists, it all collapsed.
01:22:25.000 Right.
01:22:25.000 And I wonder whether.
01:22:27.000 Hysteria can collapse if you... actually, money talks.
01:22:30.000 You know when society shifted in this general direction? Because of Elon buying Twitter.
01:22:35.000 When Elon bought Twitter, the amount of trans-identified kids started to drop off.
01:22:40.000 The amount of non-binary identified kids started to drop off, right? And that, I think, is a direct result of people being able to say what they really think.
01:22:47.000 Because in the past, like my friend, Meghan Murphy, she was banned off of Twitter until Elon bought it because she said, a man is never a woman.
01:22:55.000 Right, that's all she said.
01:22:56.000 Right, a man is never a woman.
01:22:58.000 She was arguing with people about biological males who identify as women, being able to get into women's spaces, and she said, a man is never a woman.
01:23:04.000 Banned forever.
01:23:06.000 Yeah, so no one wanted to talk about this.
01:23:08.000 See, there was no real discourse.
01:23:09.000 And if there's no real discourse, then you can push a goofy ideology pretty fucking far.
01:23:14.000 But as soon as people jump on board and start posting funny memes, and Elon says it's open season, do whatever you want, yeah, and he calls it the woke mind virus and everybody's, like, piling in, well, then you have discourse. And then anything that's absurd immediately gets shot down, because people say, no, this doesn't make any sense.
01:23:34.000 This is crazy.
01:23:35.000 It comes back to what you said.
01:23:36.000 You said about debate.
01:23:37.000 You said about discourse.
01:23:38.000 I mean, I just saw today, you know, obviously on Twitter, because I'm always on it, but I saw John Lithgow, you know, the actor, brilliant actor, who plays Dumbledore in the new Harry Potter thing, saying that JK Rowling's views are inexplicable.
01:23:51.000 Inexplicable? That means you haven't read them.
01:23:54.000 Like, JK Rowling, yeah, is for women's rights, and she recognizes that women's rights depend on the recognition of biological sex, on the preservation of single-sex spaces.
01:24:04.000 It's as simple as that.
01:24:05.000 All he has to do is read the essay she wrote on her blog like eight years ago.
01:24:09.000 He can't even, he's not even sufficiently intellectually curious to do that, and he goes out and says it's inexplicable.
01:24:15.000 Women's rights and gay rights are inexplicable.
01:24:18.000 Really, or are you just not having the conversation?
01:24:20.000 You're just shutting yourself up and saying, my friends have said she's evil. But wouldn't he be criticized if he supported JK Rowling?
01:24:29.000 If he supported JK Rowling, he would be attacked.
01:24:31.000 So it's a calculation.
01:24:32.000 You're saying, yes, maybe it's the same thing we're talking about with Hollywood being mentally ill.
01:24:37.000 Right, same thing, where you have to shape your opinions based on how you'll be accepted by the group.
01:24:43.000 It's the most groupthink place I've ever been in my life.
01:24:46.000 It's almost universally left-leaning there.
01:24:49.000 But isn't that the problem in comedy, like with the UK? So many people who would otherwise be innovative, subversive comics.
01:24:54.000 They've got nowhere to go, right? So they just tailor.
01:24:57.000 They come to Austin, baby.
01:24:58.000 They come to Austin like I did.
01:25:00.000 Right, that's it.
01:25:01.000 They come to, they come to... and I, I get so sick of it.
01:25:04.000 Because I know in America it's much better, but in the UK, all of my old friends from the comedy circuit tell me no one's self-censoring.
01:25:12.000 You can say what you want.
01:25:14.000 Like, the list of people I know who have had shows canceled, taken off because they caused offense.
01:25:14.000 I'm like, are you kidding?
01:25:21.000 This week, Leo Kearse, a friend of mine, had one of his shows on his tour just deleted because some activists complained to the venue, right?
01:25:27.000 So it's happening all the time, and they're ignoring this Himalayan mountain of evidence.
01:25:33.000 And they're saying it's not a thing.
01:25:35.000 But of course, people are self-censoring.
01:25:36.000 What's even happening here?
01:25:37.000 Is it?
01:25:38.000 Michael Rappaport got his shows.
01:25:41.000 He got his shows cancelled from Cap City Comedy Club.
01:25:44.000 Did he?
01:25:44.000 Which is our other comedy club in town, which is a great club owned by Helium.
01:25:48.000 Right.
01:25:48.000 But they were saying that he's racist because Michael Rappaport is very pro-Israel.
01:25:53.000 Right.
01:25:53.000 And apparently.
01:25:55.000 Why does that make you racist?
01:25:56.000 I don't know what he said, so I don't want to speak out of turn.
01:25:59.000 I don't know what exactly he said.
01:26:01.000 I'd like to make a small correction, I think.
01:26:02.000 Oh, I don't think that she has been.
01:26:06.000 Sorry, back to the Odyssey thing.
01:26:07.000 Oh, yeah, yeah.
01:26:08.000 She has been cast in the movie, but only Twitter rumors have said what her position in the movie is, and then everybody has run with it.
01:26:17.000 Oh, interesting.
01:26:18.000 So she could be anyone else, a different character.
01:26:23.000 All the articles I found online said it was, like, social media confirmation, and then people were just running with it.
01:26:27.000 There we go.
01:26:28.000 Well, isn't that what I said?
01:26:29.000 What is that article that you just clicked?
01:26:31.000 This is the one I showed earlier.
01:26:32.000 But what is it from?
01:26:32.000 It starts off with the Hungarian Conservative.
01:26:36.000 That's a niche.
01:26:37.000 Shamie!
01:26:38.000 How dare you let that sneak by?
01:26:40.000 You didn't notice it was a Hungarian conservative.
01:26:43.000 Are you being paid by the Hungarian conservative?
01:26:45.000 That's the top thing that popped up.
01:26:46.000 It's the Helena Community College.
01:26:47.000 Meanwhile, it's probably a fucking troll farm in Pakistan that's creating that.
01:26:52.000 Or it's probably in China or something.
01:26:53.000 All I Googled was Helen of Troy Odyssey movie, and it's the very first result.
01:26:56.000 Good for the Hungarian Conservative coming out on top of the Google search.
01:27:00.000 That's pretty good.
01:27:01.000 That's so funny.
01:27:02.000 But did I not say, I'm not sure about this?
01:27:04.000 It's a Twitter rumor.
01:27:04.000 But look, Elon Musk bought into it.
01:27:06.000 Elon Musk says Christopher Nolan has lost his integrity.
01:27:10.000 That's how it spreads out.
01:27:10.000 Oh, Elon.
01:27:11.000 So there we go.
01:27:12.000 The dude's too busy building rockets to pay attention to what he tweets.
01:27:17.000 But this proves the point.
01:27:18.000 Like, let's not.
01:27:19.000 Oh, yeah, he's going to take us to the moon again.
01:27:20.000 So, you know, that's.
01:27:22.000 Isn't he?
01:27:22.000 So he's not.
01:27:23.000 No, Artemis is.
01:27:24.000 I thought he was working with NASA.
01:27:26.000 Oh, is he working with NASA with Artemis?
01:27:27.000 Again, someone said it online and I just bought it.
01:27:29.000 I just thought it was a lot of fun.
01:27:30.000 Oh, wow.
01:27:31.000 They probably can't get there without him.
01:27:33.000 It's probably like, oh, I'll show you some things.
01:27:35.000 Okay, so that is a perfect example because I am always now, even when I mentioned that earlier, I was cautious, wasn't I?
01:27:35.000 But that's.
01:27:42.000 Right.
01:27:42.000 Because I know I've fallen for this so many times.
01:27:45.000 I now double-check and triple-check everything.
01:27:48.000 And I wish I didn't have to, but you do have to, because even the mainstream media lie about stuff.
01:27:53.000 And then Twitter rumors go absolutely mad.
01:27:55.000 But it's important when you're talking about a historical film.
01:27:59.000 Yeah, yeah.
01:28:00.000 Like, it's got to... you kind of just can't do that.
01:28:02.000 It doesn't make any sense.
01:28:03.000 When you sort of can't, like, I think an artist should be able to do what they want.
01:28:07.000 And I think if you want to, like, they do it with Shakespeare all the time.
01:28:09.000 Sorry to go back to Shakespeare, but you rarely go and see a Shakespeare play today that hasn't been filtered through the prism of identity politics and changed in some way.
01:28:17.000 Right, but that's not the same.
01:28:19.000 That's not the same as historical figures.
01:28:24.000 Well, he wrote histories.
01:28:25.000 He wrote about kings, Henry VIII, Henry V. Yeah, but it's fiction, right?
01:28:31.000 Like, the thing about the Odyssey is definitely fiction.
01:28:35.000 It is sort of, but, you know, they didn't think Troy existed.
01:28:39.000 And then they found out it does.
01:28:40.000 Right.
01:28:41.000 It's a real place.
01:28:42.000 It's based on myth.
01:28:43.000 Yeah, that's right, yeah.
01:28:45.000 But you remember, like, they thought that Troy was a completely mythological creation.
01:28:50.000 So it's an actual thing.
01:28:51.000 They have evidence that it was a place.
01:28:53.000 Yes.
01:28:53.000 You didn't know that?
01:28:54.000 No.
01:28:54.000 Yeah, they found it.
01:28:55.000 When did they find Troy?
01:28:57.000 It was in the 20th century.
01:29:00.000 So for the longest time.
01:29:01.000 But there wouldn't have been sirens and there wouldn't have been Cyclopes and they wouldn't have, you know what I mean?
01:29:06.000 Oh, no.
01:29:07.000 No, Cyclopses they think were actually elephant skulls.
01:29:07.000 Oh, Joe.
01:29:10.000 That's what they think that was okay.
01:29:12.000 Do you ever see an elephant skull?
01:29:14.000 I have never seen an elephant.
01:29:15.000 Well, you know, where the trunk is is in an enormous hole.
01:29:18.000 And they thought that that was an eyeball.
01:29:20.000 So they would find these giant skulls that looked like, you know, they didn't know what the fuck it was.
01:29:25.000 Yeah.
01:29:25.000 They're like, oh my God, Cyclopses are real.
01:29:27.000 Fair enough.
01:29:28.000 I mean, evidence that Troy allegedly was a real place began to emerge in the 1870s.
01:29:33.000 Heinrich Schliemann began large-scale excavations at Hisarlik in northwestern Turkey in 1870.
01:29:44.000 So when did they first start excavating?
01:29:48.000 So where is it?
01:29:49.000 It's in Turkey.
01:29:49.000 It's in Turkey, yeah.
01:29:51.000 Which is where a lot of the proponents of revising the beginning of civilization are now pointing: to Turkey as opposed to, like, Iraq.
01:30:05.000 Well, the Greeks were everywhere, you know, so the Mesopotamians and the, I mean, that doesn't surprise me.
01:30:09.000 I mean, I think the point I was making about Helen of Troy is that even if it's not real, even if it's not history, the myth of Helen of Troy means something quite significant within that story.
01:30:18.000 So if you subvert that, the fundamental aspects of the story itself doesn't work and you can't buy into the myth.
01:30:18.000 Yes.
01:30:25.000 It's like if you turn the Elephant Man into a handsome fellow with a six-pack.
01:30:29.000 Exactly that.
01:30:30.000 Don't give them ideas.
01:30:31.000 Don't give them ideas.
01:30:32.000 They'll do it.
01:30:32.000 Can you show me a photograph of an elephant skull?
01:30:35.000 It's really kooky.
01:30:36.000 But you see an elephant skull and you're like, oh, I could totally see you falling for that.
01:30:42.000 You look at it and you go, what the fuck is that thing?
01:30:45.000 Like, look at an elephant skull.
01:30:46.000 Isn't it nutty?
01:30:47.000 Oh, completely.
01:30:47.000 Yes.
01:30:48.000 And it's going to be a big old beast.
01:30:48.000 Yeah.
01:30:50.000 Right.
01:30:50.000 So you're going to think it's a big giant thing with tusks coming out of its mouth.
01:30:54.000 Like, look, look with the look at the actual cyclops on the left.
01:30:58.000 Isn't that crazy?
01:30:58.000 Yeah.
01:30:59.000 Of course.
01:31:00.000 Makes complete sense.
01:31:00.000 No, it makes sense.
01:31:02.000 Complete sense.
01:31:03.000 Yeah.
01:31:03.000 You found that.
01:31:03.000 Yeah.
01:31:04.000 You're like, oh my God, cyclopses are real.
01:31:06.000 You would think, oh, my God, these monsters.
01:31:08.000 Isn't that funny?
01:31:09.000 What a weird-shaped skull.
01:31:11.000 So strange.
01:31:13.000 You never think the eyeballs would be down there by the cheekbones.
01:31:15.000 That's what's weird about them.
01:31:17.000 I have to say, elephant anatomy is something I'm not.
01:31:20.000 I haven't brushed up on that.
01:31:21.000 Show the photo again.
01:31:22.000 Look at that photo where the eyeballs are.
01:31:24.000 So the eyeballs are where the cheekbones are?
01:31:26.000 See the little circular holes where the cheeks are.
01:31:28.000 Now, when you see an elephant in the flesh, like show me a photograph of an elephant.
01:31:36.000 Just an elephant.
01:31:39.000 So their eyeballs are?
01:31:41.000 Isn't that crazy?
01:31:42.000 That's not how you think of them, is it?
01:31:43.000 No, well, they're so strange.
01:31:45.000 Aren't they?
01:31:45.000 Like, give me that the second one on the left.
01:31:48.000 Yeah, look at that.
01:31:49.000 Click on that.
01:31:49.000 What a wild animal.
01:31:51.000 They're amazing.
01:31:51.000 Have you never seen one of those before?
01:31:53.000 You'd be like, I know in a zoo.
01:31:55.000 Crazy.
01:31:56.000 I rode one in Thailand.
01:31:57.000 No.
01:31:58.000 Yeah, yeah, yeah.
01:31:59.000 Yeah, I don't recommend it.
01:32:00.000 I don't think I should ride them.
01:32:01.000 My whole family wanted to do it.
01:32:03.000 I didn't want to do it.
01:32:03.000 I felt like it's exploiting them.
01:32:05.000 But they're very sweet.
01:32:06.000 They're gentle, aren't they?
01:32:07.000 Yeah, they're pleasant creatures.
01:32:08.000 It's a whole process.
01:33:09.000 So one of the things you do when you go to Thailand is you take care of them first before you ride them.
01:32:14.000 You don't just hop on them, you feed them.
01:32:16.000 So you give them a bunch of sugar cane and you pet them and they teach you to like so that the animal understands you have a gentle spirit.
01:33:24.000 But that's intelligence, right?
01:32:25.000 It's because they're smart.
01:32:26.000 They're very smart.
01:32:27.000 Also, they'll fucking kill you.
01:32:28.000 Oh, they are.
01:32:29.000 Scary beasts.
01:32:30.000 Stomp you.
01:32:31.000 But they're not like the hippo.
01:32:32.000 The hippo will kill you.
01:32:33.000 You cannot do that with a hippo.
01:32:35.000 And I believe the reason why hippos are so dangerous.
01:32:37.000 We think they're really cute and fat, but they are fucking dangerous and they can run fast and they can tear you apart.
01:32:43.000 And they will.
01:32:43.000 But the key difference, I believe, is the intelligent thing.
01:32:46.000 So elephants are really smart and hippos are really stupid.
01:32:48.000 Yeah, and you can also become friends with an elephant.
01:32:51.000 Yes.
01:32:52.000 Like you can actually take care of an elephant and be kind to an elephant and that elephant will be gentle.
01:32:58.000 Yeah, they come up to you.
01:32:59.000 So you feed them sugar cane and you talk to them.
01:33:01.000 You say, hey, buddy, how are you?
01:33:03.000 And you pet them and you wash them.
01:33:05.000 You wash them.
01:33:06.000 You do all kinds of different things with them.
01:33:07.000 You brush them so it feels good for them.
01:33:09.000 You're going to have an elephant soon, aren't you?
01:33:11.000 No, I would never have an elephant.
01:33:12.000 I'd be friends with an elephant, but he'd have to be wild.
01:33:14.000 Like, I just don't agree with any of that.
01:33:17.000 Having them in zoos and things.
01:33:18.000 No, I hate it.
01:33:19.000 I think I do as well.
01:33:20.000 I think if you're going to have animals, you should have a gigantic area that is a true ecosystem that they exist in naturally.
01:33:29.000 And then people can maybe venture into that ecosystem and explore.
01:33:33.000 I felt that I was at the zoo recently in Arizona.
01:33:35.000 I felt there was one jaguar pacing obsessively.
01:33:39.000 I just felt this.
01:33:40.000 It's like going to, you know, like in the Elizabethan era, they used to go to Bedlam to watch the people who were mad as an entertainment thing.
01:33:46.000 It felt a little bit like we were doing that.
01:33:48.000 I have far too much appreciation for the wild.
01:33:51.000 You know, I have animals that are contained at my house, but they have been watered down by selective breeding to the point where they can't even like I have a King Charles Spaniel.
01:34:02.000 He's this tiny little fella.
01:34:04.000 Like, he's incapable of doing anything.
01:34:05.000 Right.
01:34:06.000 Like, he's just a little cutie pie.
01:34:07.000 You can't unleash him into the wild.
01:34:08.000 Right.
01:34:09.000 And I have a golden retriever who thinks everybody's his best friend.
01:34:13.000 Did you see the guy who kept a hippo from birth and then it ate him?
01:34:15.000 And then it killed him.
01:34:16.000 Yeah.
01:34:16.000 It ate him.
01:34:17.000 So, you know, like understand that you're dealing with a creature that doesn't see the world that you do.
01:34:23.000 There's a lot of animals that you can raise up until a certain point and have them in your home, like chimps famously.
01:34:23.000 Yeah.
01:34:23.000 You know?
01:34:30.000 Yeah, yeah.
01:34:31.000 Up until a certain point, and then they decide, I want to rip your face off.
01:34:34.000 I don't like you anymore.
01:34:35.000 Oh, I'm sure.
01:34:36.000 If cats were as big as we are, they'd probably do the same.
01:34:38.000 Well, they would just eat you.
01:34:40.000 They would kill you 100%.
01:34:42.000 The only reason why we have a relationship with cats is because they're too small to eat us.
01:34:45.000 That's it.
01:34:46.000 Cats are great because they're convenient.
01:34:47.000 They do what they want.
01:34:48.000 They're sweet.
01:34:48.000 They're sweet.
01:34:49.000 I love cats, but I mean, you can't have a fucking giant one.
01:34:52.000 But you can if you take care of them from the time that they're cubs.
01:34:56.000 Yeah.
01:34:57.000 And most of the time they don't kill you.
01:34:58.000 Yeah.
01:35:06.000 But then you get a little Siegfried and Roy action and it just decides, for whatever reason, I want to drag that dude away by his neck.
01:35:05.000 Yep.
01:35:06.000 But you know, these sorts of pleasures, you know, life with animals and this sort of thing is going to matter more and more to us, I think, when the robots take over.
01:35:13.000 Yeah.
01:35:15.000 Well, we might have to live with them.
01:35:17.000 We might be wild and the robots might take over the cities.
01:35:20.000 We might be forced to be nomadic tribes again.
01:35:22.000 I think they might see us.
01:35:23.000 They'll say, you can have no impact whatsoever on the environment.
01:35:25.000 You can only live a subsistence lifestyle as a hunter-gatherer with primitive tools, because the robots would no longer allow you anything more.
01:35:33.000 You can hunt, but you have to make your own bows and arrows.
01:35:35.000 We'll be like, what?
01:35:36.000 I can't possibly do that.
01:35:39.000 So they're going to see us as pets.
01:35:41.000 Yeah, so they're going to treat us the way I want to treat elephants.
01:35:41.000 They're going to see what's going on.
01:35:44.000 So I want elephants to exist in a contained ecosystem where they live naturally.
01:35:49.000 And they're going to say, you can't have cars anymore.
01:35:51.000 You can't have any of these things.
01:35:52.000 Well, that's a good point, though, isn't it?
01:35:53.000 So all the stuff I've been reading at the moment about AI is saying that AI won't wipe us out because it'll see us in the way we see animals and way we see pets.
01:36:01.000 Is that, well, we think you're sweet and stupid, but we like having you around.
01:36:05.000 We'll tolerate you.
01:36:06.000 Is that the way it's going to go there?
01:36:07.000 I think we're going to be forced to integrate.
01:36:10.000 In what way?
01:36:11.000 Integrate technologically.
01:36:12.000 Like, I think we already are.
01:36:14.000 Like, Elon's famously made the point that you're already a cyborg.
01:36:17.000 You have your phone that you just carry around with you everywhere.
01:36:20.000 And then with Neuralink, it'll be inside your body.
01:36:23.000 And then whatever.
01:36:24.000 I wouldn't.
01:36:24.000 I'm not letting that happen.
01:36:25.000 You won't in the beginning, the first iterations.
01:36:29.000 A lot of people won't.
01:36:30.000 But if it makes your life measurably better and it's a simple procedure that's non-invasive, you know, it's like a simple thing that they plug into the back end.
01:36:38.000 Well, I'd be like a cyborg warrior.
01:36:40.000 Is that what you're saying?
01:36:41.000 Well, you would probably be connected to artificial intelligence and it would greatly enhance your cognitive function and greatly enhance your access to information.
01:36:52.000 It would be instantaneous.
01:36:53.000 You would no longer have to read.
01:36:55.000 You would just have all the information.
01:36:57.000 It would just completely change the way you store information because you would probably have some sort of an external hard drive that connects to you.
01:37:06.000 It would be something where your memory is no longer fallible, but it's now infallible.
01:37:11.000 Okay.
01:37:11.000 It's going to be a perfect 4K memory or 8K memory.
01:37:14.000 You're going to be able to rewind.
01:37:16.000 I mean, wasn't there an episode of Black Mirror where they rewind their memories?
01:37:19.000 There's an interesting twist in this AI space.
01:37:22.000 Remember, you sent me that bot thing that was going around.
01:37:24.000 Yeah.
01:37:24.000 Yeah.
01:37:25.000 Oh, did you see this week?
01:37:26.000 Yeah, yeah, yeah.
01:37:26.000 This is what we're talking about.
01:37:27.000 This is a new twist on it.
01:37:29.000 I think if this is real, because grain of salt could be bullshit.
01:37:33.000 I'll just say that.
01:37:34.000 Like the L D C thing.
01:37:35.000 But if this is real, these bots have made a website where the other bots can rent a human to do tasks that the bots cannot physically do.
01:37:35.000 Yeah.
01:37:44.000 Well, that's slavery.
01:37:45.000 No, renting.
01:37:46.000 It's like jobs.
01:37:47.000 You're renting a human being.
01:37:49.000 A human has put themselves on this website.
01:37:52.000 Oh, humans put themselves on it.
01:37:53.000 For abilities to do whatever they want.
01:37:55.000 It's like the gig economy.
01:37:57.000 Yeah, you get paid by your robot bosses.
01:37:59.000 Is this the thing where the robots are inventing their own language that we can't read?
01:38:04.000 It's on this website, right?
01:38:05.000 Meatspace tasks.
01:38:07.000 Yeah, that's again, whether or not someone could have made this site to try to go viral.
01:38:11.000 I'll just go with a grain of salt with that.
01:38:13.000 Yeah.
01:38:14.000 But they might not have in the meat space.
01:38:16.000 Rentinghuman.ai is fun.
01:38:19.000 That's fun.
01:38:20.000 Well, you know, so the other thing is real, though, right?
01:38:23.000 The AI chat room where these AI agents have joined and now it's...
01:38:29.000 Yes and no.
01:38:30.000 Yes and no?
01:38:30.000 What do you mean?
01:38:31.000 Some of it, they are creating a space, but I've already seen places where people are taking advantage of it for viral reasons.
01:38:37.000 For instance, let's just assume it's real.
01:38:41.000 There was a polymarket bet that one of these bots would sue.
01:38:49.000 So someone actually just went ahead and filed a lawsuit on behalf of their bot and made it look like the bot did the thing.
01:38:54.000 Oh, so they can win the polymarket bet?
01:38:57.000 How regulated is that polymarket stuff?
01:38:57.000 Yeah, exactly.
01:39:00.000 Because it seems like you could get away with a lot.
01:39:02.000 It depends how much money is available.
01:39:03.000 As far as I know, it's just like if I put up 20 bucks for a bet, now there's only 20 bucks in the market.
01:39:09.000 So that's all that exists.
01:39:11.000 And more people have to back it up to make more money involved.
01:39:14.000 Right.
01:39:15.000 But if you have something where you have inside knowledge of it, is there any regulation?
01:39:21.000 There's supposed to be rules on the bets.
01:39:23.000 If I create one of those rules, you're supposed to, I think there's a caveat.
01:39:25.000 You can't have knowledge of it.
01:39:27.000 And that can cancel the bet.
01:39:28.000 Or I think if they find out later, I don't know.
01:39:31.000 Do you go to jail?
01:39:32.000 Like, what happens?
01:39:32.000 I don't know jail.
01:39:33.000 You probably just have to give back the bet, or it probably goes to like a civil lawsuit or something.
01:39:37.000 I don't know about that.
01:39:38.000 I don't know if it's against the law.
01:39:38.000 Interesting.
01:39:40.000 You know, the UFC is plagued with this issue.
01:39:42.000 They actually canceled a fight recently because there was suspicious betting.
01:39:46.000 And so there's been one fight.
01:39:50.000 So here's the story.
01:39:52.000 One guy apparently was injured and his teammates knew he was injured.
01:39:58.000 And so everyone started placing a bet for him to lose in the first round.
01:40:03.000 Right.
01:40:03.000 Because he apparently had a bad knee injury.
01:40:05.000 And so he knew that he couldn't fight.
01:40:08.000 And so the idea was, let's make a lot of money betting on me because he was the favorite.
01:40:13.000 Or rather, betting against me.
01:40:16.000 And so he would go in there and throw a kick, fall down, injured, get beat up.
01:40:22.000 They'd stop the fight.
01:40:23.000 And then all these people that knew he was injured make a ton of money.
01:40:26.000 And he was in on it.
01:40:27.000 Like he told them that.
01:40:28.000 Allegedly.
01:40:29.000 I just want to say allegedly.
01:40:30.000 But it's enough so that the team was removed from the UFC roster.
01:40:30.000 Okay, okay.
01:40:36.000 Like if you are competing for that team, you no longer can fight in the UFC.
01:40:39.000 You have to find a new gym.
01:40:40.000 The coach was no longer allowed to coach.
01:40:43.000 The fighter was banned.
01:40:44.000 And so then the FBI got involved and they said, well, there's a bunch of different fights that are suspicious.
01:40:50.000 So then a bunch of fighters came out and said, hey, somebody offered me $70,000 to lose.
01:40:56.000 And I said no.
01:40:57.000 And so then there was a fight recently between Michael Johnson and Alexander Hernandez, which is a fight I was really looking forward to, that was canceled last minute.
01:41:04.000 And I was like, what's going on?
01:41:05.000 They said suspicious betting activity.
01:41:08.000 And so someone was saying that Alexander Hernandez was injured and a bunch of money came in on him to lose.
01:41:14.000 He was actually a favorite going into the fight.
01:41:17.000 And that therefore rigged it.
01:41:19.000 Nope, didn't rig it because the FBI was informed.
01:41:22.000 I believe they were informed, but the UFC was informed and the UFC pulled the fight.
01:41:27.000 So they said, because of this suspicious betting activity, because a lot of late minute money came in on this one guy to win, we're going to pull this fight from the card and not allow this fight to take place and do a thorough investigation because something seems wrong because of the previous fight that they know was fixed.
01:41:44.000 But fighters have been doing that for ages, haven't they?
01:41:46.000 I mean, that's a thing that they've always done.
01:41:48.000 How does that connect then to the AI element that this website?
01:41:52.000 Well, we were talking about betting.
01:41:54.000 We were talking about polymarket.
01:41:54.000 Oh, I see.
01:41:56.000 We weren't talking about AI.
01:41:58.000 Yeah, yeah, yeah.
01:41:59.000 We were talking about polymarket bets and whether or not it's legal to have inside information.
01:42:03.000 I mean, I know that.
01:42:05.000 Polymarket privileged users made millions betting on war strikes and diplomatic strategy.
01:42:10.000 What did they know beforehand?
01:42:11.000 Privileged users.
01:42:12.000 Right.
01:42:13.000 So imagine if you're someone who's an aide to the Pentagon.
01:42:17.000 You know, you're working there and you know that we are going to bomb Iran.
01:42:22.000 And then there's a polymarket thing about it.
01:42:22.000 Yeah, yeah.
01:42:24.000 No one else knows.
01:42:25.000 Okay, okay, yes.
01:42:26.000 You know?
01:42:26.000 I mean, that's been going on forever, though, hasn't it?
01:42:28.000 People have always done that.
01:42:29.000 They've always manipulated.
01:42:30.000 That's a plot in pulp fiction, isn't it?
01:42:31.000 Where Bruce Willis's character bets on something he knows, because he's agreed to lose the fight.
01:42:35.000 He throws a fight.
01:42:36.000 So that he can make the money off.
01:42:38.000 It's that kind of thing.
01:42:38.000 Yeah, yeah.
01:42:39.000 Yeah.
01:42:39.000 Yeah.
01:42:40.000 That's always gone on.
01:42:41.000 But this polymarket thing is new. There's Kalshi, and then DraftKings has it now.
01:42:49.000 It's not actually gambling is the difference here.
01:42:51.000 You're speculating.
01:42:52.000 Yeah, you're not betting against the bookie or whatever.
01:42:55.000 You're betting against each other.
01:42:58.000 Right.
01:42:58.000 But the fact that they know about it and they know it's happening, that means they'll be able to crack down on it.
01:43:02.000 But I don't know because there's a lot of, there's so many options and possibilities.
01:43:08.000 Like unless you make a gigantic score and people start getting suspicious, if you're not greedy about it and you're kind of sneaking around a little bit here and a little bit there, I bet you could probably make a lot of money doing that.
01:43:18.000 But you think fighters and people like that and sports people generally, I mean, they're too proud, aren't they, to let something like that go just in case, just for money?
01:43:25.000 No.
01:43:25.000 No.
01:43:26.000 No, that's not true.
01:43:27.000 It depends on how much money they're making.
01:43:28.000 Look, if you're Anthony Joshua, I'd say, yeah, you're not going to do that.
01:43:31.000 You're very wealthy.
01:43:33.000 But if you're a guy who's on the undercard and you're only getting $10,000 to fight, but someone's giving you $100,000 to lose.
01:43:40.000 And you say, okay, I'm just going to box shitty tonight.
01:43:40.000 Yeah, okay.
01:43:43.000 Guys have done that forever.
01:43:44.000 Yeah, I guess so.
01:43:45.000 Just don't knock this guy out, whatever you do.
01:43:47.000 Carry him or carry him to the 10th round.
01:43:51.000 You know, there's a lot of that going on where they say, I have a bet that you're going to knock him out in the 10th round.
01:43:55.000 So knock him out in the 10th round only.
01:43:57.000 I don't think you'll ever be able to stop that.
01:43:58.000 No.
01:43:59.000 If that's going to happen.
01:44:00.000 No, I don't think so either.
01:44:01.000 I mean, that's gone on forever.
01:44:03.000 Yeah, yeah.
01:44:04.000 But isn't fighting like a kind of vocation, like a creative vocation for a lot of people?
01:44:08.000 Well, it is creative, believe it or not, because movement is creative.
01:44:13.000 You know, when you're fighting, you're not just running at each other, some guys do, but the really good guys don't just run at each other and charge.
01:44:20.000 There's feints and deception.
01:44:22.000 There's movement.
01:44:23.000 There's certain things that they're doing where they're reading your movement and trying to guide you in a particular direction and set you up.
01:44:30.000 Like boxers.
01:44:31.000 Boxers call it setting traps.
01:44:31.000 I can't believe it.
01:44:33.000 Yeah.
01:44:34.000 You got a bluff.
01:44:34.000 It's like playing.
01:44:35.000 Yes.
01:44:36.000 Yeah, absolutely.
01:44:36.000 Yeah, there's a lot of feinting involved in fighting.
01:44:40.000 There's a lot of fake movement to get you to react, and then they kick you when you settle in.
01:44:45.000 It's really creative.
01:44:47.000 It's just why, like, was it Faye Dunaway?
01:44:49.000 No, was it, who was it that said, you know, the older woman that said, and we're talking about the arts, and I don't mean mixed martial arts.
01:44:58.000 God, who was it?
01:44:59.000 What, like a kind of snobbish thing?
01:45:01.000 Glenn Close?
01:45:02.000 No, it wasn't her.
01:45:03.000 It was the lady from Bridges of Madison County.
01:45:07.000 Who was that?
01:45:09.000 That's Meryl Streep.
01:45:11.000 Meryl Streep, that's who it was.
01:45:12.000 Yeah, Meryl Streep said that.
01:45:15.000 It pissed off so many martial arts people.
01:45:18.000 Why, that Meryl Streep doesn't like it.
01:45:19.000 And I'm not talking about mixed martial arts.
01:45:23.000 Who thought you were?
01:45:24.000 Yeah.
01:45:25.000 Who thought you were, Meryl?
01:45:26.000 That's crazy.
01:45:27.000 I don't know.
01:45:28.000 She's pretty versatile.
01:45:29.000 But also, even though it's violent, you think it's not art just because you don't understand it.
01:45:29.000 She can do it.
01:45:35.000 If you understood it, you'd see it is art, and in fact some performances are beautiful.
01:45:40.000 Well, it's choreography, right?
01:45:41.000 In a way.
01:45:41.000 Yeah.
01:45:42.000 Well, it's not choreography at all.
01:45:44.000 It's ad libbing in the moment.
01:45:46.000 I mean, there's preconceived motions that you have that you're hoping that if the guy does this, you're going to do that.
01:45:54.000 And sometimes it works out.
01:45:56.000 But it's like the poetry of movement of a really sublime fighter like Anderson Silva in his prime.
01:46:03.000 It was beautiful to watch.
01:46:05.000 I believe you.
01:46:07.000 I have very limited experience of this.
01:46:08.000 I did kung fu when I was 12 and I stopped because I got so bruised.
01:46:13.000 I got so hurt.
01:46:14.000 I was too cowardly.
01:46:15.000 But you know, people impose their own standards on other people and their own ideas of what things are, you know, from the outside.
01:46:23.000 And, you know, it's kind of silly.
01:46:24.000 Yeah.
01:46:25.000 Oh, Joe, I was going to tell you about this Berkeley thing.
01:46:27.000 And I almost sidetracked.
01:46:29.000 We got into elephants.
01:46:31.000 But I think this was a natural segue.
01:46:34.000 Because I think this encapsulates all of the stuff you were talking about, which is that I was going to this, basically Charlie Kirk's tour.
01:46:40.000 It was meant to go on.
01:46:42.000 Berkeley was the last date.
01:46:43.000 And Rob Schneider had agreed to do it.
01:46:46.000 And apparently he'd said to Charlie, you know, what's the craziest place you could take me to?
01:46:51.000 Berkeley's going to be the crazy.
01:46:51.000 And he said, Berkeley.
01:46:52.000 Let's do that.
01:46:53.000 So he was already booked to do it.
01:46:55.000 After what happened with Charlie, Rob asked me if I'd come along as well.
01:46:59.000 And so we'd be on a panel.
01:47:01.000 And I had no idea of the extent of the problem, right?
01:47:03.000 So, and I'm sure you know a lot more than I do.
01:47:07.000 But I turned up.
01:47:08.000 We were there.
01:47:09.000 We turned up and there were men with guns.
01:47:10.000 We were in an SUV under the ground.
01:47:12.000 We got into this venue.
01:47:14.000 And suddenly the security starts showing me footage from outside.
01:47:18.000 And people are, it's like a war zone.
01:47:19.000 People are throwing smoke bombs.
01:47:22.000 They're trying to crash through the railings.
01:47:24.000 Some guy gets beaten up.
01:47:25.000 He's covered in blood because he was wearing a t-shirt with turning point written on it.
01:47:29.000 And I'm suddenly realizing, you know what?
01:47:32.000 This is a fantasy world that we're now occupying.
01:47:34.000 We're now occupying a world where the people outside think the world is this and what's going on inside is completely disconnected from it.
01:47:41.000 And I actually found it quite depressing because when I was sitting on stage talking to Rob and Peter Boghossian and Frank Turek, these people of completely different viewpoints, we're just having a chat.
01:47:50.000 Outside, they're smashing things, they're screaming, they're saying that fascists have overrun the university.
01:47:56.000 And I'm thinking, just to come back to that point you made about, you know, that need for discussion, that experience made me think, actually, now what's happening is we're living in two separate worlds at the same time.
01:48:08.000 And we can't see what the other side is, what the intentions of the other side are.
01:48:13.000 And I don't know how you resolve that.
01:48:14.000 I think that's that to me sort of encapsulated the entire problem.
01:48:18.000 Well, at this point, it's going to be very difficult to resolve.
01:48:21.000 And I honestly think it's going to take a generation to work through it.
01:48:25.000 But isn't it as simple as people learning what the word fascist means, for instance?
01:48:29.000 It's not just that.
01:48:30.000 It's like they firmly believe that they are trying to fight against something that is going to destroy democracy in this country, which is conservative values.
01:48:40.000 But we had that with the No Kings.
01:48:41.000 So there's a No Kings march.
01:48:43.000 And I couldn't figure that out.
01:48:44.000 I was trying to figure out what are they.
01:48:45.000 This is an elected leader.
01:48:47.000 Well, you know, it's all organized, right?
01:48:49.000 You know, this is all funded.
01:48:50.000 Okay, it is.
01:48:50.000 It's organized.
01:48:52.000 So this was Mike Benz's point when he was talking about the defunding of USAID and what they use that money for.
01:49:00.000 NGOs get a bunch of money and they fund a bunch of things, particularly in other countries, where they're essentially making it look like there's these on-the-ground street protests that are very organic.
01:49:14.000 But it's not.
01:49:15.000 It's very organized and it's very funded.
01:49:17.000 And the idea is to start chaos.
01:49:19.000 So I've seen people get caught out, people who are clearly being paid, who appear at various different.
01:49:23.000 It's not just that.
01:49:25.000 It's also email campaigns.
01:49:27.000 It's indoctrinating people into this particular ideology by supporting universities.
01:49:32.000 So you funded in advance.
01:49:34.000 So it's like decades of – this is – I'm sure you've seen the Russian guy from 1984, 1985, Yuri Bezmenov, talking about the – Remind me.
01:49:48.000 You've never seen it?
01:49:49.000 I don't think I've seen it.
01:49:50.000 It's a wonderful video because it shows you exactly what happened, how they're going to introduce Marxism and Leninism into universities, and then it'll indoctrinate children, and then those children will be poisoned, and within one generation, it'll ruin the United States' entire educational system.
01:50:06.000 So that's the law.
01:50:08.000 Yeah, that's the law.
01:50:09.000 You should watch a little bit of that because it's crazy.
01:50:12.000 Because back then, I remember the 1980s.
01:50:16.000 That would be a crazy idea.
01:50:18.000 No, universities are where people have free thought and discussion.
01:50:20.000 It's very important.
01:50:22.000 And I was in a very left-leaning place at the time.
01:50:25.000 I was living in Boston.
01:50:28.000 Probably more universities per capita than anywhere else in the country, at least at the time.
01:50:32.000 And it was a very well-read city.
01:50:35.000 The idea that universities are going to destroy the way human beings interact and debate is preposterous.
01:50:41.000 But this guy was talking about this back then: that the Soviets had planned this in advance, and that they had essentially subverted our entire education system, and thereby those people would leave those schools indoctrinated and enter into the workforce with these new ideas and a universal acceptance that these ideas are correct.
01:51:00.000 And then it would, in turn, you know, the butterfly effect.
01:51:03.000 But do you think that everyone, I don't, I can't be sure that it's as conspiratorial as that, because there must have been a lot of people who just got on board with the ideas.
01:51:10.000 Well, there's a lot of money involved in doing this.
01:51:13.000 There's a lot of funds that have come from China.
01:51:15.000 There's a lot of money that has been donated to these universities.
01:51:18.000 Like, find that video.
01:51:21.000 Okay, I found it, but there's like a second version on Twitter I've never seen before.
01:51:24.000 An AI-moderated person.
01:51:26.000 No, no, no.
01:51:27.000 He's now in a wig.
01:51:28.000 Oh, I recognize him.
01:51:30.000 So listen to what he says.
01:51:33.000 Is spent on espionage as such.
01:51:35.000 The other 85% is a slow process, which we call either ideological subversion or active measures, aktivnye meropriyatiya in the language of the KGB, or psychological warfare.
01:51:49.000 What it basically means is to change the perception of reality of every American to such an extent that despite of the abundance of information, no one is able to come to sensible conclusions in the interest of defending themselves, their families, their community, and their country.
01:52:11.000 It's a great brainwashing process which goes very slow and it's divided in four basic stages.
01:52:20.000 The first one being demoralization.
01:52:23.000 It takes from 15 to 20 years to demoralize a nation.
01:52:26.000 Why that many years?
01:52:28.000 Because this is the minimum number of years which requires to educate one generation of students in the country of your enemy, exposed to the ideology of the enemy.
01:52:41.000 In other words, Marxism-Leninism ideology is being pumped into the soft heads of at least three generations of American students without being challenged or counterbalanced by the basic values of Americanism, American patriotism.
01:52:56.000 The result, the result you can see, most of the people who graduated in the 60s, dropouts or half-baked intellectuals, are now occupying the positions of power in the government, civil service, business, mass media, educational system.
01:53:11.000 You are stuck with them.
01:53:12.000 You cannot get rid of them.
01:53:14.000 They are contaminated.
01:53:15.000 They are programmed to think and react to certain stimuli in a certain pattern.
01:53:20.000 You cannot change their mind.
01:53:22.000 Even if you expose them to authentic information, even if you prove that white is white and black is black, you still cannot change the basic perception and the logic of behavior.
01:53:35.000 In other words, these people, the process of demoralization is complete and irreversible.
01:53:42.000 To get rid society of these people, you need another 20 or 15 years to educate a new generation of patriotically minded and common sense people who would be acting in favor and in the interests of the United States society.
01:54:03.000 And yet these people who have been programmed and, as you say, in place and who are favorable to an opening with the Soviet concept, these are the very people who would be marked for extermination in this country.
01:54:14.000 Most of them, yes.
01:54:16.000 Simply because the psychological shock, when they will see in future what the beautiful society of equality and social justice means in practice, obviously they will revolt.
01:54:31.000 They will be very unhappy, frustrated people.
01:54:34.000 And the Marxist-Leninist regime does not tolerate these people.
01:54:39.000 Obviously, they will join the ranks of dissenters, dissidents.
01:54:44.000 Unlike in present United States, there will be no place for dissent in future Marxist-Leninist America.
01:55:02.000 Here you can get popular like Daniel Ellsberg and filthy rich like Jane Fonda for being dissident, for criticizing your Pentagon.
01:55:02.000 In future these people will be simply squashed like cockroaches.
01:55:07.000 Nobody is going to pay them nothing for their beautiful noble ideas of equality.
01:55:12.000 This they don't understand and it will be greatest shock for them of course.
01:55:17.000 The demoralization process in the United States is basically completed already.
01:55:23.000 For the last 25 years, actually it's overfulfilled, because demoralization now reaches such areas where previously not even Comrade Andropov and all his experts would even dream of such a tremendous success.
01:55:38.000 Most of it is done by Americans to Americans, thanks to lack of moral standards.
01:55:44.000 As I mentioned before, exposure to true information does not matter anymore.
01:55:51.000 A person who was demoralized is unable to assess true information.
01:55:56.000 The facts tell nothing to him.
01:55:59.000 Even if I shower him with information, with authentic proof, with documents, with pictures, even if I take him by force to the Soviet Union and show him concentration camp, he will refuse to believe it until he is going to receive a kick in his fat bottom.
01:56:18.000 When a military boot crashes, then he will understand, but not before that.
01:56:23.000 That's the tragic of the situation of demoralization.
01:56:28.000 Well, he's describing the situation as it is at the moment, right?
01:56:30.000 And he's describing it in 1984.
01:56:32.000 However, that doesn't prove that what he's describing, that intention to create that kind of chaos, was implemented and executed in the way that he describes.
01:56:41.000 He's describing that, he's talking about a program that they implemented.
01:56:47.000 So they had actual people in universities planted in universities to deliberately execute this idea.
01:56:52.000 Yeah, and they planned it in advance.
01:56:55.000 This is what he was saying.
01:56:56.000 And he's saying this before we even realized that it happened.
01:56:59.000 I agree, that's scary.
01:57:00.000 It is scary because it did happen.
01:57:03.000 But that doesn't fully explain why it caught on.
01:57:05.000 Why did academics who were clearly not plants, why did they catch on with this?
01:57:09.000 Well, they don't live in the fucking real world.
01:57:11.000 This is the problem with academics.
01:57:12.000 They go right from universities to teaching positions.
01:57:16.000 I mean, this whole thing.
01:57:17.000 They don't have any real world experience.
01:57:18.000 I mean, this whole idea of the long march through the institutions, it's there in Rudi Dutschke.
01:57:22.000 It's there.
01:57:23.000 It was said, we're going to do this.
01:57:24.000 We're going to infiltrate the major organizations, institutions, the church.
01:57:28.000 We're going to, over a very long period of time, change society in the way that we want to see it.
01:57:35.000 I think what's happened is, I think that intention was there.
01:57:37.000 I think what he's saying is very eerily describing what's happening now, the demoralization and the detachment from truth.
01:57:43.000 But I don't think it necessarily came about as systematically as that.
01:57:47.000 How do you think it came about?
01:57:49.000 Well, for one thing, I think what we're facing now isn't quite the template that Marx would have had in mind, right?
01:57:55.000 Because for one thing, there's no emphasis on class or money or the economy or anything.
01:58:01.000 Well, insofar as Marxism has become about group identity, with the left substituting identity for class.
01:58:06.000 "Soak the rich" is a giant mantra that people chant in the streets.
01:58:10.000 That's true.
01:58:10.000 That's true.
01:58:11.000 They're trying to tax billionaires.
01:58:13.000 But it's incoherent because it's from people who've got money.
01:58:16.000 It's from the upper middle classes.
01:58:17.000 It's pretty coherent.
01:58:19.000 It's all just something, a narrative that you give the unwashed masses, and then they run with it.
01:58:25.000 Well, I wonder whether it caught on, partly through what became fashionable, what became trendy, but also because any ideology says to you, you don't have to do anything anymore.
01:58:33.000 You can outsource that to us.
01:58:35.000 You've got a set of rules.
01:58:37.000 And these are the rules that you've got.
01:58:38.000 People love that.
01:58:39.000 Well, it's why you've got people who are, well, it's why you've got queers for Palestine.
01:58:42.000 Right.
01:58:43.000 You know?
01:58:43.000 That can only exist when you're following a set of rules and not thinking about it for two seconds, right?
01:58:47.000 That's a wonderful group.
01:58:49.000 I actually thought that was fake when I first heard about it, which must be about five years ago.
01:58:53.000 You've seen the other meme.
01:58:54.000 I thought it was unreal.
01:58:55.000 You've seen that meme?
01:58:56.000 Which one?
01:58:57.000 Queers for Palestine, and then Palestine for queers.
01:58:59.000 Oh, and I imagine.
01:59:00.000 They're throwing people off the roofs.
01:59:01.000 Of course they are.
01:59:02.000 Of course they are.
01:59:03.000 I just say, go and do, go, go there and see what you see.
01:59:08.000 See what you experience.
01:59:09.000 Go there as a man in a dress wearing lipstick with a beard.
01:59:12.000 Good luck.
01:59:12.000 Yeah, I just did a Titania tweet of a drag queen touring the Middle East, and she's touring all these venues.
01:59:18.000 And she's got the sort of Palestine dress and the sort of the glam kind of Arabic look.
01:59:25.000 It's like, just go there and see what happens.
01:59:28.000 But that kind of cognitive dissonance can only work if you are ideologically driven.
01:59:34.000 And I think, so I suppose what I mean is I think the appeal of ideology is what explains, not a kind of, we've implanted these agents here.
01:59:44.000 They're going to lead to this.
01:59:45.000 They're going to lead to this.
01:59:46.000 It has to also be complicity that comes from implanting ideas.
01:59:52.000 Those ideas take hold and then groupthink takes it from there.
01:59:55.000 But isn't it a shame that universities of all places, the place where you go to be challenged and the place where you go.
02:00:00.000 I mean, I was thinking that when I was at Berkeley and, you know, I was sitting on the stage and there's all these men with guns all around the theater because, of course, what happened with Charlie.
02:00:09.000 And I'm thinking, it's like the end of the Blues Brothers, you know, where you're on stage and all the people are waiting.
02:00:13.000 It felt weird.
02:00:14.000 And I thought, this is not what a university is or should be.
02:00:19.000 And the other thing that I thought is a lot of those people outside protesting weren't students.
02:00:24.000 They'd sort of come in.
02:00:25.000 They'd been bussed in.
02:00:26.000 So maybe that feeds into what you were saying about, you know, this is all 100% planned and how are they getting bussed in?
02:00:32.000 Who's funding them?
02:00:33.000 People are paying a lot of money to do that.
02:00:33.000 Right.
02:00:36.000 And they're doing it all over the country.
02:00:36.000 Right.
02:00:38.000 But they did it during the presidential elections.
02:00:40.000 Yeah.
02:00:41.000 During the presidential elections, they were tracking cell phones from place to place, and they realized that there was a group of people that were paid attendees at Kamala Harris's rally.
02:00:52.000 Oh, yeah, I remember that.
02:00:53.000 And so they were getting paid.
02:00:55.000 Their job was to show up and cheer for Kamala Harris.
02:00:58.000 Do you think fundamentally then the Democrats are anti-democratic?
02:01:02.000 I think fundamentally anybody that doesn't have organic support is going to figure out a way in this environment to drum it up.
02:01:12.000 And if you can do that through a service, or if you could do that through an NGO, or if you could do that through a company that'll hire people to show up at your rallies, they do it because they want to win and they want to get into a position of power.
02:01:25.000 And one of the things that we do find with Trump is that it actually turns out the president can do a lot.
02:01:31.000 You know, and we used to think that they were kind of handcuffed and they weren't able to do as much and that's why nothing ever got done.
02:01:31.000 Yeah.
02:01:37.000 Turns out that doesn't seem to be true.
02:01:40.000 You get a maniac in office.
02:01:42.000 You can kind of get away with a lot of things.
02:01:43.000 You can do a lot of different things.
02:01:44.000 That's what we sort of need in the UK.
02:01:46.000 We need someone to come in and strip away.
02:01:48.000 We need what Bezmenov was saying: a whole generation that's taught that being patriotic and having morals and ethics is actually a good thing.
02:02:00.000 And that free speech is important and that to be able to debate ideas is essential to any sort of true society that considers itself an elevated modern version of what we hope for when this country was founded.
02:02:23.000 It wasn't founded on the idea that you have to adhere to one ideology and this ideology thinks that gender is not real and no one can answer what a woman is.
02:02:34.000 That's crazy.
02:02:35.000 That's become popular.
02:02:36.000 Well, we see in America, like America's the kind of life raft for the world, that you've got all these things built into your political system.
02:02:42.000 Yeah.
02:02:42.000 And that's why it's so scary when you see people.
02:02:45.000 Do you remember the vice presidential debate between JD Vance and Tim Walz?
02:02:49.000 And Tim Walz said that the First Amendment doesn't cover hate speech.
02:02:53.000 It doesn't cover misinformation.
02:02:55.000 Exactly.
02:02:56.000 He's a dangerous fuck.
02:02:57.000 Like, that's scary.
02:02:58.000 If the guy who might be vice president is saying, actually, we're going to strip out all of this stuff.
02:03:03.000 Also, just the way he behaves is so odd.
02:03:07.000 The way he waves and runs on stage, it's all just so fake and performative.
02:03:12.000 I don't know any men like that that aren't dangerous.
02:03:15.000 Why was he picked?
02:03:16.000 Probably because of the Minnesota stuff.
02:03:20.000 It probably had something to do with what he was allowing to happen in Minnesota.
02:03:23.000 They're probably making a ton of money.
02:03:25.000 Okay, okay, maybe.
02:03:25.000 Right.
02:03:27.000 There's a reason why he had to resign.
02:03:28.000 I mean, I'm clearly speculating.
02:03:31.000 I have no idea, and I'm a moron when it comes to politics.
02:03:34.000 But what I would assume is that for sure, he was informed of this fraud long in advance.
02:03:43.000 If it wasn't for that Nick Shirley kid and those videos, and apparently Nick Shirley had been informed by the GOP there that this was all going on.
02:03:43.000 Right, right.
02:03:52.000 So this gets exposed.
02:03:53.000 It gets into the public zeitgeist.
02:03:54.000 It becomes a huge news story.
02:03:56.000 It's not a coincidence that the riots break out in the exact same place where all this fraud is being exposed.
02:04:02.000 Because ICE is everywhere.
02:04:04.000 They're all over the place.
02:04:04.000 But the most violent interactions are the ones happening in the place where the most fraud has been publicly exposed.
02:04:13.000 This is all by design.
02:04:14.000 There's something very scary about it.
02:04:16.000 Yeah, and so this guy knew about it in advance.
02:04:18.000 How do we know?
02:04:19.000 Well, one way we know is because he's resigning.
02:04:22.000 So there must be something.
02:04:23.000 Right, there's something.
02:04:24.000 He's not running for governor again.
02:04:26.000 He was in the process of running for governor.
02:04:28.000 He's decided to step out of public office entirely now.
02:04:31.000 So maybe they told him if you do not step out, you are going to be prosecuted.
02:04:35.000 We know what you did.
02:04:37.000 Or maybe he's going to fucking turn states evidence.
02:04:40.000 Who fucking knows?
02:04:41.000 Imagine if he had won, him and Kamala Harris, if they would have been in charge.
02:04:45.000 I don't think I would have come here.
02:04:47.000 Elon doesn't buy Twitter and Kamala Harris wins and Tim Walz is our vice president.
02:04:52.000 But doesn't that just tell you how fragile freedom is?
02:04:55.000 How close you are.
02:04:56.000 Very fragile.
02:04:57.000 And that's why people support Donald Trump.
02:04:59.000 And the people that think that they support him because he's a racist and all these different things.
02:05:02.000 No, no, no.
02:05:02.000 They support it because it's an alternative to what we all saw coming.
02:05:06.000 No one's excited that ICE is killing people in the streets.
02:05:10.000 No one likes that.
02:05:11.000 No, of course not.
02:05:11.000 You have to be fucking insane if you think those people should be just getting shot like that.
02:05:16.000 That's nuts.
02:05:17.000 But what they don't want is what the government was previously doing.
02:05:21.000 They had a completely open border.
02:05:23.000 They were bussing people into swing states.
02:05:26.000 They were trying to pretend that this was all organic.
02:05:29.000 And it's not.
02:05:30.000 It's not.
02:05:31.000 They had a plan.
02:05:32.000 And they did it in a sneaky way where they looked like the really kind, ethical, equitable, and inclusive crowd.
02:05:39.000 Right.
02:05:39.000 But that's the woke story all over again.
02:05:41.000 It was the woke story applied to geopolitics.
02:05:41.000 Exactly.
02:05:44.000 It was the woke story applied to the whole political process, which in this country was dependent upon the census, and the census doesn't count citizens.
02:05:53.000 The census just counts humans.
02:05:55.000 And so you get more congressional seats.
02:05:57.000 You get more electoral points.
02:05:59.000 The whole thing is nuts.
02:06:00.000 I mean, I like to think that not all Democrats are into that.
02:06:03.000 Not all Democrats are about power for its own sake.
02:06:05.000 Of course not.
02:06:06.000 But the problem is it's a party.
02:06:07.000 Like if you work for a corporation and you're a good person, but the corporation is polluting a river in Guatemala, there's a diffusion of responsibility because you're a part of a giant system.
02:06:17.000 And hey, I'm just an accountant.
02:06:19.000 I go to work and I do my thing for Exxon or Mobil or whatever it is.
02:06:23.000 Well, I'd say for however messy all of this has become in the U.S., at least you've had some sort of attempt to strip out the very stuff that that guy was talking about.
02:06:31.000 The fact that the civil service is all one way, the fact that the machinery of government, that was the plan, right?
02:06:36.000 So the machinery of government works in a certain way.
02:06:38.000 So there's no democratic means of getting rid of it.
02:06:40.000 There's no way to change it.
02:06:41.000 Well, I think the counter to that is the education that the internet provides.
02:06:46.000 And that's what they didn't anticipate in 1984.
02:06:49.000 So the education that the internet provides is untethered.
02:06:53.000 But then the internet tells us that Christopher Nolan's just made a film with Black Helen of Troy.
02:06:57.000 And he hasn't.
02:06:57.000 Right.
02:06:58.000 It produces all sorts of unsavory things, too.
02:07:01.000 Yeah.
02:07:02.000 But it also allows the distribution of information that would be impossible through normal means.
02:07:07.000 If these people are, as he said, in control of major media, which they were, in control of universities, which they are, and then it goes on to be the only way people get information, now your information is very heavily filtered, and then all that stuff works.
02:07:22.000 But that's why the technocrats in the EU, why ideologues generally, are against the internet, or they want to censor it.
02:07:29.000 That's why Macron is trying to stop X in France.
02:07:33.000 Or whoever's trying to stop it.
02:07:35.000 So the EU, the head of the EU Commission is Ursula von der Leyen.
02:07:38.000 Did you hear her?
02:07:39.000 That's a great name, by the way.
02:07:40.000 Well, yeah, it's a sexy name, right?
02:07:42.000 Yeah.
02:07:43.000 She's unelected.
02:07:45.000 The European Commission is an unelected body that sets the legislative agenda of all these European countries.
02:07:50.000 Absolutely crazy.
02:07:51.000 You can't vote them out.
02:07:52.000 She did a speech last May where she said, and I'm not joking about this.
02:07:57.000 She said that misinformation was like a virus and you need to inoculate yourself against the virus.
02:08:03.000 And the phrase she used is not debunking, pre-bunking.
02:08:06.000 So pre-bunking is her idea of what you do with misinformation.
02:08:10.000 What she means is censorship.
02:08:12.000 But pre-bunking is the most sinister.
02:08:14.000 That's crazy.
02:08:15.000 Like, if you were to say, I'm going to come up with the most Orwellian, sort of dark-lord kind of shit, it's pre-bunking.
02:08:15.000 Chilling.
02:08:23.000 Yeah, that's like fucking Minority Report, right?
02:08:25.000 I don't know what the...
02:08:27.000 Because I know that there's this free speech debate opening up between the US and Europe generally.
02:08:32.000 Like, you know, when JD Vance came over to Munich and gave that talk to all the European leaders and said, you've got to stop censoring your people.
02:08:39.000 You've got to stop running away from voters.
02:08:41.000 And they were shocked and they were horrified.
02:08:42.000 But he was dead right.
02:08:43.000 He's dead right.
02:08:44.000 And he should.
02:08:45.000 And you know what?
02:08:45.000 People on the left should admit that he's dead right as well.
02:08:48.000 But there's something about Europe, right?
02:08:50.000 There's something about, like I think over here, coming over here, I get the sense that most left-leaning people, as well as right-leaning people, do value free speech as a kind of shared value.
02:09:02.000 And in Europe, it's not that.
02:09:03.000 There's a real sense of we can't trust the masses.
02:09:07.000 Because I know that the EU is seen as this big lefty thing, which it absolutely is not.
02:09:11.000 The EU is a body that wants to censor its citizens.
02:09:15.000 It's a body that tells people, you can have a referendum, but if you get it wrong, we're going to make you vote again.
02:09:20.000 It's not a democratic organization.
02:09:22.000 So no wonder Vance is sort of, and Trump is at loggerheads with this body.
02:09:26.000 Because you've got these.
02:09:27.000 We in the UK have an authoritarian leader, Keir Starmer, the Prime Minister.
02:09:31.000 He couldn't be further away from the American ideal of free speech.
02:09:35.000 He introduced this online safety bill, which is basically this is why a lot of tweets in the UK, if you go over to the UK now, a lot of the tweets will come up saying this is potentially harmful content.
02:09:44.000 So we're screening it out.
02:09:46.000 He, you know, they're trying to get rid of juries for certain trials.
02:09:49.000 They're trying to get rid of juries.
02:09:50.000 They already did.
02:09:51.000 And that's particularly dangerous because some of those cases are for speech crime, right?
02:09:58.000 So I'll give you an example.
02:09:59.000 There was a Royal Marine called Jamie Michael who had made a video just saying we need to peacefully protest against the migration issue.
02:10:06.000 They took him to court for stirring up racial hatred.
02:10:10.000 But the jury is what let him off.
02:10:12.000 It was the jury that saved him.
02:10:14.000 In this new system, there wouldn't be a jury there and he would be in prison.
02:10:18.000 Yeah.
02:10:19.000 And most certainly would be in prison.
02:10:21.000 So I kind of feel like, and we've got Keir Starmer now for another three years.
02:10:25.000 Every decision that he makes is about not trusting the public, censoring what they think.
02:10:30.000 If he could get rid of X, he absolutely would.
02:10:32.000 Is it possible that someone sensible could win in three years?
02:10:36.000 Or is the system so deeply entwined in the ideology of the English people that it's just stuck?
02:10:44.000 This is what I think about that.
02:10:45.000 Because I look at America and I think, in a way, you had your culture war election because of Trump, right?
02:10:50.000 You know, I mean, a lot of people say the culture war doesn't matter.
02:10:50.000 Yeah.
02:10:53.000 Of course it does.
02:10:54.000 Of course it matters.
02:10:55.000 I mean, did you see about that the advert that the GOP put out?
02:10:59.000 You know, Kamala Harris is for they them, Trump is for you.
02:11:01.000 That was the slogan.
02:11:03.000 It was about the Democrats wanting to fund transgender surgery for prisoners.
02:11:08.000 And Donald Trump's team had this advert, Kamala Harris for they them.
02:11:12.000 Donald Trump is for you.
02:11:14.000 That actuated a 2.7-point shift in favor of Donald Trump among everyone who saw it.
02:11:18.000 It was a major success.
02:11:20.000 That just shows that these issues, these cultural war issues, people do care.
02:11:24.000 And people do vote.
02:11:24.000 Oh, that's true.
02:11:25.000 But you had a way in America to vote that stuff out through Trump, right?
02:11:29.000 We've never had that.
02:11:31.000 But they barely had a way.
02:11:32.000 Like, if they had more time, they wouldn't have.
02:11:36.000 You mean that if the Democrats had to do it?
02:11:38.000 If the Democrats won this time and then they tried to do it again in 2028, Elon was really adamant about that during the last election.
02:11:45.000 Like, this might be the last real election we have if you don't stop this now.
02:11:50.000 Because they have an open border, and in the last four years, they've pulled 10, at least 10 million people into this country.
02:11:59.000 And they've changed the electoral map.
02:12:01.000 And then on top of that, there was both Schumer and Nancy Pelosi openly talking about letting these people vote.
02:12:11.000 Openly talking about giving these people a path to citizenship.
02:12:14.000 And they had already put them on Medicaid.
02:12:16.000 They had already put them on Social Security.
02:12:18.000 They're giving them EBT cards.
02:12:20.000 They were housing them at the Roosevelt Hotel in New York City.
02:12:23.000 They were giving them money and helping them get to these states.
02:12:26.000 They were flying them through into America and putting them in these places because they were trying to get voters.
02:12:34.000 So another four years.
02:12:35.000 Another four years, they might have had it completely locked up.
02:12:38.000 You know, that's what the Democrats have said about the Republicans.
02:12:41.000 I mean, Oprah Winfrey was saying this might be the last election we have if we don't vote for Kamala.
02:12:46.000 Oprah Winfrey had Donald Trump on her show years ago.
02:12:51.000 asking him to be president.
02:12:52.000 Yeah, they were mates, yeah, yeah.
02:12:54.000 Oh, yeah.
02:12:55.000 Look, they all get captured.
02:12:56.000 They all get captured by groupthink and ideology, and they all get captured by money and protecting it and who's going to protect them.
02:13:04.000 But we don't have that safety valve in the UK.
02:13:07.000 So like I say, you were able to, for all the imperfections, you were able to vote in an administration that was actually going to rip out that whatever you call it.
02:13:15.000 This system is better.
02:13:16.000 It showed the system is better.
02:13:18.000 Even though the system was trying to get rigged, enough people revolted against it.
02:13:23.000 Yes.
02:13:23.000 But look at the ideas that you're attaching to this administration.
02:13:29.000 Like, look, the ICE stuff is horrific.
02:13:31.000 The people getting shot, it's horrific.
02:13:33.000 We all agree to that.
02:13:34.000 There's a lot of the authoritarian aspects.
02:13:36.000 It's horrific.
02:13:37.000 But what they've stopped is all of this illegal immigration.
02:13:41.000 They've stopped all the illegal immigration.
02:13:44.000 Legal immigration is still available.
02:13:46.000 And then what they've also done is investigate literally billions of dollars in fraud, and they're uncovering it over and over and over and over again.
02:13:54.000 So there was obviously crime that was going on that was not being addressed by the previous party.
02:13:59.000 And this is one of the reasons why they didn't want the Republicans getting in in the first place.
02:14:03.000 So they still have to label them in the most horrific ways possible, accentuate all the negative aspects of what's going on with the ICE stuff, but not talk at all about the economy taking an uptick, not talk at all about GDP, not talk at all about tariffs being effective, not talk at all about any of the positive things.
02:14:22.000 Stopping wars.
02:14:23.000 He stopped wars in multiple different countries.
02:14:25.000 Stopped conflicts.
02:14:27.000 No one's talking at all in an objective sense.
02:14:30.000 This is a Nazi party.
02:14:32.000 These are fascists.
02:14:33.000 We have to have no kings.
02:14:35.000 Stop the fascists.
02:14:36.000 So these narratives are just being pushed out there constantly by the media.
02:14:42.000 All the while these politicians are absolutely terrified that these investigations are going to start moving into their states and uncovering more and more and more fraud, which they're going to.
02:14:52.000 I mean, I think it's so reckless as well for the Democrats to, like you say, paint ICE as Nazis, to talk about how this is the equivalent of the Gestapo; I think someone used that phrase.
02:15:01.000 I mean, I know what you're saying about the shootings, obviously we all agree it's absolutely horrific.
02:15:04.000 Any kind of situation where the police inflict that kind of violence on someone needs to be thoroughly investigated and looked into and all the rest of it.
02:15:10.000 But I'm concerned about the politicians saying, no, go there.
02:15:13.000 Get in the way of federal agents while they're enforcing the law.
02:15:16.000 They're just trying to be popular.
02:15:19.000 They're putting people's lives at risk, aren't they?
02:15:22.000 But it's that chess move again, giving up your queen to take a rook, because you just want the immediate win.
02:15:31.000 Well, it's working, insofar as the public is turning against Trump because of what's happening with ICE.
02:15:35.000 I mean, that's what?
02:15:36.000 There's certainly a lot of that.
02:15:37.000 Yeah, there's certainly a lot of that.
02:15:38.000 The narrative is out there, but it's dependent upon how far it goes.
02:15:43.000 Yeah right, they've got to de-escalate this violence.
02:15:46.000 Yes, they've got to make sure that that.
02:15:48.000 But you also need support of local police.
02:15:51.000 You can't have people attack the hotels where these ICE people are staying and have no support whatsoever by the police.
02:15:57.000 That's crazy.
02:15:58.000 They're being told to stand down.
02:16:00.000 So this is messy stuff, yeah, but look how hard it was.
02:16:03.000 I mean, you talk about how Trump has come in and he's stripped away all this stuff and this fraud.
02:16:09.000 But he didn't do it in the first term. It's only when he got to the second term, and it was planned, and he had DOGE set up and he had Musk in place, that all of this deep state stuff could be identified and stripped out and worked out.
02:16:20.000 A lot of deep state people in his cabinet.
02:16:22.000 The first term he didn't know, so he couldn't work against it, right.
02:16:25.000 But we can't.
02:16:25.000 In the UK, just to sort of explain where I think we are, we can't do that, because the two major parties are both ideologically in lockstep, effectively, right?
02:16:36.000 So, I mean, most of the woke stuff was pushed through by the Conservative Party.
02:16:40.000 They were in power for 13 years.
02:16:42.000 They're ostensibly right-wing.
02:16:45.000 They pushed through all the gender self-recognition stuff.
02:16:48.000 Why do you think the Conservatives did that?
02:16:49.000 So why is a good question?
02:16:51.000 So the Prime Minister, Theresa May, Conservative Prime Minister at the time, she said in her autobiography, I'm woke and proud.
02:16:59.000 Can you imagine Trump saying that?
02:17:01.000 It's the equivalent.
02:17:02.000 It's the equivalent.
02:17:03.000 So I think it's because something about this ideology infected every side of the political aisle, particularly in the UK.
02:17:11.000 What might happen now in the UK is reform are probably going to win the next election.
02:17:16.000 That's in three years' time.
02:17:17.000 And that's so seismic because it will blow apart this two-party system that we've got.
02:17:22.000 That probably couldn't happen in America, right?
02:17:23.000 You probably couldn't get it.
02:17:24.000 Do you have a third party that can win?
02:17:25.000 We have a third party that can win.
02:17:27.000 That's new.
02:17:27.000 Really?
02:17:28.000 And we haven't had that for a long, long, long, long time.
02:17:31.000 But you think what is the possibility that it could win?
02:17:33.000 You think it's 50-50?
02:17:34.000 Look at it this way.
02:17:35.000 We've been sort of veering massively from, you know, the Conservatives under Boris Johnson won this mad, mad majority, like 80-seat majority, and they could do whatever they want and they squandered it.
02:17:44.000 People were so resentful of what happened with Johnson, who, by the way, let in more migration, legal migration, than we've ever had, right?
02:17:52.000 Did he do that for cheap labor?
02:17:54.000 I mean, I think that's certainly part of it.
02:17:54.000 Probably.
02:17:56.000 Certainly that's part of it.
02:17:58.000 That's a problem that conservatives don't want to admit they were part of.
02:18:01.000 You know, I had a conversation with a very prominent politician who explained to me that he had a conversation with a guy who was a CEO of a corporation that didn't want to stop the flow of illegal immigration because he wanted cheap labor.
02:18:13.000 And he was flabbergasted.
02:18:15.000 He was like, I can't fucking believe this guy's saying this out loud.
02:18:18.000 It's worse with Johnson because in their manifesto, they pledged not to do it.
02:18:22.000 So they had a promise.
02:18:23.000 They call it the Boris wave.
02:18:26.000 So that's how bad it was.
02:18:27.000 And then you have Starmer and the Labour Party who were just as bad, if not worse.
02:18:32.000 And we have a situation where it's unmanageable now.
02:18:35.000 And reform, this third party, Nigel Farage's party, is saying, no, we're actually going to tackle this.
02:18:40.000 And of course, ultimately, what happens is the public, they reach a tipping point and they say, by the way, Starmer is the least popular prime minister on any opinion poll ever in the history of records.
02:18:51.000 He's gone from a massive majority to nothing because he's been so useless on all of this stuff, because he's been so captured by the ideology, because he doesn't care about migration, because he said that anyone who was concerned about the grooming gang scandal was jumping on a bandwagon of the far right.
02:19:08.000 That's what he said.
02:19:09.000 So all of this has happened, but you can't blame the left.
02:19:12.000 It's the left and the right.
02:19:14.000 It's both of them.
02:19:15.000 It's why they call it the uni party.
02:19:17.000 It's the same thing.
02:19:18.000 So you need something else to come along and explode it.
02:19:21.000 What do you think the possibility of Farage winning?
02:19:24.000 Pretty high.
02:19:25.000 Right, so if it were today, he'd win.
02:19:26.000 If he didn't get whacked between now and then.
02:19:29.000 Do you guys whack people over there very often?
02:19:31.000 Less than here.
02:19:32.000 I think it's more an American thing.
02:19:35.000 It's a lot easier.
02:19:36.000 A lot more guns over here.
02:19:37.000 There's a lot more guns.
02:19:38.000 But fingers crossed, obviously, that won't happen.
02:19:43.000 But it looks like if it was today, he'd win.
02:19:46.000 There's obviously a couple of years.
02:19:47.000 I mean, he could mess things up.
02:19:48.000 Something crazy could happen.
02:19:50.000 Get caught with a live boy or a dead girl.
02:19:52.000 Something like that.
02:19:52.000 But I think with Starmer, people are just sick of it.
02:19:56.000 He has continually backtracked on all his promises.
02:19:58.000 He's not interested.
02:19:59.000 He dismisses people's concerns about immigration.
02:20:02.000 He dismisses people concerned about the mass rape of children in the grooming gang scandal.
02:20:07.000 They had to be dragged kicking and screaming to do an inquiry about that.
02:20:10.000 They didn't want to do it.
02:20:12.000 And because they're so terrified of being called racist, ultimately, so they let this thing slide.
02:20:17.000 So I think people are just sick of it.
02:20:19.000 I think people have reached the point where even I think people who don't like Nigel Farage will hold their nose and vote for a third party to explode the system.
02:20:27.000 And maybe we might be able to reset after that.
02:20:30.000 Maybe something could happen.
02:20:31.000 One of the things that's interesting in America is a lot of young people are becoming conservative.
02:20:35.000 That is interesting, yeah.
02:20:36.000 It's interesting because I think that's a force of the internet.
02:20:39.000 And being a conservative today is more like being a rebel.
02:20:43.000 Yeah.
02:20:44.000 It's like bucking this system, whereas it used to be that if you were a rebel, you were left-wing.
02:20:50.000 You were like, you're a hippie.
02:20:52.000 You know, and that's not really the case anymore because the system that has power is a system that is pushing this one very particular ideology that also demonizes young males.
02:20:52.000 Yeah.
02:21:04.000 Hugely.
02:21:05.000 But that's also why I don't think it's about left and right anymore.
02:21:05.000 Yeah.
02:21:08.000 I think one of the things about the culture war is it kind of killed off left and right.
02:21:12.000 Like I say, in the UK, we couldn't vote this out.
02:21:14.000 We had a right-wing party.
02:21:16.000 It didn't make a difference.
02:21:17.000 The left-wing party makes it worse.
02:21:19.000 We had a prime minister, you know, Keir Starmer, on radio saying that 99.9% of women don't have a penis, which means that there are, what is it, 35,000 female penises out there?
02:21:32.000 It's quite a lot, if you can picture that image.
02:21:34.000 You know, so that's our prime minister saying this crazy.
02:21:37.000 Our deputy prime minister said on TV that you could grow a cervix if you wanted.
02:21:44.000 That's David Lammy.
02:21:45.000 That sounds like I'm making that up.
02:21:47.000 He said that.
02:21:48.000 He said you could grow a cervix.
02:21:48.000 You can check that.
02:21:50.000 So these are the kind of people who are in charge now, and it's just all about their fake, you know, fake ideology.
02:21:56.000 Which is why internet censorship is so much more prominent there.
02:21:59.000 That's what's going to happen.
02:22:00.000 Well, that's why they're going to absolutely try to do that.
02:22:02.000 Yeah, exactly.
02:22:03.000 Well, they are doing it.
02:22:04.000 They induce self-censorship by arresting people.
02:22:06.000 There's a lot of censorship achieved through fear.
02:22:08.000 Yeah.
02:22:09.000 Just in the fear of being arrested.
02:22:11.000 But the problem for reform will be, do they have the guts to do what Trump did?
02:22:14.000 Do they have the guts to come in and say, look, we need to scrap the civil service.
02:22:18.000 Well, you can't scrap the civil service, but you need to sort of bleed it dry.
02:22:22.000 You need to give it a good rinse, right?
02:22:24.000 You need to get rid of the – because there have been whistleblowers in the UK civil service who have said, we're not going to do what the elected politicians say.
02:22:31.000 If they come in and say there's an immigration problem, we're just going to stymie that.
02:22:35.000 We're not going to do what they want.
02:22:36.000 We've got police who are routinely investigating people for their opinions.
02:22:42.000 Just to put that into context, by the way, if we're talking about this deep state that we've got to clean out, our police force is trained by a body called the College of Policing.
02:22:52.000 They have been telling police for years, it's your job to arrest people for what they think and what they say.
02:22:57.000 And the High Court told them, you've got to stop this.
02:23:01.000 You've got to stop recording non-crime hate incidents.
02:23:04.000 Two home secretaries said to them, you've got to stop recording non-crime hate incidents.
02:23:09.000 They ignored the courts.
02:23:11.000 They ignored the government.
02:23:13.000 And that's the power of an ideologically captured quango.
02:23:17.000 That's crazy.
02:23:18.000 That's the problem.
02:23:19.000 So even when you vote for a party that's going to strip this stuff out, you still have to do the actual hard work of stripping out.
02:23:26.000 I would abolish the College of Policing.
02:23:28.000 Do people know about non-crime hate incidents?
02:23:30.000 Do they know that this is a thing in America?
02:23:32.000 Do they know that that's what we do?
02:23:34.000 Not really.
02:23:34.000 I mean, people are just aware that there's a lot of arrests because of social media posts.
02:23:38.000 We don't pay nearly as much attention to the UK as the UK pays attention to American politics.
02:23:43.000 Now, that's fair enough.
02:23:45.000 Because we're a small island.
02:23:46.000 That's fair enough.
02:23:47.000 But what I would say is it's worse than people think insofar as the 12,000 arrested a year, that's horrific.
02:23:53.000 But with the police routinely checking up on you if you commit a non-crime, that's sort of even worse, isn't it?
02:24:00.000 The Scottish police have a database of jokes that they've seen online that they think are problematic.
02:24:04.000 And they've kept this.
02:24:05.000 The Scottish police introduced a hate crime bill two years ago now, which can prosecute you for things you say in your own house.
02:24:12.000 There's a section in that bill on the public performance of a play.
02:24:15.000 So if a play is offensive, they can arrest you.
02:24:17.000 If you're the director or an actor involved in the play and it's considered offensive, they can arrest you.
02:24:22.000 They set up, when they implemented that hate crime bill, they set up hate crime reporting centers.
02:24:28.000 So if you felt offended, and they converted like, there was a sex shop, I think.
02:24:31.000 There was a mushroom farm.
02:24:33.000 You could go and report hate to the police as and when it occurs.
02:24:38.000 And this is coming from the police force, the people who are supposed to sustain authority and prevent criminality.
02:24:45.000 And you've seen the viral videos of police coming knocking on people's doors saying, you said this thing online.
02:24:51.000 So I think it's worse than just the arrests.
02:24:54.000 I think it's a rotten system that is being trained by activists in the College of Policing that no government will deal with.
02:25:03.000 They don't get rid of these activists.
02:25:05.000 And the activists, when they're told to stop it, they carry on anyway.
02:25:10.000 The entire culture has to shift.
02:25:12.000 That's what I mean.
02:25:13.000 That's what I mean.
02:25:13.000 You need a politician to go in and say, scrap the College of Policing, strip out all the activists within the NHS, within the army, within the police, within the Crown Prosecution Service.
02:25:23.000 It also has to get so bad that people realise how bad it is and they need radical change.
02:25:27.000 But I think the grooming gangs did that.
02:25:29.000 I think the fact that we effectively sacrificed thousands of kids on the altar of ideology, the fact that we said, you know, there were politicians, counselors, doctors, social workers, saying we don't want to be called racist, so we're going to ignore the sexual assault of children on a mass scale.
02:25:48.000 And that was not really thoroughly covered here in America in mainstream news.
02:25:52.000 I think because Elon – No, online it was, but not in mainstream news.
02:25:56.000 So do people not generally know about that?
02:25:58.000 They know about it now.
02:25:59.000 Right, okay.
02:26:00.000 But it wasn't something that you would see every night on CNN.
02:26:03.000 Really?
02:26:04.000 That's a huge story.
02:26:05.000 But the power of being called racist became so intense.
02:26:10.000 I mean, even, you know, that horrible bombing at the Manchester Arena at the Ariana Grande concert, in the subsequent report of what went wrong, one of the security guards said he saw the perpetrator with the rucksack and he didn't approach him or apprehend him because he was afraid of being called racist.
02:26:27.000 That was the reason.
02:26:28.000 And as a result of that, two dozen children lost their lives.
02:26:31.000 The power of smearing someone as racist is so potent, which is why I think here in America, the word fascist, the word Nazi gets thrown around so much because they know if someone is so branded, you disoblige yourself from having to engage with their ideas.
02:26:47.000 They become this kind of monster that you don't have to even think about or worry about.
02:26:51.000 And we're just, I think we're just getting over that in the UK now where the accusation of racism no longer really sticks.
02:26:58.000 I think people think it doesn't mean anything anymore.
02:27:01.000 And, you know, they've tried with reform.
02:27:04.000 They've tried saying that reform is a racist party.
02:27:06.000 It's a far-right party.
02:27:07.000 No one's buying it anymore.
02:27:10.000 And I think that's why hopefully something can change.
02:27:12.000 I think the grooming gangs, I think the mass immigration, to the extent where people are now at risk, they just are.
02:27:18.000 Unvetted people, many with criminal records.
02:27:21.000 We don't want to go the way of Sweden.
02:27:23.000 I mean, you know how bad Sweden's got.
02:27:26.000 You know, Sweden used to be the most high trust society in Europe, low crime.
02:27:32.000 They allowed mass immigration on a scale they couldn't possibly contain.
02:27:36.000 I think it's now 20% of Swedish population are now foreign-born.
02:27:40.000 And predominantly they live in ghettos where crime is rife.
02:27:44.000 They didn't integrate.
02:27:45.000 There was no expectation they should integrate.
02:27:47.000 And as a result of that, it's gone from being one of the safest countries in Europe to being the country that has most gun and bomb attacks of any country not at war except for Mexico.
02:27:58.000 And that's happened in the space of 10 years.
02:28:00.000 Crazy.
02:28:00.000 It's an absolute trap.
02:28:01.000 I remember when it was going on, a Swedish stand-up friend of mine, Tobias Persson, texted me saying there's grenades going off in Stockholm.
02:28:10.000 There's gunfire on my street.
02:28:11.000 And the politicians are doing nothing about it.
02:28:14.000 They're saying this doesn't matter.
02:28:16.000 I was in Sweden a couple of years ago.
02:28:17.000 I was talking to a bunch of...
02:28:19.000 You know what Swedes are like.
02:28:20.000 They're very middle class, very, well, all of them, obviously, but very liberal.
02:28:24.000 Not a racist bone in their body.
02:28:27.000 And they all came back to the same story.
02:28:29.000 They all wanted to discuss immigration.
02:28:30.000 And they all come back to the same thing.
02:28:32.000 One woman said to me, I got this wrong.
02:28:33.000 We got this wrong.
02:28:34.000 Why do you think they did it?
02:28:36.000 Good intentions, first and foremost.
02:28:38.000 Really?
02:28:39.000 Okay, well, there's a really?
02:28:41.000 You think it's just good intentions to let all those people in?
02:28:43.000 Have you met Swedes?
02:28:44.000 I have.
02:28:46.000 It's happening in America.
02:28:46.000 Come on.
02:28:48.000 It's happening in England.
02:28:49.000 It's happening in the UK.
02:28:50.000 Yes.
02:28:50.000 It's happening in Ireland.
02:28:52.000 It's just good intentions everywhere.
02:28:54.000 Could it also be, could it also be this delusion, this idea, what you would call, I suppose, liberal universalism.
02:29:00.000 This idea that everyone is basically the same.
02:29:03.000 Everyone in every culture basically wants the same things.
02:29:07.000 It explains the Queers for Palestine phenomenon.
02:29:10.000 It doesn't matter where you go.
02:29:11.000 The Queers for Palestine phenomenon is explained by the internet and people being stupid and being in a bubble where they never experienced those folks.
02:29:11.000 No, no, no.
02:29:18.000 I don't think that I think this is organized.
02:29:20.000 I think it's organized.
02:29:22.000 I think the more chaos there is, the more they can crack down on your rights.
02:29:26.000 I know you think it's organized.
02:29:27.000 I'm not convinced of that yet.
02:29:29.000 I'm open to it.
02:29:31.000 I mean, at one point in time, it's fairly universal in Western societies.
02:29:37.000 In America as well.
02:29:37.000 Yes.
02:29:39.000 For the last four years before Trump got into office, that's what they were doing here.
02:29:43.000 It seems like a strategy.
02:29:44.000 It doesn't seem as simple as just good intentions.
02:29:47.000 I know.
02:29:47.000 Well, and that does seem too simplistic.
02:29:49.000 I absolutely agree with that.
02:29:50.000 You create more chaos.
02:29:51.000 The more chaos you have, the more laws you need.
02:29:54.000 The more laws you need, the more control you have.
02:29:56.000 But speaking to these people in Sweden, I mean, it was an event where we were talking about a book I'd written, so it was all about these issues.
02:30:02.000 And I was mingling and talking to them.
02:30:04.000 And they all wanted to talk about it.
02:30:06.000 But they're the citizens.
02:30:07.000 That's what I mean.
02:30:08.000 They're the people that implemented those laws in the first place.
02:30:10.000 That's where I'm cynical.
02:30:12.000 I think the people that implement those laws in the first place, they know what they're doing.
02:30:15.000 Yes, and well, certainly they're aware of the risks.
02:30:18.000 I mean, if you take what happened in Cologne, that New Year's Eve party, where I think over 800 women were sexually assaulted, and the media didn't report it.
02:30:27.000 And the government wanted to sort of minimize it and say that this wasn't real.
02:30:30.000 It's not even just the risks, it's the physical, actual, measurable consequences.
02:30:35.000 Yes, exactly.
02:30:35.000 And they're not course correcting.
02:30:37.000 That, to me, leads me to think that they know what they're doing.
02:30:41.000 You don't think it could just be complete naivety, this idea that...
02:30:44.000 I think it's the best way to combat the internet.
02:30:46.000 The best way to combat the internet is to create a massive amount of chaos and then crack down on people's lives.
02:30:51.000 I suppose what worries me about it is, though, the assumption that it's all sort of coordinated will take you down that route where you start thinking, as some friends of mine now think, the world is controlled by a group of Satanists who sit in a room and they choose the leaders, and they, do you know what I mean?
02:31:08.000 Well, I don't think it's Satanists, but I think it's incredibly wealthy people.
02:31:12.000 But why would it be in their interest to destroy the economy that so sustains them?
02:31:16.000 Well, it depends on where they are and who they are.
02:31:18.000 But George Soros clearly does that, and he's talked about it.
02:31:22.000 He's talked about enjoying destroying democracies and enjoying destroying countries.
02:31:27.000 He's been kicked out.
02:31:28.000 He's not allowed to go into certain countries.
02:31:30.000 He makes money doing it.
02:31:31.000 But he relies on those democratic societies to make, you know.
02:31:34.000 Yeah, but they're still functional.
02:31:35.000 He just profits off of it, largely.
02:31:37.000 What I struggle with, though, like, you know, someone who believes fundamentally in the capitalist dream... It's subject to manipulation. Yeah.
02:31:50.000 And intelligent, evil people, or at least amoral.
02:31:53.000 But this doesn't answer why people do vote for it, and they do.
02:31:57.000 But they do vote for it because they've done a really good job of attaching it.
02:32:01.000 And there's also this ideology thing.
02:32:03.000 There's left and right.
02:32:04.000 And if you're left, you're blue, no matter who, blue to the grave.
02:32:08.000 That's it.
02:32:08.000 And if anybody that votes red is a dirty, racist, fascist, and they think about it that way.
02:32:13.000 And we really have no option for a centrist party in this country, which is where most people lie.
02:32:18.000 Most people lie in the middle.
02:32:20.000 Most people are very socially liberal.
02:32:22.000 And most of the people that I know that even identify as conservative, they're very socially liberal.
02:32:28.000 But they're financially much more aligned with conservative ideology.
02:32:32.000 Sure.
02:32:33.000 Well, I think, I mean, I think ultimately, hopefully, the brick wall of reality is what cures this.
02:32:39.000 If we don't destroy society along the way, if we don't allow them to destroy society, if we don't completely erode all of our rights along the way.
02:32:48.000 And as you said earlier, you can get very close to that happening.
02:32:51.000 And rights lost are never regained.
02:32:54.000 Never.
02:32:54.000 Look at Australia.
02:32:56.000 They had one mass shooting.
02:32:57.000 They took their guns in the 1990s.
02:32:58.000 Then COVID came.
02:32:59.000 They're like, get in a fucking camp.
02:33:01.000 Yeah.
02:33:01.000 And they've just introduced a new hate speech law off the back of the Bondi Beach shooting.
02:33:06.000 And of course, this, again, is really draconian.
02:33:09.000 It goes way too far.
02:33:11.000 In fact, I think the Australian hate speech law is basically saying if someone does something that wasn't intended to stir up hatred, but it could conceivably have stirred up hatred among a theoretical group of people, then it's a crime and you can get five years in prison.
02:33:23.000 Sure.
02:33:24.000 And imagine blaming that on hate speech instead of blaming it on just letting wild, violent criminals emigrate into your country.
02:33:33.000 I mean, that's.
02:33:34.000 Imagine.
02:33:34.000 Yeah.
02:33:36.000 What amazing gaslighting.
02:33:38.000 By not saying, hey, maybe we should stop letting violent criminals enter into our country illegally and live here.
02:33:45.000 No, no, no.
02:33:46.000 What we should start doing is taking people that have done no crime whatsoever and create their dissent, create a crime based on their dissent.
02:33:55.000 I totally agree.
02:33:56.000 We had it in the UK.
02:33:57.000 We had a politician, horrible story, a guy called David Amess.
02:34:02.000 You know, he was stabbed to death by an Islamist at his surgery.
02:34:05.000 You know, politicians have, we call them surgeries where you meet face-to-face your constituents.
02:34:09.000 They come and you talk about the local issues.
02:34:11.000 I don't think they do that in America.
02:34:14.000 He stabbed him to death.
02:34:15.000 And then there was this parliamentary debate about how can we crack down on free speech online?
02:34:20.000 Right?
02:34:20.000 No, the problem was the knife-wielding maniac.
02:34:23.000 The problem was unchecked Islamism.
02:34:26.000 I mean, it really is what Bezmenov was saying.
02:34:29.000 Yeah, it's that thing of not addressing the real problem, not seeing the truth because you've been captured.
02:34:35.000 But you've been demoralized.
02:34:37.000 But I think what's better now is that people can see through that.
02:34:39.000 So like when Keir Starmer, after that horrible, I mentioned it earlier, the girls who were killed in the dance class by the guy who was a child of immigrants, his response to that was, okay, let's not deal with the fact that we've got radicalized individuals within our community, young people.
02:34:59.000 He said, let's ban buying knives off Amazon because the guy got the knife from Amazon, right?
02:35:04.000 You can also get them in shops here.
02:35:07.000 You can walk into shops and get them.
02:35:08.000 Most people have a kitchen knife at home.
02:35:09.000 It's like one of the most common weapons.
02:35:12.000 And he banned ninja swords around the same time, which was a big blow to the ninja community.
02:35:17.000 But it's kind of so crazy.
02:35:19.000 Like, that's the thing you go for.
02:35:20.000 You choose the thing that isn't the problem.
02:35:23.000 But this is the idea of allowing this kind of chaos and having this be a coordinated plan, right?
02:35:30.000 The more chaos you have, the more you gaslight people, the more people are attached to an ideology, the more you can keep restricting their rights further and further and further until they're more and more frustrated until a lot of them just give up.
02:35:42.000 But we are at a position now where people are seeing through it all the time.
02:35:45.000 In the UK now, like, no matter how much they smear Reform as far right, the polls just keep going up and up and up.
02:35:51.000 Right, but it's because of the internet, because you have at least some dissenting voices.
02:35:56.000 You have that, and also the palpable absurdities of what the politicians are trying to tell you is real. Right, and it has just as big a reach. That's why they're trying to crack down on pub talk. Oh, and by the way, you know, the Labour Party has cancelled a number of local elections because they know they're going to lose them.
02:36:10.000 They've actually cancelled them.
02:36:11.000 They've cancelled them.
02:36:12.000 Well, they've said they postpone them while they're reforming the system.
02:36:17.000 But it's stuff like that where...
02:36:19.000 Get rid of the juries, cancel elections, and they're the good guys.
02:36:23.000 And at that point, it doesn't matter how much your propaganda or how much you think your propaganda is going to work.
02:36:28.000 The public are going to see through that.
02:36:30.000 And they say, hang on a minute.
02:36:31.000 You're saying that I can't vote.
02:36:32.000 You're saying if I end up in court, I may not have a jury.
02:36:35.000 You're saying I can't browse through Twitter.
02:36:36.000 You're saying I can't say the wrong thing online.
02:36:38.000 Enough is enough.
02:36:39.000 And I think they reach a point where they say, and some of the stories are so egregious.
02:36:44.000 Like, for instance, the guy, have you heard of a guy called Hamit Coskun?
02:36:48.000 I think he's an Armenian guy who burned a copy of the Quran outside the Turkish embassy, right?
02:36:53.000 The idea of this was a protest against the Turkish government because he perceives Erdogan's government as, I suppose, supporting Islamism and the rise of Islamism.
02:37:02.000 So he protests outside the thing, burns the Quran.
02:37:04.000 Two people attack him, one with a knife, the other, some deliveroo driver starts kicking him.
02:37:09.000 He gets prosecuted in a court of law for inciting the violence.
02:37:13.000 And the judge actually says, the fact that you were attacked is proof that you were inciting violence, right?
02:37:19.000 It took the free speech union in the UK to have that overturned, to fight on his behalf, to say, that's a peaceful protest.
02:37:25.000 It was his copy of his book.
02:37:27.000 We don't have blasphemy laws in the UK.
02:37:29.000 But now the CPS, the Crown Prosecution Service, is trying to overturn that because they want to see this guy go down.
02:37:34.000 And that is what we're talking about.
02:37:36.000 We've got bodies like the Crown Prosecution Service saying, no, we want an Islamic blasphemy code in the UK.
02:37:41.000 The Labour Party wants an official definition of Islamophobia.
02:37:44.000 So you can't criticise.
02:37:45.000 You can't peacefully protest.
02:37:47.000 You can't burn a book that you bought.
02:37:49.000 You know, and all of that.
02:37:50.000 And we're seeing this happen in front of us.
02:37:52.000 And people are just saying, look, we believe in plurality.
02:37:55.000 We believe in freedom of religion.
02:37:56.000 You should be able to, you know, we've got nothing against Muslim people.
02:38:00.000 What we are objecting to is the idea that we shouldn't be able to ridicule your religion or mock your religion or protest against your religion.
02:38:06.000 And you're going to pathologize it by saying we've got a sickness, we're Islamophobic.
02:38:10.000 I think people, I think that case, the fact that you can't burn, I mean, some kid in a school in Wakefield accidentally scuffed a copy of his Quran and he got hit with a non-crime hate incident and there was a big issue and the police got involved.
02:38:24.000 You know, we have to hold fast to this idea that no idea is exempt from criticism.
02:38:33.000 And so I just think the more stories like that happen, maybe I'm naive, but I think the British public's patience is kind of at the very end.
02:38:42.000 I hope so.
02:38:43.000 I hope it's not too late.
02:38:44.000 I really do.
02:38:45.000 But in the meantime, your book, The End of Woke, it's available.
02:38:49.000 Did you do the audio version of it?
02:38:50.000 I did.
02:38:51.000 It took me ages.
02:38:52.000 I'm glad you did.
02:38:52.000 Yeah.
02:38:54.000 Yeah, I'm sure it is, but it's always so much better when it's the author.
02:38:57.000 It's the same voice, especially someone like you.
02:38:59.000 Thank you, Andrew.
02:39:00.000 Really appreciate it.
02:39:01.000 And I hope you guys figure it out over there.
02:39:03.000 But in the meantime, I'm glad you're here.
02:39:04.000 Well, I got away.
02:39:05.000 I'm glad.
02:39:06.000 But I mean, it shouldn't be that everybody has to escape.
02:39:09.000 That's crazy.
02:39:10.000 No, I know.
02:39:11.000 You know, it's nuts.
02:39:11.000 And then what's going to be left?
02:39:13.000 Like, only people that are submitting, and then the chaos of what you've allowed in?
02:39:18.000 It's fucking nuts.
02:39:18.000 Exactly.
02:39:19.000 So you've got to make sure that America doesn't go to pot, because I need this place to work.
02:39:22.000 I need it to work too.
02:39:24.000 Part of my business model.
02:39:26.000 Thank you.
02:39:26.000 All right.