The Joe Rogan Experience - August 06, 2025


Joe Rogan Experience #2361 - Graham Linehan


Episode Stats

Length

3 hours and 7 minutes

Words per Minute

184.4

Word Count

34,611

Sentence Count

3,137

Misogynist Sentences

121


Summary

In this episode of the Joe Rogan Experience, Joe is joined by comedy writer Graham Linehan. They open with the story of Graham's recent fall off a scooter, the scar it left on his forehead, and his broken nose.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:03.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan podcast by night, all day!
00:00:12.000 Yeah.
00:00:14.000 So you're telling me about the scar you got on your forehead recently.
00:00:17.000 Yeah, I got into a bar fight with some, some guys were insulting a woman.
00:00:21.000 Yeah, take care of business.
00:00:22.000 That's right.
00:00:23.000 Take care of a bit of business.
00:00:24.000 No, I fell off a scooter.
00:00:26.000 I fell off a scooter.
00:00:28.000 I was riding around Scottsdale feeling great and free because I'd never ridden a scooter before because I always thought that we called them scooter nonces in the UK.
00:00:39.000 And because there was no one around to see me, I just thought, oh, this is great.
00:00:42.000 I could do this all the time.
00:00:44.000 But I just immediately fell flat on my face.
00:00:47.000 Did you hit something?
00:00:48.000 No, I saw what looked like a ramp.
00:00:51.000 But it was a single step down.
00:00:54.000 So I went flying through the air.
00:00:56.000 And I remember when I landed, there was a weird moment when I landed where I thought, oh, that wasn't so bad.
00:01:01.000 I didn't screw myself up too badly.
00:01:03.000 But then there was a second crunch.
00:01:06.000 And I remember thinking, oh, I'm dead.
00:01:10.000 What was the second crunch?
00:01:11.000 I don't know.
00:01:12.000 Somehow I fell and I went to the bottom.
00:01:14.000 Double.
00:01:14.000 So you double fell.
00:01:15.000 Oh, yeah.
00:01:16.000 Yeah.
00:01:16.000 And, you know, I kind of like it because it makes me, it makes me look how I feel internally.
00:01:24.000 Busted up and changed.
00:01:25.000 Yeah, yeah, yeah.
00:01:26.000 It's been so.
00:01:26.000 So did it break your nose?
00:01:28.000 No, oh, I did break my nose.
00:01:29.000 Yeah.
00:01:30.000 But, you know, but, you know, it's one of those things.
00:01:33.000 Again, as I say, I quite like it.
00:01:35.000 I think it gives me some courage.
00:01:36.000 It doesn't mess you up.
00:01:37.000 Yeah.
00:01:37.000 You're fine.
00:01:38.000 You know, it's just a thing.
00:01:38.000 Yeah.
00:01:39.000 And I might stick with the story about the bar.
00:01:42.000 I think you already gave up the goods.
00:01:44.000 The problem with a story like that is I'm always like, is that better to get the fuck beaten out of you?
00:01:48.000 Like, when it's better to fall down.
00:01:48.000 By guys?
00:01:50.000 It's different if you're a fighter.
00:01:52.000 I would imagine you wouldn't be too happy about it.
00:01:54.000 Yeah, I'd be very upset that I got my ass kicked.
00:01:58.000 Yeah, yeah.
00:01:59.000 Although that's happened many times.
00:02:00.000 Yeah.
00:02:01.000 The broken nose is a problem.
00:02:03.000 Did you get your septum fixed and make sure that it's not all clogged up?
00:02:06.000 I didn't.
00:02:07.000 I did.
00:02:07.000 Didn't.
00:02:08.000 I know, yeah.
00:02:09.000 Okay.
00:02:10.000 That is one that I tell fighters whenever they retire.
00:02:12.000 I'm like, please get your nose fixed.
00:02:14.000 It'll change the quality of your life so much.
00:02:16.000 Okay.
00:02:17.000 Because if you can't breathe out of your nose, you're missing out on a large percentage of cardio.
00:02:22.000 Right.
00:02:23.000 And not just cardio, like for athletics, but just everyday life.
00:02:26.000 You're not breathing out of your nose.
00:02:27.000 You're not getting enough oxygen.
00:02:29.000 Sure.
00:02:30.000 Like you're probably oxygen depleted.
00:02:31.000 And then you become one of them mouth open dudes.
00:02:35.000 You did a whole interview with a guy who does that, doesn't he?
00:02:38.000 He kind of teaches himself to only.
00:02:40.000 James Nestor.
00:02:41.000 Yes.
00:02:42.000 He's got a great book called Breath.
00:02:44.000 For anybody who's fascinated by breathing techniques and how much it can help you.
00:02:48.000 You can do a lot with breathing.
00:02:50.000 It's just a thing that no one does because it's like, oh, I'll look into that.
00:02:54.000 And then you put it off and it never happens.
00:02:56.000 Absolutely.
00:02:56.000 And I got to be careful because that's how old age gets you.
00:02:58.000 It doesn't get you in one go.
00:03:00.000 You don't suddenly become this bent old man.
00:03:02.000 It picks away at you.
00:03:04.000 Things like falling off scooters and you know what I mean, not getting it fixed.
00:03:07.000 Things like that are what pick away at you.
00:03:10.000 Yeah, yeah.
00:03:11.000 You got to be ahead of it.
00:03:13.000 You got to be vigilant.
00:03:14.000 Yeah.
00:03:15.000 Yeah.
00:03:15.000 Stay off the heroin, too.
00:03:17.000 I hear that's bad for you.
00:03:18.000 If you want to get old.
00:03:19.000 You don't see a lot of old heroin addicts.
00:03:21.000 No, there's a punk poet in the UK named John Cooper Clarke, who is a bit of a genius.
00:03:27.000 And he was taking it recreationally for years.
00:03:30.000 And he only gave it up because he said so many people were worried about me.
00:03:33.000 I just couldn't, I couldn't deal with it.
00:03:35.000 They were just worried about me.
00:03:36.000 But he was one of these guys, I guess like Burroughs, William Burroughs, that he was able to just, you know, implement it in his life.
00:03:44.000 Well, I worked with, well, I worked out with a guy who was a longshoreman.
00:03:48.000 And he had this guy that he worked with that would shoot heroin at lunch every day.
00:03:53.000 Wow.
00:03:54.000 And he was fine.
00:03:55.000 He worked fine on the job.
00:03:56.000 He was totally functional.
00:03:58.000 He would take his hour lunch break.
00:04:00.000 He would go sit in his truck.
00:04:02.000 He'd get a bag from some guy.
00:04:04.000 And then he'd sit in his truck and shoot up.
00:04:06.000 Yeah, like real heroin, like shooting up.
00:04:06.000 Wow.
00:04:09.000 What was his job?
00:04:10.000 He was a longshoreman.
00:04:11.000 Oh, right.
00:04:12.000 So he worked on the docks.
00:04:13.000 Holy cow, you'd think you'd have to have your wits about you.
00:04:15.000 Well, some of those longshoremen, they did different jobs.
00:04:18.000 Like my buddy was a fish filleter for a long time.
00:04:20.000 So what would happen is you would get these huge trucks filled with fish and they would just fillet fish all day long.
00:04:26.000 And he was a boxing trainer.
00:04:28.000 And so he would rub like Vaseline on your face before you sparred.
00:04:31.000 You just smelled fish because he was just, he just smelled like fish all the time.
00:04:35.000 You couldn't get it off of him.
00:04:37.000 Yeah, yeah.
00:04:37.000 It was just a part of his odor forever.
00:04:41.000 I mean, he cut like thousands of fish a day, probably.
00:04:41.000 Right, right.
00:04:45.000 That's like the old dye pits I read about in Toronto. The job that you didn't want was to work in the dye pits.
00:04:53.000 Because guys, what they'd have to do is they'd have to get into these like human-sized pits filled with dye and wrestle the dye into material.
00:05:00.000 I'm not sure which material it was.
00:05:02.000 Yeah.
00:05:02.000 Jesus Christ.
00:05:03.000 And your whole body be covered in dye.
00:05:05.000 You would look like the Blue Man Group, you know, except from the neck down, you know?
00:05:09.000 And you would always smell of it.
00:05:12.000 And when you took a shower, I read this in a brilliant book called In the Skin of a Lion by Michael Ondaatje.
00:05:17.000 So it never comes off?
00:05:18.000 No, it does.
00:05:19.000 What they do is they get in the shower and then, in one piece, the whole paint just falls on the ground like a skin.
00:05:26.000 But they never got rid of the smell, you know?
00:05:28.000 And probably the inside of your bathroom looks like shit forever.
00:05:32.000 It's like draping.
00:05:33.000 No, that all happens at work.
00:05:34.000 That all happens at work.
00:05:34.000 Oh, they do that at work.
00:05:36.000 Yeah.
00:05:36.000 This is in the old days, you know, way.
00:05:38.000 What must that be for you?
00:05:40.000 I know, yeah.
00:05:41.000 Your skin is an organ.
00:05:42.000 Yeah.
00:05:43.000 So you're getting your organ doused in a chemical all day long.
00:05:48.000 Yeah.
00:05:49.000 Yeah, yeah, yeah.
00:05:50.000 Well, these are the jobs we used to have, you know?
00:05:52.000 Like, what's that thing they say?
00:05:53.000 Hard times make hard people.
00:05:56.000 Hard people make soft times.
00:05:56.000 Yeah.
00:05:58.000 And we're certainly in the soft times now.
00:05:59.000 Yeah, we're in soft times, make hard people.
00:06:02.000 Oh, there we go.
00:06:02.000 Or probably make hard times, rather.
00:06:05.000 Soft people make hard times.
00:06:06.000 So this is these folks working in it.
00:06:08.000 Yeah, but this would be the new version.
00:06:09.000 I think in the old days, in Toronto anyway, they used to get into the pits.
00:06:14.000 You know, so, yeah.
00:06:15.000 Cute.
00:06:17.000 Yeah, man, there's jobs out there that fucking suck.
00:06:20.000 I was watching this video today where this young lady was complaining about her job and that it just consumes her entire life and she doesn't want to do it anymore.
00:06:27.000 And then people were complaining in the comments that she's lazy.
00:06:30.000 I'm like, no, like she hates her job.
00:06:31.000 It's entirely reasonable.
00:06:34.000 It's a little kind of crazy that everybody wants to declare to the whole world what their personal issues are about their job.
00:06:40.000 I mean, social media has made it very weird that all these people just get attention for something that you essentially used to just talk about with friends.
00:06:50.000 Yeah, absolutely.
00:06:50.000 And that's part of the problem, isn't it?
00:06:52.000 I mean, I mean, every time we make these statements, we make it to a public.
00:06:56.000 So everything becomes political.
00:06:58.000 Yes.
00:06:59.000 You know, and suddenly everyone's, you know, moaning at you because you say you hate your job.
00:07:03.000 It's such a weird time.
00:07:07.000 And this is one of my things.
00:07:08.000 It's one of the things I'm obsessed about.
00:07:09.000 By the way, we better tell people who I am, for people who don't know.
00:07:11.000 Yeah.
00:07:12.000 Graham, you have, well, I found out about you from our talent coordinator at the Mothership, who's my good friend Adam Eget, who loves you to death.
00:07:19.000 And he loves your work.
00:07:20.000 He loves the shows that you've created.
00:07:22.000 He's a huge, huge fan.
00:07:23.000 So this was all because of Adam.
00:07:26.000 This all came about because of Adam.
00:07:28.000 Sure.
00:07:28.000 And then I heard the story.
00:07:30.000 I was like, oh my god, they did that guy dirty.
00:07:32.000 They did him so dirty.
00:07:34.000 And it was one of the examples.
00:07:36.000 Why don't you tell the story of how it went down so everybody could kind of get it from your words, which I'm sure would be better than me.
00:07:42.000 Okay.
00:07:43.000 Fucking it up.
00:07:44.000 I'm not really good.
00:07:45.000 I've never been good.
00:07:46.000 I spent the last, you know, most of my life.
00:07:49.000 I'm 57 now, and I spent most of my life forming a sort of sort of, what's the word?
00:07:57.000 You know, what's that word?
00:07:58.000 Self-deprecating, humorous personality.
00:08:02.000 So I would come out and I would make fun of myself.
00:08:05.000 I can't do that anymore because my situation is so bizarre that anything I say that's self-deprecating will just get reported as truth and all sorts of things.
00:08:14.000 You can't make jokes in my situation, you know?
00:08:16.000 It's so weird.
00:08:17.000 So let's explain your situation, how it started.
00:08:20.000 Well, I was a comedy writer.
00:08:22.000 I started writing journalism in Ireland when I was very young, about 19 years old.
00:08:27.000 And I was hanging around with some funny people.
00:08:30.000 We were writing sketches and stuff like that.
00:08:32.000 So I went over to UK and I got in.
00:08:35.000 I was just very, we sent our sketches to producers.
00:08:40.000 We worked very hard to get on TV.
00:08:43.000 That succeeded.
00:08:44.000 And then I kept having early success.
00:08:46.000 I had a sitcom called Father Ted, which was about some Irish priests who were so bad that they'd been banished to a tiny island in the middle of nowhere.
00:08:54.000 That was a huge success, probably my biggest success.
00:08:58.000 Then I went on to a sitcom.
00:09:01.000 The IT Crowd.
00:09:02.000 Black Books was a big one over here.
00:09:05.000 All of them I co-wrote or wrote on my own.
00:09:07.000 And people have to understand that for people that are a fan of English comedy, like your shows were legendary.
00:09:12.000 These are amazing shows.
00:09:14.000 Thank you.
00:09:14.000 That's really kind of you to say.
00:09:16.000 Yeah, they're really big.
00:09:17.000 They're big over there.
00:09:19.000 Some of them travel over here.
00:09:20.000 Black Books did very well over here.
00:09:22.000 IT Crowd, I think, is probably the second most known one.
00:09:26.000 I think we're suffering a bit from content fatigue.
00:09:29.000 Absolutely.
00:09:30.000 There's almost too much to choose from.
00:09:32.000 Yeah.
00:09:32.000 Yeah.
00:09:32.000 Well, yeah, there's no real reason to.
00:09:34.000 I mean, you know, the only reason I would say to watch them is, you know, some of them are, some of them really hit the target, you know?
00:09:41.000 Yeah, some of them are great.
00:09:42.000 Watch them because they're great.
00:09:43.000 But I'm just saying that the problem with anybody finding out about a new show today, they're like, oh, geez, another one I have to pay attention to?
00:09:50.000 Yeah, yeah.
00:09:50.000 How?
00:09:51.000 And I think that that's got to burst soon.
00:09:54.000 I think we're, I mean, AI means the whole, who knows what the landscape will be like next year, you know?
00:09:59.000 It's weird.
00:09:59.000 Yeah.
00:10:00.000 It's really weird.
00:10:01.000 I watched an amazing video that someone put out today of a small film that they made just with prompts.
00:10:08.000 And it was some cyberpunk thing.
00:10:12.000 Oh, yeah.
00:10:13.000 Did you see that?
00:10:14.000 It's insane.
00:10:14.000 I know.
00:10:15.000 It is the next level.
00:10:17.000 It's so good.
00:10:18.000 It's like, I'd watch that movie.
00:10:20.000 I mean, they just made it in a few moments with prompts.
00:10:20.000 Yeah, yeah.
00:10:23.000 Yeah.
00:10:24.000 Like, it's over.
00:10:25.000 Yeah.
00:10:25.000 Except, I mean, the interesting thing, though, is I think I better get back to my story because I think it's a good idea.
00:10:30.000 Oh, we'll get to it.
00:10:31.000 Don't worry.
00:10:32.000 That's how we do things.
00:10:33.000 But the interesting thing is that I think personally there might be a revolution in those kind of smaller films that just need a few people.
00:10:42.000 You know, what's a good example?
00:10:43.000 Steven Soderbergh's early movies, Tarantino with Reservoir Dogs.
00:10:48.000 Right.
00:10:48.000 You might find people actually returning to human beings.
00:10:51.000 Because I don't know.
00:10:52.000 I mean, maybe I'm sure it will change.
00:10:54.000 But at the moment, there's a real uncanny valley feeling from all of these AI videos.
00:10:58.000 Well, this one has an uncanny valley for sure.
00:11:00.000 See if you can find it, Jamie.
00:11:01.000 Cyberpunk.
00:11:02.000 Somebody released it on X. I forget which engine they used.
00:11:08.000 There's a battle with all these engines.
00:11:10.000 See who's got the best.
00:11:12.000 I'm waiting it out, you know?
00:11:14.000 I mean, it's just like, just tell me what tools I can use to tell stories, and then I'll do them.
00:11:21.000 Well, people are, this is it.
00:11:22.000 Go full screen on this.
00:11:24.000 Give me some volume.
00:11:25.000 This is so wild.
00:11:27.000 Like, that woman looks incredibly lifelike.
00:11:30.000 But a little weird right there with the hair.
00:11:32.000 Yeah.
00:11:32.000 Music is going to get us flagged.
00:11:34.000 We don't need to play the music.
00:11:34.000 Okay.
00:11:35.000 The music is not that good anyway.
00:11:37.000 It doesn't mean anything.
00:11:38.000 And there's things with the lighting and stuff that don't really make sense.
00:11:41.000 But that's, I'm being stupid.
00:11:42.000 I'm being picky.
00:11:43.000 No, no, no, no, you're not being picky.
00:11:45.000 It's like we're being accurate.
00:11:46.000 Like if you were an expert and you were assessing whether or not this was real, I'd say, right there, definitely not real.
00:11:52.000 This is not real.
00:11:52.000 This is artificially generated.
00:11:54.000 But that could also be a style, right?
00:11:57.000 Like if you think about Robert Rodriguez when he did Sin City, there's a lot of that that didn't look real.
00:12:02.000 Like you could get weird with filming just to make the experience, you know, more bizarre because you're in this crazy sci-fi thing.
00:12:10.000 You could kind of make it look a little fake and it would be dope.
00:12:13.000 I think also when you think of what AI is going to do to voice lines, like if you look at a game like Grand Theft Auto, now you're going to be able to have generative dialogue from the characters, you know?
00:12:24.000 It's insane.
00:12:25.000 Well, you're also going to have porn where you have any woman that you want that you desire in your life.
00:12:32.000 Like you could take photos of someone that you know and turn them into someone who's like so attracted to you and just can't wait to have sex with you.
00:12:42.000 Like POV porn from your neighbor.
00:12:45.000 Yeah.
00:12:46.000 You know what I'm saying?
00:12:47.000 I know.
00:12:47.000 And it's like, it's like we're there.
00:12:49.000 We're getting there.
00:12:50.000 We're about to head into that.
00:12:51.000 You've read Jonathan Haidt's book about, I can't remember which one it was.
00:12:55.000 Oh, cuddling and coddling.
00:12:57.000 Like 2007 was when depression in teenage girls went up: self-harm, depression, suicidal ideation, actual suicide went up.
00:12:57.000 Yeah.
00:13:07.000 All of it went up in this comparison culture.
00:13:10.000 Yeah, but also the iPhone.
00:13:12.000 Yes.
00:13:12.000 It was the iPhone.
00:13:13.000 That's what the comparison culture comes from.
00:13:15.000 I mean, that's the real comparison culture because they're comparing themselves to the Kardashians and people with massive amounts of plastic surgery and filters.
00:13:22.000 But now I saw a guy yesterday who said we're about to enter a 2007 moment.
00:13:27.000 In other words, the invention of the iPhone.
00:13:29.000 We're about to have another big change.
00:13:31.000 And we haven't even spoken about the last one.
00:13:33.000 I mean, that's part of my thing.
00:13:36.000 Let me go back and go back at the story.
00:13:38.000 But basically, I was like a very successful comedy writer, probably about as successful as a non-on-screen comedy writer can get in the UK.
00:13:46.000 I won something like six BAFTAs, I think, in the end, five or six BAFTAs.
00:13:50.000 I'm not being stupid.
00:13:51.000 I just genuinely can't remember.
00:13:53.000 And one of them they didn't give me the plaque for.
00:13:54.000 I must tell you that.
00:13:57.000 But I won, got a standing ovation at the comedy awards.
00:14:01.000 And then the moment I started talking about women's rights, they took everything, absolutely everything away from me.
00:14:08.000 Oh, my God.
00:14:09.000 So that's your, this is your version of it.
00:14:12.000 Yes.
00:14:13.000 Women's rights.
00:14:14.000 Yes.
00:14:15.000 And this is a version.
00:14:16.000 This is why it gets real weird.
00:14:19.000 You know, because as soon as you say there are some men that are going to use this, as soon as you say there are some men who we've known forever have been sexual deviants and perverts and psychotic creeps.
00:14:35.000 Yeah.
00:14:35.000 And you're giving them an out.
00:14:37.000 Instantaneously, you're letting them wear dresses and now they can't be touched.
00:14:37.000 Yeah.
00:14:41.000 That is a crazy thing to do.
00:14:43.000 And that doesn't deny the existence of trans people or in any way be transphobic.
00:14:43.000 Yeah.
00:14:48.000 It's not saying that a person can't choose to be whoever the fuck they essentially feel they are, their true self.
00:14:53.000 Yeah.
00:14:54.000 Well, I don't know how you feel.
00:14:56.000 I don't want to restrict you, but as soon as you start allowing men in dresses to get into women's spaces, and you frame it that way, you say this is about women's rights, then it's chaos.
00:15:08.000 Then there's no rational conversation when it should be totally rational.
00:15:12.000 With those factors, knowing that some men are creeps, knowing that women are more vulnerable, and you're going to allow these potential creeps to have carte blanche and just go into the women's spaces.
00:15:22.000 Yeah.
00:15:23.000 And the screaming at me and the calling me a Nazi makes me think I'm over the target.
00:15:29.000 Well, you know, I think I could be wrong here.
00:15:31.000 This is a theory.
00:15:32.000 You obviously know more about it than me, but like I think that when they tried to get you for COVID, I actually think that that was sort of left over from you interviewing Meghan Murphy and Abigail Shrier.
00:15:44.000 I think they really hated that you were giving them a platform.
00:15:47.000 Because when you think of it, no one else did.
00:15:50.000 No one else did.
00:15:51.000 If you look back at Meghan Murphy and Abigail Shrier's appearances, and Abigail Shrier wrote the most important book about transitioning, the transitioning of young women, Irreversible Damage.
00:16:01.000 And she's had a terrible time as well.
00:16:03.000 She has had a terrible time.
00:16:04.000 And unfairly, because it's a real issue.
00:16:04.000 Yeah.
00:16:07.000 But this is the thing about a real issue.
00:16:09.000 When real issues come up, when there's a real ideological debate going on, like, hey, what is actually really going on?
00:16:19.000 That's when things get the most hostile.
00:16:22.000 Because when you can't really defend your position logically, then you start using pejoratives and turning everyone into Nazis and everyone into fascists.
00:16:32.000 And you start really fucking the argument up in a way that, like, if you're a normal person and I'm talking to you, well, I'm going to choose to not talk to you anymore because you're not rational.
00:16:40.000 I don't like talking to Mike.
00:16:41.000 He goes crazy and calls me a Nazi every time we disagree with something that is logically something that you should be debating, whether or not children have the ability to make these decisions at an early age and whether or not there's some kind of social contagion going on.
00:16:55.000 That's what Abigail brought up.
00:16:56.000 And I think that's super accurate.
00:16:59.000 And I mean, like, you know, this, I don't want to go too strong too early, but let me take an example, right?
00:16:59.000 Yeah.
00:17:06.000 The word trans people.
00:17:08.000 I see people using it all the time as if it has, as if it is a stable category.
00:17:14.000 And it's not.
00:17:15.000 It's not a stable category at all.
00:17:17.000 You know, when most people hear the word trans people, they think transsexuals.
00:17:23.000 But the number, according to, I think, a 2016 study, the number of men who identify as trans and aren't having any surgery at all is something like 90%, right?
00:17:35.000 So you have a whole whole group of people out there who are transvestites, okay?
00:17:41.000 To give them the actual word that refers to their condition.
00:17:45.000 Or used to forever.
00:17:46.000 Yes.
00:17:47.000 Until it became a pejorative.
00:17:48.000 Well, then it just disappeared.
00:17:51.000 All the transvestites just disappeared.
00:17:53.000 I think it's a fragmentation now.
00:17:55.000 I don't think they like it.
00:17:56.000 They don't like it because they know that if you use that term, it reveals the truth.
00:18:01.000 And the truth is that 90% of these men are putting on a dress and expecting to be given every single right that women have.
00:18:09.000 And it's an absolute lie.
00:18:11.000 It's a delusion.
00:18:12.000 It's a mass delusion.
00:18:14.000 It's a cult.
00:18:16.000 It's like, and I genuinely don't think it would have existed without the internet.
00:18:20.000 You know, the internet superpowered it and they gave rise to things like, you see it online, the same repeated phrases over and over again.
00:18:28.000 Trans women are women.
00:18:30.000 How are you going to keep men out?
00:18:31.000 That's one thing they say about single-sex spaces.
00:18:34.000 How are you going to keep men out?
00:18:35.000 Well, you know, we were always able to in the past because most men were decent and weren't trying to get into women's spaces.
00:18:42.000 But suddenly now it's a problem, you know?
00:18:44.000 And so basically what we've done is we've created this false kind of civil rights movement.
00:18:50.000 It's not a civil rights movement.
00:18:52.000 It's a male push to undo every single thing that suffragettes won over 100 years ago.
00:18:58.000 You know?
00:18:59.000 Have you ever heard of the urinary leash?
00:19:01.000 Do you know this phrase?
00:19:02.000 No.
00:19:02.000 The urinary leash was what was called when the suffragettes, before the suffragettes won the right to vote and single-sex spaces and so on, the women of a household were not able to go too far from their house because there were no public toilets.
00:19:19.000 And all the public toilets, there were public toilets, but they were mixed.
00:19:22.000 So men would be in them and they couldn't go into the toilets with the men.
00:19:26.000 So you had a urinary leash.
00:19:29.000 You had a leash that kept you close to your home.
00:19:31.000 Jesus Christ.
00:19:32.000 And that was one of the ways that men were able to exercise such power over women at that time.
00:19:37.000 Wow.
00:19:38.000 If you're a long time listener, you probably heard me talk about ExpressVPN and how great it is at keeping your data private and secure.
00:19:46.000 But that's not all it's good for.
00:19:48.000 Did you know that services like Netflix limit your options based on your location?
00:19:52.000 It's called geo-blocking.
00:19:53.000 And the fucked up part is that you, the paying customer, have to pay even more and subscribe to another streaming service to watch these titles.
00:20:01.000 If you use ExpressVPN, though, you can unlock thousands of new shows and movies worldwide without having to spend hundreds of extra dollars a year.
00:20:11.000 ExpressVPN is an app that lets you change your IP address to practically anywhere in the world.
00:20:16.000 Like if you want to access the entire library of free TV and movies on BBC iPlayer, just use ExpressVPN to switch your location to the UK.
00:20:27.000 In fact, you can change your location to 105 different countries.
00:20:31.000 It's fast and easy too.
00:20:33.000 With just one click, you can hop on over to a place where the content you want isn't restricted.
00:20:39.000 And right now, you can get four extra months free if you tap the banner or go to expressvpn.com slash rogan.
00:20:46.000 That's expressvpn.com slash rogan.
00:20:49.000 And if you're watching on YouTube, get your four free months by scanning the QR code on screen or by clicking the link in the description.
00:20:58.000 And so urinary leash.
00:21:01.000 That's what they called it, yeah.
00:21:02.000 And now we have some, we have members of this so-called civil rights group who are basically just trying to bring back the urinary leash, you know?
00:21:13.000 So it's not safe for women to go into a space because they genuinely don't know if they'll share the space with a man.
00:21:19.000 So, you know, it's anyway, one of the problems with this fight is there's so many aspects to it that it's really, I've been fighting it for eight years.
00:21:30.000 Did you, so were you stunned by the reality?
00:21:34.000 Let's bring it back to the original thing that you did, your original offense to the cult, where they came for you.
00:21:41.000 My original offense was, I think, sharing a piece by a feminist named Heather Brunskell-Evans that said exactly what you said just at the beginning, you know, a few minutes ago, where you said, yeah, people have things going on in their head.
00:21:52.000 They need to be respected.
00:21:54.000 They need to be helped.
00:21:55.000 And then someone wrote back immediately.
00:21:59.000 I was actually getting surgery for cancer at the time.
00:22:03.000 And someone wrote back and said, I wish the cancer had won.
00:22:06.000 Right?
00:22:07.000 Jesus.
00:22:07.000 And this was like sharing a very, very mild piece that just basically said women deserve rights.
00:22:13.000 Okay.
00:22:14.000 So that reaction, I thought, holy cow.
00:22:18.000 And then, but the strangest thing was all my friends and colleagues.
00:22:22.000 They just, they just completely ignored what was happening to me.
00:22:26.000 Not a single person stood up to say, hey, I know Graham Linehan, he's not a bigot.
00:22:31.000 And I'd made lots of these people famous, you know?
00:22:34.000 Not a single person stood up for me.
00:22:36.000 And the next thing that happened was that a sex offender, we found this out later, but a sex offender and kind of serial litigant in the UK, he reported me to the police, sued me on the same weekend.
00:22:52.000 And the police came to my home, or no, they phoned me that time.
00:23:01.000 And since then, I've been basically, the police just visit every so often on the orders of these, and this guy was a sex offender, he sexually assaulted a 14-year-old boy, you know.
00:23:11.000 And basically, the police in the UK are working for these men, you know?
00:23:16.000 So he can complain anytime he wants and they just visit you.
00:23:19.000 Yeah.
00:23:19.000 And there's no repercussions?
00:23:21.000 No, not so far.
00:23:22.000 He's been doing it for eight years.
00:23:24.000 And he's had women in prison cells overnight, you know?
00:23:27.000 I mean, it's his hobby, you know.
00:23:30.000 So this guy was the guy who reported me to police.
00:23:33.000 And then The Guardian reported that as Graham Linehan is warned by police for harassing a trans woman.
00:23:42.000 Jesus Christ.
00:23:43.000 So now everybody is looking at this and they're thinking, oh, a trans woman, transsexual, poor transsexual.
00:23:47.000 No, again, just a bloke who's put on a dress and is taking a piss, you know?
00:23:53.000 So that destroyed my name, thanks to The Guardian.
00:23:57.000 And then after that, I couldn't get anyone to speak to me.
00:24:01.000 Like, you know, I mean, Father Ted might be big in the UK, but in Ireland, it's a bit of a national institution, you know?
00:24:10.000 It allowed the Irish to laugh at the Catholic Church, which for years had had a sort of oppressive effect over the Irish.
00:24:18.000 So they weren't able to laugh at it.
00:24:20.000 And suddenly Father Ted came out and it was a great kind of release to be able to laugh at silly things.
00:24:25.000 We weren't really attacking the church.
00:24:26.000 We were just making silly jokes.
00:24:28.000 Very surreal show, you know.
00:24:33.000 But nonetheless, it kind of chipped away at the Catholic Church.
00:24:39.000 And the Catholic Church just kind of lost a lot of power in Ireland.
00:24:44.000 So I thought in Ireland I would be at least understood and listened out.
00:24:50.000 People would listen to me, listen me out.
00:24:52.000 Is that the phrase?
00:24:53.000 Yeah, hear me out.
00:24:54.000 Hear me out.
00:24:55.000 And no, like there's a show in Ireland called The Late Late Show.
00:24:59.000 And like I brought out my biography a few years ago called Tough Crowd, which I should plug.
00:25:07.000 And The Late Late Show, which interviews every single person who's got the letter O apostrophe in their name, hasn't interviewed me.
00:25:17.000 And all of the Irish media has just pretended I've died.
00:25:20.000 They just pretend I've died.
00:25:23.000 All because of that Guardian article.
00:25:24.000 Not just because of that, because I just refused to back down.
00:25:29.000 They were saying, you know, I was constantly being told to apologize, and I hadn't done anything.
00:25:34.000 I would have people online would do fake screenshots of me apologizing for sending my pictures of my genitalia to women on a forum.
00:25:44.000 Jesus.
00:25:44.000 And they spread that.
00:25:46.000 And rather than my friends standing up for me, people would approach them on Twitter and say, why are you following Graham Linehan?
00:25:52.000 He's a bigot.
00:25:53.000 And they would just go, oh, sorry, and just don't follow me.
00:25:56.000 And so I lost 300,000, 400,000 followers in a few months.
00:26:04.000 And then we went into COVID and Twitter banned me for two years.
00:26:09.000 So then that became Graham Linehan.
00:26:12.000 Was there a specific post or was it just because of your reputation?
00:26:16.000 It was a combination of things.
00:26:18.000 I was causing more trouble for them by, I don't know, what did I do?
00:26:24.000 I went onto the website.
00:26:25.000 This is a funny, this connects me with Alex Jones.
00:26:28.000 Do you know this?
00:26:29.000 No.
00:26:30.000 Did Adam not send you this?
00:26:31.000 No, no, no.
00:26:32.000 Oh, my God.
00:26:34.000 So one of the things I did was I went on an app called Her Social, which is a lesbian app.
00:26:39.000 And I did it to show that men were joining these apps.
00:26:43.000 And they weren't, you know, some of them would put on a bit of lipstick, but most of them were just, they would look like you and me.
00:26:48.000 You know?
00:26:50.000 And they go onto this app and they say they put down their pronouns as she, her.
00:26:50.000 Yeah.
00:26:54.000 Oh, boy.
00:26:55.000 And they call themselves lesbians.
00:26:56.000 And if a lesbian complains about this, they're booted off the site.
00:27:01.000 Okay?
00:27:02.000 So I decided I would go on and call myself she, her and go on the site.
00:27:07.000 So I did it.
00:27:09.000 And then I had some friends who put me in Photoshop and did me in different outfits.
00:27:14.000 And one of them, I looked like my mother in the 60s.
00:27:16.000 She's wearing kind of Jackie Kennedy pink beret.
00:27:20.000 And Alex Jones was interested in the same story.
00:27:23.000 And I think I sent the link to you, Jamie, but like, oh yeah, here it is.
00:27:28.000 Alex Jones.
00:27:29.000 That's me.
00:27:30.000 That's a lesbian app.
00:27:30.000 I'm going to be honest with you.
00:27:32.000 I'm not usually for transvestites and stuff.
00:27:36.000 This one here.
00:27:37.000 This is a.
00:27:38.000 Oh, and you see the symbol they've got here.
00:27:40.000 You know what that symbol is right there?
00:27:43.000 Yeah.
00:27:43.000 These symbols all mean something.
00:27:45.000 What does that symbol mean?
00:27:46.000 I don't know.
00:27:47.000 It doesn't mean anything.
00:27:50.000 Fucking Alex.
00:27:51.000 Yeah, so I fooled Alex.
00:27:52.000 But like, like.
00:27:54.000 But the thing is, that's not outside of what you could find on there.
00:27:57.000 No, absolutely.
00:27:58.000 That's what's crazy.
00:27:59.000 But I was re-cancelled because of it.
00:28:00.000 Because they only reported about me.
00:28:02.000 They said Graham Linehan went on a lesbian app.
00:28:04.000 Oh.
00:28:05.000 Pretending to be a man.
00:28:06.000 Oh, so then you're a pervert.
00:28:08.000 Yeah.
00:28:09.000 Yeah.
00:28:10.000 Oh, you're a hypocrite.
00:28:10.000 So, and of course.
00:28:12.000 And none of my arguments are making it to the mainstream press because.
00:28:12.000 Yeah.
00:28:18.000 God, I wish I knew about this earlier.
00:28:20.000 I wish I'd sent you my notes.
00:28:22.000 The scandal never made it to the States except for Adam telling me.
00:28:26.000 My scandal.
00:28:27.000 Yeah, I mean, to like the comedy community in the States.
00:28:29.000 You didn't hear about it.
00:28:30.000 Almost there's too many of them.
00:28:32.000 And then the thing was, with J.K. Rowling, that was so big that they went for the queen.
00:28:38.000 They're trying to take down literally the most successful author of human history.
00:28:43.000 Yeah.
00:28:44.000 Hasn't she sold more copies than the Bible at this point?
00:28:47.000 Exactly.
00:28:48.000 Is that accurate?
00:28:49.000 Oh, I know, probably.
00:28:50.000 I think the Harry Potter books have sold more or are at least similar to the sales of the Bible.
00:28:58.000 She just wrote it in her lifetime.
00:29:00.000 Bible's been around for 2,000 years.
00:29:01.000 2,000 years Head Start.
00:29:03.000 Yeah.
00:29:04.000 And they went for her anyway.
00:29:05.000 They went in for her.
00:29:06.000 And there was like seven...
00:29:11.000 Best selling series in history.
00:29:13.000 So Harry Potter series alone is exceeding 600 million copies sold worldwide.
00:29:19.000 Yeah, it's crazy.
00:29:21.000 And, you know, and the kids who read her books are the ones trying to cancel her.
00:29:27.000 The kids who read her books aren't taking on board any of the lessons of the books.
00:29:31.000 It's very strange.
00:29:32.000 Well, it's hard to be courageous.
00:29:34.000 It's hard to step outside of the narrative.
00:29:37.000 And when there's a very forceful narrative that's being pushed, like, you know, what Elon likes to call the woke mind virus, like whatever that thing is that has like these very clear rules that you must follow.
00:29:50.000 Like people get real scared.
00:29:52.000 And they get real aggressive when they get real scared.
00:29:52.000 Yeah.
00:29:55.000 They don't want to get canceled themselves.
00:29:57.000 So they become some sort of an enforcer for these ideologies.
00:30:02.000 Yeah.
00:30:02.000 And it gets contagious and it turns into some sort of a weird culture war that's akin to a religious war.
00:30:09.000 They call it, the other side call it a culture war to try and make it only that, right?
00:30:18.000 You know, they talk about it as a culture war to try and kind of keep it in a framework they understand.
00:30:28.000 But it's not a culture war, really.
00:30:29.000 It's like it's women's lives.
00:30:32.000 You know, you're talking about 51% of the population.
00:30:35.000 You know, they fought for these rights 100 years ago.
00:30:38.000 And now they're trying to take them away by stealth.
00:30:41.000 There's a nurse in the UK at the moment.
00:30:44.000 Her name is Sandy Peggy.
00:30:46.000 And she had a doctor, six foot two, rugby playing doctor, who had started identifying as a woman, I think in 2022, right?
00:30:56.000 Using women's toilets in an NHS hospital every time.
00:31:00.000 She was bothered by this, but she tried not to say anything, okay?
00:31:03.000 Until finally she had her period and she went in and he was there and he asked her to leave.
00:31:08.000 She's now going through the whatever we tribunal.
00:31:12.000 Wait, he asked her to leave.
00:31:13.000 No, sorry, I put that wrong.
00:31:15.000 She asked him to leave.
00:31:16.000 Oh, I see.
00:31:16.000 And he refused.
00:31:18.000 He took it up as a complaint and now she's going through a work tribunal.
00:31:22.000 Not him.
00:31:23.000 He's not in trouble for going into women's toilets.
00:31:25.000 She's in trouble.
00:31:26.000 And that's the whole of the UK at the moment.
00:31:29.000 How did this happen?
00:31:31.000 You're an intelligent guy and you've had eight years to think about this.
00:31:35.000 What do you think happened where people lost all ability to objectively analyze all the various little things that are at work in this?
00:31:35.000 Yeah.
00:31:45.000 It's partly a problem with the internet, I think.
00:31:48.000 I think it's, first of all, the internet spread it.
00:31:50.000 One of the things that happened was there was a real supercharged moment for trans ideology when Tumblr banned porn because all the trans-identified kids who were all over Tumblr and porn was a big thing on Tumblr.
00:32:07.000 I remember Tumblr.
00:32:08.000 Yeah, yeah, it was a big kind of visual site.
00:32:11.000 I never used it.
00:32:12.000 Did you ever use Tumblr?
00:32:13.000 Didn't use it, but I know what he's talking about.
00:32:16.000 Yeah, it was big for generations.
00:32:18.000 Was it a social media thing?
00:32:19.000 Yeah, it was big amongst teenage girls, of course.
00:32:21.000 So you've already got a worry there, you know.
00:32:25.000 Oh, now I remember it.
00:32:26.000 Yeah.
00:32:27.000 Okay.
00:32:27.000 And they all came over to Twitter, and that's the moment when Twitter became extremely toxic in terms of talking about this issue.
00:32:34.000 And so there was a kind of a double thing going on.
00:32:37.000 Like, I remember when I was talking to people I knew about the issue, they simply couldn't talk about it.
00:32:43.000 It was the strangest thing.
00:32:44.000 They get scared.
00:32:45.000 I know, but even in personal one-on-one, they can't talk about it.
00:32:50.000 Well, it's religion, but it's showing itself in a new form.
00:32:53.000 Yes.
00:32:53.000 It's religion for secular people.
00:32:55.000 It's a religion.
00:32:56.000 It's a religion that's terrifying because the consequences of not obeying are you get ostracized and you get attacked and you get deplatformed and de-banked and do this and de that and labeled a bigot.
00:33:08.000 I had a West End musical ready to go based on Father Ted.
00:33:14.000 It was like, you know, you can't really ever guarantee a hit in terms of musicals, but it was the closest thing to a guaranteed hit you could get.
00:33:22.000 It was my pension.
00:33:22.000 I'd worked on it for about three or four years.
00:33:25.000 They cancelled it.
00:33:26.000 We had it up on the feet.
00:33:27.000 We had it up on its feet.
00:33:29.000 We had songs written.
00:33:32.000 We'd even performed it in front of audiences a couple of times just to generate excitement and interest.
00:33:38.000 It was ready to go.
00:33:39.000 So how did so many people go along with it where you can't?
00:33:44.000 How is it that you can make this reasonable argument on this podcast about why you think this is the case and what you think is going on?
00:33:54.000 And this is why you stood up against it.
00:33:56.000 And why can't there be some sort of a logical debate about this?
00:34:02.000 Like, how is this one issue so insanely third rail where you can't even touch it?
00:34:10.000 Like people don't even try to touch it.
00:34:13.000 There's a few reasons for that.
00:34:14.000 This episode is brought to you by ZipRecruiter.
00:34:17.000 There is such a thing as having too many options to choose from.
00:34:19.000 Like when you're scrolling on the TV trying to find something to watch or have you been to one of those ice cream shops where they have hundreds of different toppings to choose from.
00:34:30.000 It's overwhelming.
00:34:31.000 The same thing can happen when you're hiring and you get inundated with applications.
00:34:36.000 Well, it's time to stop stressing and use ZipRecruiter instead.
00:34:40.000 Their innovative resume database can help you find and connect with the best people for your role.
00:34:46.000 Try it for free now at ziprecruiter.com slash Rogan.
00:34:51.000 What makes ZipRecruiter's resume database so special is the advanced filtering feature.
00:34:57.000 You can use it to hone in on exactly what you're looking for from the hundreds of thousands of resumes that are uploaded monthly to the site.
00:35:06.000 And when you find a potential candidate, you can unlock their contact info instantly.
00:35:12.000 Skip the candidate overload.
00:35:14.000 Streamline your hiring with ZipRecruiter.
00:35:16.000 See why four out of five employers who post on ZipRecruiter get a quality candidate within the first day.
00:35:23.000 Just go to this exclusive web address, ziprecruiter.com slash Rogan.
00:35:28.000 Again, right now, try it for free.
00:35:31.000 Again, that's ziprecruiter.com slash Rogan.
00:35:35.000 ZipRecruiter, the smartest way to hire.
00:35:37.000 One of the main reasons is that the language of this movement is so deliberately obscure.
00:35:43.000 Like they did a poll recently.
00:35:45.000 They found out that when people were talking about trans women in women's sports, a lot of people, I think the majority of people, I'm not sure what the percentages were, but the majority of people thought they were talking about trans-identified females in women's sports.
00:35:59.000 Oh boy.
00:36:00.000 Yeah.
00:36:01.000 And in fact, even now, when you say trans women, some people are thinking trans men.
00:36:05.000 And I have to tell, I have to sometimes tell people the way to do it is think trans means opposite.
00:36:10.000 So if it's trans man, it's a woman.
00:36:11.000 If it's a trans woman, it's a man.
00:36:13.000 That's all it means.
00:36:14.000 It just means opposite.
00:36:15.000 And so this language, which is constantly being used.
00:36:18.000 If you see a press report about a, you know, this happens all the time.
00:36:24.000 You see a press report that says something like, I saw a great one that said something like, woman takes cocaine and then kills Alsatian or something like this.
00:36:33.000 And it's only that we know, it's only that me and the feminists who are fighting this know that it's a man that tell me it's a man.
00:36:40.000 Every other person reading that newspaper thinks they're talking about a woman.
00:36:44.000 It happens all the time.
00:36:45.000 It's happened here.
00:36:46.000 Yes.
00:36:46.000 It's happened here when a trans person has done something.
00:36:48.000 They call it a she, but it's a man that did it.
00:36:51.000 Yeah.
00:36:52.000 It's very strange.
00:36:52.000 Yeah.
00:36:54.000 And it's the press.
00:36:55.000 It's the press are, I mean, really when you say, why isn't it possible to be talked about?
00:37:01.000 It's because the press are helping confuse people.
00:37:04.000 You know, the press are actually aiding.
00:37:06.000 Like if you get a pedophile and you report him to be a man, oh, sorry, a woman when he's actually a man, then it's even harder to step back and go, we shouldn't have done that because you've actually already committed a terrible sin against journalism.
00:37:23.000 You know, you're not telling the truth.
00:37:23.000 Right.
00:37:25.000 You're not being accurate.
00:37:26.000 Yeah.
00:37:27.000 And it's just so strange that this is so potent that it allows people to give up those, give up their journalistic integrity.
00:37:34.000 Yeah, it's extraordinary.
00:37:35.000 But some of them don't even have any, for the first, in the first place, a lot of, like, one of the things that happened when I started talking about this is I started noticing, like, there was a magazine in England called Total Film, and that was calling me a bigot.
00:37:49.000 And all these different, and my old magazines that I worked for were calling me a bigot.
00:37:53.000 And then you see photographs of the guys, and it's always, you know, they've always got black fingernail polish, and they think they're a new kind of human.
00:37:59.000 You know, it's like you're not a new type of human being if you're worried.
00:38:03.000 The first thing about the internet has allowed them to all group up.
00:38:05.000 Yeah.
00:38:06.000 Whereas before the internet, it's a very small percentage of people that have autogynephilia or that, you know, fall into those categories.
00:38:14.000 And we've always kicked those people out of women's rooms.
00:38:17.000 And this is one of the really important things when you're talking about like trans bigotry.
00:38:22.000 It's only about men.
00:38:25.000 It's not about trans men.
00:38:26.000 No.
00:38:27.000 No one cares.
00:38:28.000 Well, here's the thing, right?
00:38:29.000 Trans men going into the men's room.
00:38:31.000 First of all, first of all, right?
00:38:33.000 That's one of the earliest kind of smears against the feminists fighting this, who are all in the UK, by the way, left-wing women, classic left-wing environmental, environmentalists.
00:38:44.000 No, they're kooks.
00:38:46.000 No, but childless cat ladies, as J.D. Vance calls them.
00:38:50.000 A bunch of kooks.
00:38:51.000 No, these are the good guys.
00:38:53.000 These are the good guys.
00:38:54.000 And they've been, and you know, but they've been smeared as right-wing bigots, you know, even though they spent all their life fighting things like Section 28.
00:39:00.000 So the people that were, yeah.
00:39:00.000 Oh, I see what you're saying.
00:39:02.000 You know, there's different types of feminism, and I don't think that's sometimes appreciated in conservative society.
00:39:07.000 Well, you know Meghan Murphy?
00:39:08.000 Yeah, absolutely.
00:39:09.000 Good friend of mine.
00:39:10.000 She's great.
00:39:11.000 She's a legit feminist.
00:39:12.000 Yeah.
00:39:12.000 Like legit.
00:39:13.000 Yeah, I'm saying.
00:39:14.000 Hey, this is an infringement on women's spaces.
00:39:17.000 And immediately, everything, same thing as you, called a bigot. But she's in a different space because she's on the internet.
00:39:24.000 But she got kicked off of Twitter.
00:39:25.000 She got kicked off of Twitter.
00:39:27.000 She was all stronger than me.
00:39:28.000 Crazy.
00:39:28.000 Five years.
00:39:30.000 And being kicked off of Twitter allowed people to further lie about me online until my reputation was completely destroyed.
00:39:39.000 So I went in for a meeting with the people who produced the Father Ted musical, who also produced Father Ted back in the day.
00:39:46.000 And I walked in and everybody, I saw someone I worked with for years, a runner who had grown up with me as I worked with them on different productions, just looking at me like Elliot Gould.
00:40:00.000 Who was it?
00:40:00.000 Elliot Gould, what was the guy's name?
00:40:02.000 At the end of Invasion of the Body Snatchers.
00:40:05.000 Oh, yeah, yeah.
00:40:06.000 And I thought, what the hell was that look?
00:40:08.000 And then I went into the office and they offered me £200,000 to walk away from the musical.
00:40:13.000 Wow.
00:40:14.000 And they thought I would take it because I was so desperate because I had lost every bit of work I had.
00:40:20.000 And I thought about it.
00:40:22.000 And at one point I said to them, well, as long as I can come in and watch the occasional rehearsal just to see if it's going well, you know.
00:40:30.000 And they said, no, we want a clean break.
00:40:33.000 Wow.
00:40:34.000 And I had brought them the idea.
00:40:36.000 I had more or less written the whole thing.
00:40:40.000 And they just thought they could do that to me.
00:40:43.000 So I thought, no, I'm not taking part in any further efforts to blacken my own name.
00:40:48.000 So I said, no, I'm not doing it.
00:40:50.000 And they won't make the musical now.
00:40:53.000 So I know the internet was a part of this.
00:40:56.000 But how did it get so kooky?
00:41:00.000 How did it get so kooky where people are willing to put women in these vulnerable positions because they don't want to offend this entirely tiny, very, very vocal part of the world?
00:41:13.000 Because as you say, because these tiny, very vocal, very, and you know, there's a, I want to make this clear.
00:41:19.000 There are a lot of trans-identified people who are completely sane.
00:41:23.000 Of course.
00:41:23.000 Who know what sex they are.
00:41:25.000 Of course.
00:41:25.000 Who are not trying to impose themselves in places where they're not wanted or where they would disturb or frighten women.
00:41:32.000 They're great, and I'm friends with a lot of them.
00:41:34.000 And they've existed forever.
00:41:36.000 Well, you know, when they say they're different, but there's always been people like that in the world.
00:41:39.000 Sure.
00:41:40.000 But, you know, when you come to the current iteration of the word trans people, what does it mean then?
00:41:45.000 Like, you know, when you're talking about just transvesticism.
00:41:48.000 Right, right, right, right.
00:41:49.000 That's where it gets squirrely.
00:41:50.000 Existed wherever they wear each other's furs, you know, in the caves.
00:41:55.000 It's like it's just clothes.
00:41:57.000 Right, but there's been historical tales forever.
00:42:00.000 There was actually a famous one about from the old west about this guy that was married to this woman.
00:42:05.000 And then when the woman died, he was out of town.
00:42:08.000 And the doctor found out that this woman that he was married to was actually a man.
00:42:12.000 Oh, sure.
00:42:12.000 And then, so then he committed suicide.
00:42:14.000 Oh, right.
00:42:15.000 Because he couldn't let everybody know that he was banging a dude this whole time.
00:42:18.000 Yeah.
00:42:18.000 But my point is, there's always been people that identified as a woman, but there also has always been perverts.
00:42:24.000 And so to deny the existence of one while pushing the other, it's like, yes, yes, yes.
00:42:29.000 I agree.
00:42:30.000 There have been people that feel like they're in the wrong body and the wrong gender.
00:42:33.000 That's always existed.
00:42:34.000 It's a reality of human civilization.
00:42:36.000 Also, what's that?
00:42:37.000 Yeah.
00:42:38.000 That's a guy in a dress with a hard-on.
00:42:40.000 Like, autogynephilia is a real thing.
00:42:42.000 Men get turned on by dressing up like women.
00:42:45.000 Also, there's certain perverts that don't want to wear a dress, but they know if they do wear a dress, now they can get into the women's room.
00:42:51.000 They're going to do that too.
00:42:52.000 Of course they are.
00:42:53.000 If you put the line here, men will come up to that line.
00:42:56.000 Exactly.
00:42:56.000 If you put it up here, men will come up to that line.
00:43:00.000 You can't do it under the guise of compassion because, again, it's only about men.
00:43:06.000 When you talk about trans men, no one is complaining.
00:43:10.000 Let me tell you a little bit.
00:43:11.000 Let me tell you something, a little something about that, right?
00:43:11.000 In any way.
00:43:15.000 As you point out, they never talk about trans men.
00:43:17.000 And in fact, when you think about it, who are the famous trans men?
00:43:20.000 Very, very few of them.
00:43:21.000 Elliot Page.
00:43:22.000 Elliot Page.
00:43:23.000 A few other divers.
00:43:24.000 That's the most famous.
00:43:25.000 Yes.
00:43:26.000 Because she was a famous actress and then she became Elliot Page.
00:43:26.000 Right?
00:43:29.000 Yeah, well, like, actually, let me tell you something about Elliot Page's voice that'll be of interest.
00:43:35.000 But what was my point going to be?
00:43:39.000 Oh, yeah.
00:43:40.000 Trans men, out of the whole trans deal, get the worst deal out of it.
00:43:45.000 Trans women, all a man has to do is wear a dress and he is suddenly a trans woman, right?
00:43:51.000 But trans-identified women, they get double mastectomies, hysterectomies in their 20s and 30s, you know.
00:44:00.000 Every single young woman on testosterone will go into early menopause.
00:44:04.000 Early menopause brings with it a risk of dementia, incontinence, itching, and all sorts of fucking problems you don't want to put up with when you're young. These young women think they're going to be young men and they're actually turning into old women.
00:44:18.000 And no one has told them this, Joe.
00:44:21.000 Early menopause.
00:44:22.000 I know it makes a lot of them infertile for life.
00:44:25.000 You know, it's one of the things that happens is when you take testosterone, your ovaries fuse to, I'm not sure, some other part of your internal organs.
00:44:25.000 Oh, yeah.
00:44:36.000 And that means it becomes infected.
00:44:38.000 And that means, and that's why you see so many trans men having to have hysterectomies, you know?
00:44:44.000 If a woman has her breasts removed and then, if she's lucky enough not to have been sterilized by the drugs, goes on to have a child later on in life, when the child cries, the tissue in her breasts will ache, because there's always tissue left behind after those operations.
00:45:03.000 And it will ache because it wants to feed the baby, but they can't.
00:45:07.000 No one tells these kids that.
00:45:10.000 The younger men who are, I met a detransitioner, his name's Richie Tulip.
00:45:14.000 He told me that there are, oh, actually, no, let me stick to trans men for a moment.
00:45:19.000 You know, the only time that trans men get famous in the same way that trans women do is if they get pregnant, right?
00:45:27.000 And then it's like they're on the cover of Time magazine or something.
00:45:30.000 Because, oh my God, a bearded pregnant lady.
00:45:30.000 Yeah.
00:45:33.000 And it's just, we've always known it.
00:45:36.000 We've seen fairgrounds with bearded ladies.
00:45:38.000 It's just testosterone.
00:45:40.000 It's just an excess of testosterone.
00:45:42.000 There's nothing magical or great about it.
00:45:46.000 In fact, it's very dangerous for women on testosterone to get pregnant because they could pass on.
00:45:55.000 I mean, this is how horrible it is.
00:45:57.000 There was a study in the UK published by a gender sociologist, I think she is, who works for Sheffield University.
00:46:05.000 And the study said, this is, what's her name?
00:46:09.000 Sally Hines.
00:46:10.000 And the study said that even if there's a risk of deformity to a baby, a trans-identified woman should continue taking testosterone because there was too much of an emphasis on babies born with normative bodies.
00:46:26.000 Oh my God.
00:46:27.000 Oh my God.
00:46:29.000 So preventative medication is a denial.
00:46:35.000 It's just insane.
00:46:35.000 Yeah.
00:46:35.000 Yeah.
00:46:37.000 Preventative medication would be a denial because you're denying the existence of people with disabilities as if they're not real.
00:46:45.000 They're not equal.
00:46:46.000 It's something like that.
00:46:47.000 That is so crazy.
00:46:49.000 Basically, they can, you know, that's another thing with this debate.
00:46:52.000 And there's other, there's, well, anyway, sorry, I want to stick to trans men.
00:46:56.000 But just that language is so Orwellian.
00:46:59.000 Yeah.
00:46:59.000 But if you look, look up.
00:47:00.000 How crazy.
00:47:01.000 But what a crazy way to justify potentially harming a child.
00:47:05.000 Yeah.
00:47:05.000 So we're putting too much emphasis on children that aren't harmed.
00:47:08.000 Her self-identity is more important.
00:47:11.000 That is so nuts.
00:47:12.000 Yeah.
00:47:13.000 And I've been trying to tell people that this has been going on for years.
00:47:17.000 That's the second most shocking story I know in this fight.
00:47:21.000 Will I just tell you this first one?
00:47:23.000 Sure.
00:47:24.000 Okay.
00:47:24.000 You know WPATH?
00:47:26.000 WPATH.
00:47:26.000 What is that?
00:47:27.000 WPATH is meant to be the world leader for trans healthcare.
00:47:31.000 It is where the whole world gets their orders for how to treat trans people.
00:47:37.000 Okay.
00:47:39.000 It is.
00:47:41.000 This is going to blow you away, Joe.
00:47:43.000 So there's a woman named Mia Hughes, and she published a piece called, she published a study called the WPATH files.
00:47:52.000 It hasn't been reported on anywhere.
00:47:53.000 No one is talking about it.
00:47:56.000 It came and went, causing barely a ripple.
00:48:00.000 She found out that WPATH, which briefly tried to make eunuch a gender identity, right?
00:48:07.000 She found out that they were linking to a website called the Eunuch Archives.
00:48:12.000 And the Eunuch Archives is mainly a repository of about, I don't know, I have it written down, but it's something like 8,000 short stories, something like that.
00:48:22.000 And they're just pornography about people cutting their dicks off.
00:48:26.000 WPATH linked to this site.
00:48:29.000 Not only that, but something like 40% of the stories are tagged minor.
00:48:37.000 Okay?
00:48:38.000 So these are the people who are cutting off young men's dicks and they are sharing erotic pornography about cutting off young men's dicks.
00:48:48.000 And Jamie has all the links.
00:48:50.000 This may sound that I'm pulling it out of my ass because it's so hard to believe.
00:48:55.000 That's another problem we have.
00:48:56.000 Some of these stories are so hard to believe.
00:48:58.000 It's so hard to inform people because you're only going to hear about something like this on a podcast.
00:49:02.000 Yeah, exactly.
00:49:03.000 The press won't report on it.
00:49:05.000 And when you think about it, and BBC, the BBC deliberately ignores this.
00:49:09.000 The BBC is outrageous on this issue.
00:49:13.000 But look at what they did with Jimmy Savile.
00:49:15.000 Oh, yeah, yeah.
00:49:16.000 But this is like forever.
00:49:17.000 But this is almost worse than Jimmy Savile because there's more kids being hurt, you know?
00:49:21.000 And the UK is addicted to ignoring scandals and to hurting, you know, to allowing children to be hurt.
00:49:28.000 You know, what's his name?
00:49:29.000 Keir Starmer, the UK prime minister.
00:49:33.000 When he came in, he said he would end the culture wars.
00:49:35.000 He hasn't ended the culture wars.
00:49:36.000 He hides from them while ordinary people still have to fight in court.
00:49:40.000 People like me and various women who are fighting this nonsense.
00:49:43.000 He's an absolute coward on this issue.
00:49:45.000 But the thing about the WPATH files is WPATH, this place that's sharing pedophilic castration pornography, is the world leader on trans healthcare.
00:49:59.000 Okay?
00:49:59.000 They're the ones that are bowed to on everything in this.
00:50:02.000 And they're the reason why doctors all over the world are giving these protocols to kids because there's a thing called the chain of trust that Mia Hughes writes about, which is an ear, nose, and throat specialist has to believe that other doctors know what they're doing.
00:50:20.000 And they have to believe that the head of any particular discipline knows what they're saying.
00:50:27.000 And what's happening with WPATH is they're issuing all this stuff.
00:50:31.000 And it's all just crazy nonsense.
00:50:33.000 One thing in the WPATH files they found out was there was one letter from, I think, one of the doctors associated with WPATH.
00:50:41.000 And she said, I've only ever refused a transition diagnosis once.
00:50:46.000 And that's because the patient had a psychotic episode in my office.
00:50:52.000 That's the only reason she didn't say, yeah, you're a man, because she was having a psychotic episode.
00:50:59.000 They tried to transition a homeless guy.
00:51:02.000 So when you think about it, he has the surgery, and the next day he's back in the streets with a wound that needs to be cleaned.
00:51:08.000 They tried to transition a homeless guy.
00:51:09.000 That's the WPATH files.
00:51:12.000 Is it their goal to just transition anybody?
00:51:16.000 It's purely a kind of ideological insanity.
00:51:19.000 Like one of the people who is involved in this, her name always jumps out of my head.
00:51:26.000 I can't remember her name, but she suggested that a baby who fiddles with the buttons on their babygrow is trans because they're indicating they don't like this babygrow.
00:51:36.000 They want to wear a male one or whatever, you know.
00:51:40.000 Oh my God.
00:51:40.000 That woman was involved in the satanic panic scandal.
00:51:44.000 So she's moved from one insane, you know, mass delusion to another.
00:51:50.000 What was the satanic panic scandal?
00:51:52.000 Oh, do you not know this?
00:51:53.000 This was like 80s, I think, in the middle of kind of Midwestern America.
00:51:58.000 There was a lot of places that suddenly started believing in cults that were worshiping the devil and having sex with children.
00:52:07.000 And the thing about it was it was before the internet.
00:52:09.000 So it didn't actually spread that far.
00:52:11.000 You know, there were a few towns where it broke out.
00:52:13.000 Do you remember those three kids who were in jail for years for something they didn't do?
00:52:17.000 And they nearly tried to kill them.
00:52:18.000 And it was found out.
00:52:19.000 And they were just goths, you know, the stuff like that.
00:52:22.000 And it didn't break out of Middle America because the internet wasn't there.
00:52:26.000 But I have to think now, if the satanic panic broke out again, you would certainly know about it because it would be all over the world.
00:52:33.000 And this person was involved in this?
00:52:35.000 Yeah.
00:52:36.000 She was something to do with a military base.
00:52:36.000 Yeah.
00:52:41.000 I wish I could remember her name.
00:52:43.000 She, as I say, did this thing about babies popping the buttons on their babygrows, you know, which is... Yeah, this is a crazy person.
00:52:55.000 So, a crazy person who was a part of the satanic panic is now telling you that a baby fiddling with its buttons is probably trans.
00:53:03.000 Yeah, and people are listening, and, wow, there you have it.
00:53:09.000 I mean, one of the things when you said, why has this happened?
00:53:12.000 Like, another thing that's happened is you've got to understand there's millions of things going on at the same time.
00:53:18.000 A lot of very bad men have been empowered, okay?
00:53:20.000 A lot of very bad men know they can walk into a female-only space, and they may even get a fucking payout if someone complains, right?
00:53:30.000 Right?
00:53:31.000 But then there's a lot of really lovely kids who are grown up and have been told, like boys who've been told that boys are evil, and they feel guilty because they think of women in a sexual way.
00:53:44.000 And, you know, there's stories of boys being castrated because of that.
00:53:50.000 They do not want to associate themselves with what they see as male toxicity, you know?
00:53:56.000 So, anyway, oh, yeah.
00:53:59.000 And then there's other, there's other things.
00:54:00.000 There's like, you know, people like that grifter who said that about the babygrows.
00:54:05.000 There's all sorts.
00:54:06.000 It's like a gold rush.
00:54:08.000 If you create a completely senseless system that has no rules, where anyone can be a woman if they put on a dress, it's just a complete free-for-all.
00:54:21.000 It's going to be a gold rush.
00:54:23.000 It's not just that.
00:54:24.000 If you're part of that clan, you get to be very aggressive about defending these ideas.
00:54:30.000 Yeah.
00:54:30.000 To the point where you're allowed to hold up pistols and say we shoot TERFs.
00:54:34.000 Yeah.
00:54:34.000 I saw some of that stuff going on in the UK where it's really apparently very hard to get a gun.
00:54:38.000 Yeah.
00:54:39.000 But shoot TERFs.
00:54:40.000 Yeah.
00:54:40.000 No, that's big.
00:54:41.000 Yeah, I'll tell you what a really funny thing as well.
00:54:43.000 You see, you see British people holding up posters saying arm trans people.
00:54:48.000 And it's because they're completely Americanized.
00:54:51.000 This term, for people who don't know what it means, stands for trans-exclusionary radical feminist.
00:54:56.000 So you're talking about shooting a woman.
00:54:58.000 So there's a man who identifies as a woman holding a pistol who's about to shoot the real woman and this is openly promoted.
00:55:06.000 Yeah.
00:55:06.000 There's nothing else like that in the world.
00:55:09.000 Imagine if that was instead of TERFs, which is just really a woman saying a man with a penis shouldn't be allowed to be in the women's room.
00:55:17.000 That will turn you into a TERF.
00:55:19.000 And this is a person with a pistol saying shoot TERFs.
00:55:23.000 And that's okay.
00:55:23.000 Yeah.
00:55:23.000 Yeah.
00:55:24.000 And also, trans-exclusionary is such a lie because these women, they accept every woman, no matter how they identify.
00:55:33.000 So if a trans man comes in and says, I need help or something that's happened to me or whatever, these groups won't turn them away.
00:55:39.000 They're not trans exclusionary.
00:55:40.000 Right.
00:55:40.000 And they're male exclusionary.
00:55:42.000 A trans man using the women's room.
00:55:44.000 Yeah.
00:55:46.000 Like, if a trans man was using the women's room, what would the women... you know, if we do that, it gets weird, right?
00:55:46.000 But that's a weird thing.
00:55:54.000 People know, Joe.
00:55:56.000 These are small, tiny women with unconvincing facial hair.
00:55:59.000 They're just a giant one.
00:56:00.000 One big Viking lady who gets on the juice.
00:56:03.000 I don't know.
00:56:04.000 I could see how it could be an issue. There's people that turn into trans men that are very passable as men.
00:56:12.000 Sure.
00:56:12.000 Much more so, I think, much more likely than people that are male that turn into women.
00:56:19.000 Oh, definitely.
00:56:20.000 Facial hair has such a mesmerizing effect.
00:56:23.000 It's definitely going to change your opinion.
00:56:24.000 Especially because there's kind of feminine men.
00:56:26.000 Yeah.
00:56:27.000 And they'll fall into that category.
00:56:29.000 But here's the thing.
00:56:30.000 They also have a bunch of things in common.
00:56:32.000 They're all lower than the average height of men.
00:56:34.000 Right.
00:56:35.000 And also their voices are croaky.
00:56:38.000 Have you ever noticed Elliot Page's voice?
00:56:40.000 It's very croaky now.
00:56:41.000 It's like this.
00:56:43.000 It's a little bit like Kennedy going there.
00:56:45.000 Not that bad.
00:56:46.000 Well, you hear about it for a little croakiness.
00:56:49.000 Female bodybuilders were the original trans men.
00:56:51.000 Oh, okay.
00:56:52.000 Because they were injecting testosterone.
00:56:54.000 Listen, female bodybuilders, we knew a long time ago that you could turn a woman into a man.
00:57:00.000 Yeah, well, like, at least visually.
00:57:02.000 Yeah, sure.
00:57:03.000 So, like, go give me some female Miss Olympias.
00:57:07.000 I'm going to show you something that's literally not possible.
00:57:10.000 Okay.
00:57:10.000 This is not physiologically.
00:57:12.000 It's not possible for a woman to get this muscular because it's not a woman.
00:57:16.000 It's a science project.
00:57:18.000 So it's a biological science project.
00:57:20.000 And so this is like.
00:57:23.000 Give me some crazy ones.
00:57:24.000 You know, in East Germany, the women who were on steroids and so on.
00:57:29.000 They weren't told.
00:57:30.000 And on testosterone.
00:57:31.000 Oh, I'm sure.
00:57:31.000 The government.
00:57:32.000 And show me one from like the 90s.
00:57:36.000 Okay, how about that young lady in the lower right-hand corner?
00:57:40.000 Look at that.
00:57:40.000 Look at those muscles, bro.
00:57:42.000 Oh, Jesus.
00:57:43.000 Bro, that is so crazy.
00:57:44.000 But it does look like, I gotta say, it does look feminine in like the hips.
00:57:49.000 No, sure.
00:57:49.000 But it looks almost like a superhero.
00:57:53.000 That's another thing.
00:57:53.000 Hips don't lie, you know?
00:57:55.000 But there's another thing that happens.
00:57:57.000 How about that one, Jamie?
00:57:59.000 Scroll back.
00:58:01.000 Scroll up a little.
00:58:02.000 Yeah, that one right there, right above that one.
00:58:04.000 The one where her hands are on her hips in the middle, the blue one.
00:58:07.000 Yeah, look at that one.
00:58:08.000 Yeah.
00:58:09.000 Bro.
00:58:10.000 Look at those arms.
00:58:11.000 But that's just, you know, that's just, you know, someone who's doing that for a competition.
00:58:15.000 That's, you know, and not pretending to be a man.
00:58:17.000 But my point is, like, if that person did identify as a man and decided to start using the women's room because, you know, of their biological sex, that would freak some women out.
00:58:17.000 Right.
00:58:30.000 And to be honest with you, that's a kind of a gotcha that's often pulled out.
00:58:34.000 But in the end, the vast majority of the time, you know, and it's...
00:58:38.000 Yeah, exactly.
00:58:39.000 Also, it takes not just steroids to get that big.
00:58:42.000 Yeah.
00:58:42.000 It takes years of being in the gym and steroids for a woman to get that big.
00:58:46.000 But let me tell you what I was going to tell you about testosterone.
00:58:49.000 The croakiness.
00:58:50.000 Yeah, the croakiness.
00:58:50.000 Do you know why that is?
00:58:51.000 Why?
00:58:52.000 Because they have slender women's necks, but their vocal cords have expanded because of the testosterone.
00:58:59.000 And so a lot of trans-identified women have these croaky voices, you know?
00:59:04.000 I thought it was just the deepening of the voice because it took both of those, right?
00:59:07.000 It's probably both, but like the deepening of the voice comes because of things like this.
00:59:11.000 I don't know what Elliot Page sounds like now.
00:59:13.000 There was something else I was going to tell you about this that was really interesting about the oh man, what was it?
00:59:21.000 Sorry.
00:59:21.000 Trans men.
00:59:22.000 It's like eight years of stuff just crowding up in my head.
00:59:28.000 It'll come back to me.
00:59:29.000 It'll come back to me.
00:59:31.000 But yeah, these girls, oh, I know what it is.
00:59:34.000 Here's another fun fact, right?
00:59:34.000 Okay, here's another.
00:59:36.000 Okay.
00:59:37.000 Grinder, the gay men's dating hookup app, okay?
00:59:41.000 Trans men are going onto that app, expecting to be accepted by gay men, okay?
00:59:48.000 And they're not.
00:59:49.000 And again, if the gay men complain, they get thrown off.
00:59:52.000 Oh, no.
00:59:53.000 Gays, get a hold of your stuff.
00:59:53.000 Yeah.
00:59:55.000 Get a hold of your stuff.
00:59:56.000 Don't let them do that to you.
00:59:57.000 But here's what's happening on Grindr.
00:59:59.000 Well, here it is.
01:00:00.000 Here's what's happening on Grindr.
01:00:01.000 Straight men are joining Grindr to predate on those women.
01:00:06.000 On the trans men, the women.
01:00:07.000 Yeah, because some of the women haven't yet...
01:00:12.000 The testosterone hasn't taken over.
01:00:14.000 Oh, so they're catching them while they're vulnerable.
01:00:16.000 They catch them while they're vulnerable.
01:00:17.000 They say, hey, I'm a gay man.
01:00:18.000 Hey, I'm a gay man.
01:00:20.000 Oh, my God.
01:00:20.000 That's so good.
01:00:21.000 And so they're predating on these vulnerable, you know, confused young women who've been told that they are literally now gay men.
01:00:28.000 They're straight women.
01:00:30.000 What percentage?
01:00:31.000 I mean, how many numbers are we talking about where this is a strategy for getting laid?
01:00:37.000 I have seen a forum discussion between two guys who were just kind of sniggering about it amongst themselves.
01:00:42.000 Oh, my God.
01:00:43.000 It's really... I mean, you know, one of the things that gets me about this is that these kids are the kids that I was.
01:00:50.000 You know, they're just strange, not well-adjusted, spend a lot of time reading, maybe, sensitive.
01:00:57.000 A lot of girls who are caught up in this are the most empathetic, imaginative girls, you know?
01:01:04.000 And it appeals to them for some reason.
01:01:06.000 It appeals to them, maybe because they see men as just gliding through life in a way that they can't.
01:01:15.000 And also, they see pornography from when they're kids.
01:01:18.000 And the women in pornography are treated appallingly.
01:01:21.000 And so they are saying, nope, I don't want any of that.
01:01:24.000 And they think they can just.
01:01:26.000 But what they're really doing is they're stepping into a world where, if they're taking testosterone, they have a four times higher chance of having a heart attack than normal women.
01:01:36.000 They are going to die younger.
01:01:38.000 They're going to lose their ability to have children.
01:01:41.000 And no one is talking about it.
01:01:43.000 It's happening in clusters.
01:01:45.000 And that's the most disturbing thing that Abigail brought up.
01:01:48.000 A lot of these girls are diagnosed as autistic.
01:01:51.000 Yeah.
01:01:51.000 And they wind up in these group clusters of six or seven girls, which, statistically speaking, is highly, highly unlikely to be natural.
01:02:05.000 And it leans towards the idea of a social contagion, and no one wants to believe it.
01:02:08.000 She also talks about how when you do take testosterone, when you're young, it elevates your sense of confidence.
01:02:18.000 It alleviates anxiety.
01:02:19.000 It does a bunch of things for you psychologically.
01:02:21.000 It creates more female school shooters.
01:02:23.000 Have you noticed this?
01:02:24.000 It has.
01:02:24.000 It does.
01:02:25.000 This has been happening recently.
01:02:26.000 It's a new thing.
01:02:26.000 Yeah.
01:02:27.000 And it's like, what I can never understand is why there don't exist people in the world.
01:02:33.000 I mean, there are.
01:02:33.000 They are there.
01:02:34.000 There's a lot of them there.
01:02:36.000 But no one on a high level, very few politicians.
01:02:39.000 Trump, in fact, is probably the only one who will say, hang on a sec, this is insane nonsense.
01:02:45.000 We've got to protect these kids.
01:02:47.000 Let's cut it out.
01:02:49.000 Instead, there's this constant, like, oh, yeah, but you know... I mean, a funny thing happened with the Supreme Court recently. It took For Women Scotland, a group in Scotland, years to get through to court.
01:03:00.000 But finally, the Supreme Court said, no, sex means biological sex.
01:03:04.000 You know, it doesn't, you know, Scotland.
01:03:06.000 In law, yeah, in law, sex means biological sex.
01:03:10.000 And all these places have come back saying it's the most, you know, we don't know how we're going to implement this.
01:03:15.000 It's all very confusing.
01:03:16.000 And they're really dragging their heels with it, you know.
01:03:19.000 It's like, you were able to do it for 100 years.
01:03:21.000 How difficult is it to write ladies and gents and put it up in a sign?
01:03:24.000 Right.
01:03:25.000 You know, and enforce it.
01:03:26.000 Put it up on a door and enforce it.
01:03:27.000 You know, they are absolutely hypnotized by this.
01:03:31.000 And they're fully convinced that it is just like gay rights and that they have to be careful because, I mean, one of the things that's happened, for instance, with the police in the UK is that a few years ago, there was a young kid murdered by some racists, a black kid murdered by some racists.
01:03:47.000 And I think, I think it was called the Macpherson report, that came out and described the police as institutionally racist.
01:03:53.000 And they probably were in that way that all cops were racist at one point, you know, or at least, you know, very much not right on.
01:04:02.000 But anyway, it was a big scandal.
01:04:04.000 It had this convulsive effect on the police.
01:04:07.000 And then the police just flipped and they started putting on pride colours on their faces and marching with pride.
01:04:16.000 And I genuinely think that there are police who are complicit.
01:04:20.000 In fact, I sent a video, maybe Jamie can pull it up, but I sent a video of the police actually walking away from a group of kettled women.
01:04:30.000 Trans activists had kettled them in this small space.
01:04:33.000 Their back was up against a railing, and there was a huge crowd of Antifa-type guys screaming at these women.
01:04:41.000 And about four or five police keeping the Antifa guys from the women.
01:04:45.000 And I arrived and I saw them walking away.
01:04:48.000 I saw about six, seven policemen walking away from it.
01:04:52.000 I was like, what the hell's going on?
01:04:54.000 So I think British police are using trans activists to scare women out of fighting for their rights, because they know that if women gather to meet, trans activists will definitely be there to hurt them or harass them.
01:05:10.000 Do you really think that?
01:05:10.000 You don't think that it's just they're scared of the trans activists?
01:05:14.000 No, because they've been advised for years by Stonewall, which was the big gay rights organization in the UK, that these women are bigots and that these women are actually far right.
01:05:25.000 And, you know, and the police believe this stuff because they've had it as training for years.
01:05:29.000 So their training is that these women that are fighting for women's rights, these women are bigots, and you should let the Antifa people have at them.
01:05:36.000 Oh, they wouldn't say that officially, but I believe that's what's happening.
01:05:39.000 I believe they're basically using Antifa to control these women.
01:05:43.000 You know, the flip side of this is ugly.
01:05:47.000 When people rise up against something like this, it gets real ugly and real violent.
01:05:51.000 And that's what scares me the most.
01:05:53.000 Oh, one of the things we're trying to head off is the backlash against transsexuals and gay people who had nothing to do with this, you know?
01:06:00.000 Gay people in particular.
01:06:02.000 There's a lot of my friends that are gay that do not like any of this movement.
01:06:05.000 Yeah.
01:06:06.000 They do not like any of it.
01:06:07.000 It's a homophobic movement.
01:06:08.000 Have you ever heard of anything that's more homophobic than a lesbian with a penis?
01:06:13.000 It's homophobia.
01:06:14.000 That's all it is.
01:06:15.000 And for some reason, people have just been held in this kind of, you know, tractor beam where they're just kind of like going along with it and they're not questioning it.
01:06:25.000 I guess they're worried that what happened to people like me will happen to them.
01:06:30.000 But there's increasingly less of an excuse now.
01:06:33.000 I mean, John Oliver and Jon Stewart both said on their programs that puberty blockers were reversible.
01:06:40.000 That was a dangerous lie.
01:06:42.000 Well, I don't understand Jon Stewart saying that.
01:06:44.000 I have to assume that Jon Stewart was misinformed.
01:06:47.000 Everyone's misinformed.
01:06:48.000 But I have to assume that John didn't look into this because he's super reasonable and very intelligent.
01:06:53.000 Absolutely.
01:06:54.000 And that's why it's crushing when someone like that says something like this.
01:06:58.000 It is simply not true that puberty blockers are reversible.
01:07:01.000 It has a direct impact on the development of the child's penis, and it affects the brain and causes strokes.
01:07:09.000 It is literally chemical castration that they used to give to sex predators.
01:07:14.000 That's what it is.
01:07:15.000 Yeah.
01:07:15.000 That's what it is.
01:07:16.000 Yeah, it's even a form of the drug.
01:07:19.000 A few years ago, they wanted to put Alan Turing on a banknote in the UK.
01:07:23.000 And now they're putting gay kids on the same drugs that chemically castrated him.
01:07:29.000 Yes.
01:07:30.000 And led him to commit suicide, right?
01:07:32.000 Yeah, exactly.
01:07:33.000 The guy who invented the test to figure out whether or not artificial intelligence had reached sentience.
01:07:37.000 Oh, I didn't.
01:07:38.000 Yeah, yeah, yeah.
01:07:39.000 That's the Turing test.
01:07:40.000 I always forget about that.
01:07:40.000 Yeah, yeah, yeah, yeah.
01:07:41.000 I think of it more as the Enigma guy, you know.
01:07:43.000 I mean, but imagine that.
01:07:44.000 Like, what a crazy contribution, more than a footnote in history, that we're currently wrestling with.
01:07:50.000 Like, we're grappling right now with the idea that these things are already sentient.
01:07:53.000 The thing is, we're grappling with things that we shouldn't be grappling with when we're actually on the cusp of what seems to me with AI to be a huge moment in human evolution, right?
01:08:04.000 We're about to move into, like... I always think of it like the thing in Alien, Sigourney Weaver with the JCBs.
01:08:04.000 Yeah.
01:08:16.000 Exactly.
01:08:17.000 I think that's what AI is for humanity.
01:08:19.000 We're going to be able to step into these things and be much more powerful than we used to be.
01:08:23.000 Or we'll be completely irrelevant and we'll fade off, because we're not breeding anymore anyway.
01:08:28.000 Yeah, yeah, yeah.
01:08:29.000 You know, I mean, you think about the amount of people that are having children now, especially in developed parts of the world, it drops off.
01:08:37.000 It always drops off.
01:08:38.000 But like Japan is at the risk of population collapse.
01:08:42.000 South Korea is at the risk of population collapse.
01:08:45.000 Then you factor in microplastics, which significantly affect children's reproductive systems when they're in the uterus.
01:08:54.000 Also, like lower sperm counts.
01:08:57.000 You look at the invention of plastics and the use of plastics, and then the correlation, the giant dip in sperm counts.
01:09:05.000 Wow.
01:09:05.000 We talked about it yesterday in the podcast.
01:09:06.000 There's a woman, Dr. Shanna Swan from Harvard, who wrote a book on this about phthalates.
01:09:11.000 And every person carries literally a fucking spoonful, like a plastic picnic spoon's worth, of microplastics in their brain.
01:09:22.000 Yeah.
01:09:23.000 So, and this stuff is neutering humans.
01:09:25.000 It's also causing, I mean, whether it's causing it or whether there's a correlation.
01:09:29.000 So there's a correlation in a larger number of miscarriages for women.
01:09:34.000 And they think a lot of this has to do with environmental toxins.
01:09:37.000 A lot of it has to do with microplastics.
01:09:38.000 So all this is moving us into this genderless direction.
01:09:42.000 We're going to stop breeding anyway at the same time where artificial intelligence becomes the new alpha life form on Earth.
01:09:49.000 Yeah, but again.
01:09:50.000 We just stop breed off.
01:09:51.000 We just stop breeding.
01:09:52.000 But one of the things, I mean, I just feel like there's all these, there are all these tests ahead of humanity, right?
01:10:01.000 From the things you've just been talking about and all sorts of other things, geopolitical and so on.
01:10:06.000 Why are we wasting time concentrating on this imaginary thing?
01:10:11.000 It's not a real problem.
01:10:14.000 It is a mass delusion spread by the internet.
01:10:17.000 Well, it's a real problem in that men are always a real problem.
01:10:19.000 But my point is...
01:10:24.000 Yeah, yeah.
01:10:25.000 Most high-speed car chases, most assaults on police officers.
01:10:30.000 Most everything.
01:10:31.000 Absolutely.
01:10:32.000 And we have to deal with the reality of life if we are going to take this major evolutionary step as human beings with AI.
01:10:40.000 Because if we continue ignoring the insanity that the internet has brought about with this movement, we're just going to waltz right into the next one.
01:10:51.000 If we don't take time to say, okay, what just happened?
01:10:54.000 Why did it happen?
01:10:55.000 How can we make sure it doesn't happen again?
01:10:57.000 Because it seems to me now that with the trans thing, the human race, with the internet, is hugely vulnerable to these kinds of sense-destroying viruses, or whatever you might call them.
01:11:10.000 I don't know, there should be a word for what the trans movement is, but I think we're so vulnerable to it that we have to start developing antibodies, you know?
01:11:20.000 I agree with you because I do think that something else could come from a different direction, right?
01:11:25.000 So here's what we saw in our lifetime: this same kind of thinking, the same kind of bizarre, violent group thinking.
01:11:31.000 We saw it from COVID.
01:11:33.000 We saw people that turned on their neighbors that were skeptical about the vaccine or people that didn't want to take it.
01:11:38.000 They were called murderers and plague rats and this craziness where people were ostracized from social circles because they weren't vaccinated.
01:11:48.000 And even though, in hindsight, they were directionally correct, right?
01:11:54.000 No one, there's been no course correction, but there was this othering of people for a very simple thing.
01:12:01.000 Like you decide, like if that fucking thing works and you take it, why are you mad at me if I don't take it?
01:12:07.000 If it works, that means you're not going to get COVID no matter what fucking happens to me, right?
01:12:12.000 So it was weird and illogical, just like the trans thing, but violent.
01:12:17.000 And people were terrified because your life was in danger.
01:12:21.000 So you saw the most vile reactions from older populations who were calling for people to be quarantined, rounded up in camps, take away their livelihood, take away their children.
01:12:32.000 You were hearing it from these old, terrified people with fragile health.
01:12:38.000 And they all got really violent about it online.
01:12:41.000 They got really, really extreme in their positions on this.
01:12:44.000 Yeah, yeah.
01:12:45.000 I didn't notice that stuff so much.
01:12:47.000 Oh, in America, it was nuts.
01:12:48.000 In America, it was wild.
01:12:51.000 And it was pushed by these multimedia corporations, these huge news corporations.
01:12:57.000 It was pushed by them.
01:12:59.000 It was all nonsense and propaganda, including Rolling Stone and CNN.
01:13:04.000 Rolling Stone had a whole article that was 100% bullshit about people who were waiting in line at the emergency room for gunshot wounds because there were so many people that were getting treated for horse dewormer because they took too much ivermectin.
01:13:18.000 Wow.
01:13:19.000 Total fabrication with a stock photograph of people waiting in line in August in Oklahoma.
01:13:26.000 This is supposed to be taking place.
01:13:28.000 A stock photograph of people wearing winter coats outside in line because they're waiting for the fucking flu shot.
01:13:34.000 Yeah, yeah, yeah.
01:13:34.000 So it's a lie.
01:13:36.000 It's a lie.
01:13:37.000 And so there's this willingness to lie with no retractions on all the major television networks.
01:13:43.000 Everybody was pushing this one narrative and everyone at home was locked down.
01:13:48.000 You couldn't go to work.
01:13:49.000 You couldn't go to school.
01:13:50.000 And everyone was terrified because this had never happened before.
01:13:53.000 And we got to see how quickly people who are cowards just attacked the others to try to keep themselves safe.
01:13:59.000 It got very rat-like.
01:14:03.000 We live in a village now, really.
01:14:06.000 And I often describe myself as a victim of village gossip on a global scale.
01:14:06.000 When you think about it.
01:14:11.000 That's it.
01:14:12.000 Yeah.
01:14:12.000 That's it.
01:14:13.000 You know, but it's written down, so it seems a lot more real.
01:14:15.000 Yeah, that's the thing.
01:14:16.000 I mean, when the Bible, when the printing press appeared, there was 100 years of chaos as all these sects sprang up, and everyone had a crazy idea about the Bible that they'd come up with themselves.
01:14:28.000 Oh, on the seventh page, it says this on line seven.
01:14:31.000 And that would become a religion, right?
01:14:33.000 And then for a hundred years, these religions were fighting it out.
01:14:36.000 There were pogroms and massacres and Protestantism was formed because of it.
01:14:43.000 We're having a similar moment, right?
01:14:45.000 Yeah.
01:14:46.000 With the internet.
01:14:47.000 Because suddenly every idiot with an opinion is putting it out there.
01:14:51.000 Some of them are great opinions.
01:14:52.000 Some of them are terrible opinions.
01:14:54.000 But there's no difference.
01:14:55.000 And one thing I've noticed with the left is that with bad opinions, all they do is they repeat certain lines over and over again.
01:15:02.000 and you hear these things come up over and over.
01:15:04.000 That's why I reacted slightly earlier, because "trans women have always been with us" is one that you hear a lot, you know.
01:15:10.000 Right.
01:15:10.000 But you also hear, you know, just things designed to shut down the conversation, you know, and you see them over and over again.
01:15:17.000 They're repeated over and over again.
01:15:19.000 Whatever you think of the Gaza situation at the moment, I noticed that it's the same thing, genocide, over and over and over again.
01:15:26.000 In every tweet, they mention the word genocide, you know.
01:15:30.000 And I don't happen to think it is, but like, by the end of it, if you were to stand up against that and say something against it, it's a difficult thing, because you suddenly look like you're for genocide, you're pro-genocide.
01:15:43.000 What is the measure of genocide?
01:15:45.000 Like, when does it become genocide?
01:15:47.000 Like, when you're starving people, when you're bombing indiscriminately, when you've destroyed most of the buildings, when you've killed who knows how many tens of thousands of women and children.
01:15:58.000 Well, do we want to get into this?
01:16:00.000 Yeah, I mean, we got into it a little bit.
01:16:03.000 I just think that, you know, Hamas is, I agree with Coleman Hughes, I think it was on your show, where he says that, you know, Hamas, they've built like a huge network of tunnels underneath people's houses.
01:16:14.000 They put their headquarters in civilian buildings.
01:16:18.000 It's a form of guerrilla warfare that I don't think should be allowed to continue.
01:16:22.000 And I support Israel in defeating Hamas.
01:16:26.000 Is there another way to do it other than to blow up everybody?
01:16:30.000 Is there a way to do it without starving the innocent people?
01:16:34.000 Again, I just don't know how much of this, you know, comes from Hamas.
01:16:39.000 I don't know how much of it is true.
01:16:41.000 I would suggest that all these conversations wait until we finally find out exactly what's happened.
01:16:46.000 At the moment, it's fog of war stuff, you know.
01:16:49.000 But I don't want to get into a big debate.
01:16:51.000 Do you know, have you ever heard of Eric Prince's idea?
01:16:53.000 Do you know Eric Prince?
01:16:54.000 Do you know who he is?
01:16:55.000 Was he the Blackwater guy?
01:16:56.000 Yeah.
01:17:00.000 He had an idea that apparently he floated, but they rejected it.
01:17:05.000 He said, instead of destroying everything like that, you could just flood the tunnel system with seawater.
01:17:05.000 Oh, well, yeah, great.
01:17:06.000 I mean, that was his idea.
01:17:08.000 It's like you could flush them all right out.
01:17:10.000 Like, they're very vulnerable.
01:17:11.000 Yeah, yeah.
01:17:12.000 And I don't know whether or not.
01:17:15.000 I heard him speak of this.
01:17:17.000 I believe it was on the Sean Ryan show.
01:17:18.000 I might be incorrect.
01:17:19.000 He spoke about it on some show.
01:17:22.000 I don't know if it would work.
01:17:23.000 I don't know why they would reject it.
01:17:25.000 Yeah.
01:17:25.000 Yeah.
01:17:26.000 I don't know.
01:17:26.000 But as I say, I don't really want to go into it because it's such a, again, it's a very, very heated debate, and I've only got room for one.
01:17:35.000 I understand what you're saying.
01:17:36.000 I understand what you're saying.
01:17:40.000 When you're dealing with all this, you lost your entire income.
01:17:45.000 You lost everything.
01:17:46.000 You lost your standing in the television world.
01:17:49.000 You lost a lot of your friends.
01:17:51.000 Oh, my friends.
01:17:52.000 No one stuck with you.
01:17:53.000 Two people, Jonathan Ross and Richard Ayoade.
01:17:57.000 Everyone else, all the people I'd given jobs to, made famous.
01:18:01.000 Well, shout out to those two soldiers.
01:18:03.000 Oh, they're great.
01:18:04.000 Richard's great.
01:18:05.000 It's hard.
01:18:05.000 It's hard when someone close to you gets canceled and you're worried about catching strays.
01:18:11.000 Yeah, yeah.
01:18:12.000 Actually, I should say there are a few people that have stood by me.
01:18:15.000 Lissa Evans, wonderful woman who produced Father Ted, and a few other people.
01:18:20.000 She's a big feminist.
01:18:22.000 So I shouldn't be too.
01:18:24.000 But in terms of my actual old life, everyone just dumped me.
01:18:28.000 Crazy.
01:18:29.000 That's the beautiful thing about the comedy community.
01:18:31.000 Like, we expect cancellations.
01:18:33.000 Yeah.
01:18:34.000 You cancel like nobody.
01:18:35.000 You guys are so interesting.
01:18:36.000 I compare you guys to the birds that feed in between the teeth of crocodiles.
01:18:42.000 You guys dance so close to being cancelled, but it doesn't seem to touch you.
01:18:48.000 And it's always great to see.
01:18:50.000 I don't know if you saw, did you see the Triggernometry interview with Kill Tony, Tony Hinchcliffe?
01:18:56.000 I did not.
01:18:57.000 You got to watch that.
01:18:58.000 Tony's my boy, though.
01:18:59.000 So presumably he told you about that Chinese incident.
01:18:59.000 He's my best friend.
01:19:02.000 Oh, I was there.
01:19:03.000 I was there for the whole thing.
01:19:04.000 I was there for the whole thing.
01:19:05.000 Mind-blowing.
01:19:06.000 I was there for the whole thing.
01:19:07.000 What saved Tony was that he had a recording of the entire set, including the other guys.
01:19:11.000 That's right.
01:19:12.000 And that's what saved him.
01:19:14.000 And then it became a thing.
01:19:16.000 I mean, but Tony was going through it, man.
01:19:18.000 It was hard.
01:19:19.000 It was hard to watch a friend go through it like that.
01:19:22.000 But, you know, he took like a week off, maybe two weeks off.
01:19:26.000 And then I brought him on the road with me in Utah.
01:19:29.000 We did Salt Lake City, which is an awesome place.
01:19:33.000 Wise guys in Salt Lake City.
01:19:35.000 Shout out.
01:19:35.000 It's one of the best clubs in the world.
01:19:37.000 So we did this club, and the audience had no idea who was opening for me.
01:19:41.000 It's just me, right?
01:19:43.000 And when they announced Tony Hinchcliffe's name, everybody went crazy.
01:19:47.000 Oh, yeah.
01:19:48.000 It was awesome.
01:19:49.000 It was awesome.
01:19:50.000 People stood up, they clapped, they cheered.
01:19:53.000 And he went up and murdered.
01:19:54.000 He had so much material on that cancellation.
01:19:57.000 It was so good.
01:19:59.000 It was like an amazing moment for him because his comedy actually jumped up a notch.
01:20:04.000 He actually got funnier because of it.
01:20:06.000 He tightened his bits up and got more aggressive with it.
01:20:09.000 He was like, he got really dialed.
01:20:11.000 And he was worried his whole world was going away.
01:20:12.000 Same with Shane.
01:20:13.000 I mean, Shane came up on top of that so brilliantly.
01:20:17.000 Yeah, I just love people who... But for some reason, I just couldn't do it.
01:20:17.000 Exactly.
01:20:22.000 I think because you didn't have a comedy community, like a stand-up community.
01:20:26.000 And also mine was a different thing.
01:20:27.000 My job required a cast and a crew and producers and agents.
01:20:32.000 The UK arts sector is completely captured.
01:20:36.000 Well, it's the same as America.
01:20:37.000 I mean, America is a giant issue with left-wing politics in Hollywood.
01:20:41.000 Yeah.
01:20:42.000 You're either on the team or you're outside.
01:20:44.000 If you're outside, you don't work.
01:20:45.000 The only one who works is Chris Pratt.
01:20:48.000 Chris Pratt is like openly Christian, but he's such a nice guy.
01:20:51.000 You cannot fault him.
01:20:52.000 Sure.
01:20:53.000 He's such a good guy.
01:20:54.000 You can't say him being a Christian is an issue.
01:20:57.000 So strange that they're coming after Christians.
01:20:59.000 It's like, it's like, you know, real Christians are the nicest fucking people you ever meet.
01:21:04.000 Exactly.
01:21:05.000 They really follow Jesus' principles.
01:21:08.000 If you want to use it as a method, like, if you want results, their results are pretty fucking solid, man.
01:21:16.000 Real Christians really follow it.
01:21:19.000 Some of the nicest people.
01:21:20.000 And also, you know, you can disagree with them and not fall out with them.
01:21:23.000 Right.
01:21:23.000 Clint Eastwood's grandfathered in, because he's old as fuck.
01:21:26.000 Like, they still let him make movies.
01:21:27.000 Who?
01:21:28.000 Clint Eastwood.
01:21:29.000 Oh, Clint.
01:21:29.000 Jesus.
01:21:30.000 How dare you?
01:21:30.000 Yeah.
01:21:31.000 Clint.
01:21:31.000 I like his late stage, though.
01:21:33.000 His late stage of movies.
01:21:34.000 Yeah, like those kind of documentary-like films about ordinary people who get caught up in things.
01:21:38.000 He did one, Richard Jewell, I think it was called.
01:21:40.000 That was brilliant about a security man who found an explosive device at a gig.
01:21:47.000 And because he pointed it out, they charged him with it.
01:21:50.000 I remember that guy.
01:21:51.000 Yeah, that guy got smeared all over the news.
01:21:53.000 We all thought that he was the guy who bombed.
01:21:55.000 Yeah.
01:21:55.000 Yeah.
01:21:56.000 We thought he was the bomber.
01:21:57.000 In the film, he said he had one day of being a hero, and then it flipped to the bomb.
01:22:02.000 It's so scary when things like that happen to people because, you know, but Clint, he had this one film to me that is like the answer to the spaghetti westerns.
01:22:15.000 I think Unforgiven.
01:22:15.000 That's Unforgiven.
01:22:16.000 That's a great movie.
01:22:17.000 I think it's the best Western of all time because I think it's probably the most accurate about how fucked up life was back then.
01:22:23.000 Have you ever read Larry McMurtry?
01:22:25.000 He's a Texan author.
01:22:25.000 No.
01:22:26.000 You've got to read Lonesome Dove.
01:22:29.000 Oh, I started Lonesome Dove, but I got distracted.
01:22:32.000 There's something that happens about page 100, and you will not be able to put it down afterwards.
01:22:37.000 Yeah.
01:22:39.000 But those books, the Lonesome Dove series by Larry McMurtry, I watched every Western differently after that because I suddenly realized how brave these settlers were.
01:22:51.000 Oh, they were crazy.
01:22:53.000 And also, he writes about Indians and Native Americans brilliantly, just as well as he writes about everyone else, you know.
01:23:01.000 And they're great books.
01:23:03.000 They're great books.
01:23:04.000 You just cannot put them down.
01:23:05.000 800 pages, you'd read them in a week.
01:23:08.000 I went through a whole series of Wild West books that I read after I read Empire of the Summer Moon.
01:23:16.000 Oh, yeah.
01:23:17.000 No, not Empire of the Summer Moon.
01:23:18.000 No, I know which one you're talking about.
01:23:19.000 You're talking about the Cherokee.
01:23:21.000 God damn it.
01:23:22.000 How am I saying that wrong?
01:23:23.000 Is it Cheyenne?
01:23:24.000 No.
01:23:26.000 What were they?
01:23:27.000 Oh, I think I read a bit of them.
01:23:29.000 Comanches, that's right.
01:23:30.000 But what is the...
01:23:34.000 What's that?
01:23:35.000 It is Empire of the Summer Moon.
01:23:37.000 Why am I...
01:23:40.000 I'm a little movie afterwards.
01:23:43.000 But have they made Empire of the Summer Moon already?
01:23:45.000 They're working on it, I believe.
01:23:46.000 Yeah.
01:23:46.000 No, I was thinking of the Leonardo.
01:23:48.000 I was right.
01:23:48.000 So Empire of the Summer Moon, which is all about the Comanches and all about the settlers and all about people that got slaughtered.
01:23:54.000 And you realize how insane life was like.
01:23:57.000 And I think Eastwood was the, that was the closest, I think, to really getting it.
01:24:02.000 Because I think I love those old spaghetti westerns, don't get me wrong.
01:24:06.000 But they were this kind of like bullshitty 1970s version of the Wild West.
01:24:12.000 Clint Eastwood never got shot.
01:24:14.000 Yeah, yeah, yeah.
01:24:15.000 No, there's more grit in that one.
01:24:18.000 And it was real.
01:24:18.000 It was like, I believe it.
01:24:20.000 I believe that the sheriff was a coward.
01:24:22.000 Also, it's an interesting thing that you might, if you watch it again, you might notice, but they all know each other.
01:24:27.000 It's so weird.
01:24:28.000 They all go, do you know William Dency or whatever?
01:24:31.000 And Magonia, yeah, yeah, I met him down.
01:24:33.000 Because it's like, even though they're all spread out, they all know each other.
01:24:37.000 So I bet you that's based on research, you know?
01:24:39.000 I bet it is as well.
01:24:41.000 Yeah, well, they had to know each other.
01:24:43.000 There's a few fucking people and there's no laws.
01:24:47.000 You have to know who the dangerous people are.
01:24:49.000 Yeah, yeah, yeah.
01:24:50.000 William Munny.
01:24:51.000 Yeah, that was a great movie.
01:24:52.000 Oh, fucking good.
01:24:53.000 No, I love Clint.
01:24:54.000 But he's grandfathered in as a conservative.
01:24:56.000 John Voigt is kind of on the outs, but he was, you know.
01:24:59.000 But, you know, I remember I even kind of was worried about you in the early days.
01:25:03.000 Like, I used to be a bit of a left-wing twat myself.
01:25:09.000 You know, one of the most famous things I did was, do you remember the saluting pug guy who got prosecuted? Yeah, well, I kind of joined in on all that, and I'm deeply ashamed of it.
01:25:21.000 And I actually apologized to him; I did a video and apologized at a roast that he did.
01:25:27.000 Well, good for you.
01:25:28.000 Because, like, you know, I just, but I completely believe, this is what I found out.
01:25:34.000 I completely believed he was a fascist because that's what every publication was telling me.
01:25:39.000 Right.
01:25:39.000 And I was trusting all these people who eventually came out against me.
01:25:44.000 And now I look back at it and think I was just lied to consistently for all these years.
01:25:49.000 And it's meant that I have to readdress things that I've spent the last 25 years thinking about.
01:25:55.000 But I realize now I wasn't really thinking deeply enough about them, like climate change.
01:25:59.000 I was always so terrified of climate change, you know?
01:26:01.000 Yeah.
01:26:02.000 And, you know, I've had about 30 years of being terrified about climate change.
01:26:08.000 And nothing's happening.
01:26:09.000 Well, go back and watch that stupid fucking movie that Al Gore made.
01:26:14.000 Yeah.
01:26:15.000 That movie's so wrong in terms of his predictions.
01:26:18.000 It's, you know, in the benefits of hindsight looking back.
01:26:20.000 I was told 2019 we'd all be underwater.
01:26:23.000 Well, it's not just that.
01:26:24.000 Have you ever seen one of those aerial time-lapse photos of the coast?
01:26:27.000 No.
01:26:28.000 Yeah, the beach doesn't move.
01:26:29.000 It hasn't moved since the 1980s.
01:26:30.000 It's not going nowhere.
01:26:32.000 The reality is the fucking Earth climate has never been static.
01:26:36.000 It's never been flat.
01:26:38.000 It's never been super predictable.
01:26:40.000 It's always gone up and down.
01:26:42.000 And the reality of our life is, yeah, we pollute and it's terrible.
01:26:47.000 And we should definitely figure out better solutions.
01:26:50.000 The reality is all that stuff that we're putting in the air in terms of carbon dioxide, that is an issue.
01:26:55.000 But also particulates, also brake dust, also, when your tires wear out, where do you think they go?
01:27:03.000 Yeah.
01:27:03.000 They go in the fucking air.
01:27:05.000 Like that brilliant Bill Burr routine.
01:27:07.000 He says, when you throw away your Apple Charger, do you think it just mysteriously disappears?
01:27:15.000 No, it ends up in a fucking... yeah, or a shark starts wearing it as a scarf, you know?
01:27:20.000 That's what it is, you know?
01:27:21.000 Well, there's a lot of birds who eat bottle caps.
01:27:23.000 That's a real problem.
01:27:24.000 You ever seen those terrifying photos of dead birds with their stomachs, like, rotted out, and you see their stomachs full of bottle caps?
01:27:31.000 No.
01:27:32.000 Yeah, it's a real issue.
01:27:34.000 Yeah.
01:27:34.000 It's a real, that's our problem.
01:27:36.000 Our problem is pollution.
01:27:38.000 Our problem is waste.
01:27:39.000 Our problem is destroying of rivers and the ocean and oil spills.
01:27:45.000 These are all problems.
01:27:47.000 Climate change, this idea, the problem with it has become politicized.
01:27:52.000 And it's become a thing that you have to support, just like you have to support trans rights.
01:27:55.000 Just like you have to support this or that or whatever the fuck.
01:27:58.000 Get vaccinated.
01:27:59.000 It's all the same shit.
01:28:00.000 It's never been more difficult, it's never been made more frightening, to do your own thinking.
01:28:06.000 To actually say it is now a dangerous thing to do.
01:28:09.000 Well, you're literally told on mainstream news to not do it.
01:28:13.000 Don't do your own research.
01:28:14.000 Yeah, yeah, yeah.
01:28:15.000 Which is a crazy thing to tell people.
01:28:17.000 In the age of information, where you have more access to reality and truth than ever historically, by far, as human beings, don't do it.
01:28:27.000 Don't use it.
01:28:28.000 Don't utilize it.
01:28:29.000 Don't utilize the greatest tool that mankind has ever devised for figuring out what the fuck is going on.
01:28:35.000 Don't do it.
01:28:36.000 And you see the kind of mainstream news people in the UK, there's people who have podcasts like Alastair Campbell and Rory Stewart, and these ex-newsreaders who are on this program called The News Agents.
01:28:48.000 And their job is to deliberately not know things, right?
01:28:52.000 It's almost like, I mean, I think you got the same thing over here with CNN and stuff.
01:28:58.000 You know, their job is to just express confusion about it.
01:29:02.000 Why is everything so?
01:29:04.000 And they cannot actually address the issues because if they address the issues, they will start saying things that will get them cancelled.
01:29:11.000 And so what you have is a kind of a chewing gum for information in the UK.
01:29:18.000 It's not real.
01:29:19.000 It's not actually information.
01:29:21.000 It's just these middle-class people, very privileged people, gabbing away.
01:29:25.000 They all think there's no problem with men in women's spaces because they will never have to use a shelter.
01:29:32.000 They will never have to use a rape crisis center, God willing.
01:29:35.000 So these people who actually are affected by these issues are the last thing on their minds.
01:29:40.000 They are very comfortable middle-class people who do not give a damn about anyone outside of their experience.
01:29:46.000 No benefit to rocking the boat.
01:29:49.000 You have to realize what those people are are mouthpieces.
01:29:52.000 You're just a mouthpiece for an organization who occasionally, unfortunately for the network, gets to express their true opinions on things.
01:30:00.000 And usually you find out they're fucking dumb.
01:30:02.000 Yeah.
01:30:02.000 Right?
01:30:02.000 That happens all the time.
01:30:04.000 Yeah, yeah.
01:30:04.000 And the really smart ones wind up leaving and going somewhere else and starting podcasts.
01:30:09.000 But some of them it's not their fault.
01:30:10.000 Like the thing I was saying earlier about the chain of trust among doctors, WPATH breaking the chain of trust to such an extent has affected the whole world.
01:30:21.000 So now we can't trust our news media to tell us the truth.
01:30:24.000 They're telling us a man attacked some people when it was really a woman.
01:30:28.000 We can't trust doctors because doctors are telling these incredibly damaging things to kids.
01:30:34.000 Kid comes in with depression.
01:30:36.000 Oh, you're trans.
01:30:37.000 I'll tell you, here's a story.
01:30:39.000 I've got this.
01:30:39.000 This is a story that really illustrates the problem.
01:30:43.000 And it's another problem with the press as well.
01:30:45.000 But in New Zealand, there was a young girl.
01:30:47.000 She was 16 years old, I believe.
01:30:49.000 And she was autistic and anorexic.
01:30:53.000 And her parents were first-generation, I think, immigrants and had no idea what was going on, right?
01:31:00.000 She was suddenly saying she was a boy.
01:31:03.000 They didn't know what she was talking about, you know?
01:31:06.000 And so they didn't go with it.
01:31:09.000 And she left home, right?
01:31:10.000 Encouraged by what they call a glitter family, which is a bunch of gay or trans-identified people who love bomb her and tell her that she needs to get rid of her parents because her parents are bigots.
01:31:22.000 So she moves in with this glitter family.
01:31:24.000 They can't deal with her because she's autistic.
01:31:27.000 So within a couple of months, she's gone from the house.
01:31:30.000 There's nowhere to live.
01:31:31.000 Can't go back home.
01:31:32.000 Has told her parents she's bigoted.
01:31:33.000 They're bigots.
01:31:34.000 The New Zealand government, they meet her.
01:31:37.000 They don't diagnose her as autistic or as anorexic.
01:31:42.000 They diagnose her as trans.
01:31:44.000 They give her a hotel room.
01:31:46.000 She died in that hotel room.
01:31:48.000 She starved to death in the hotel room because the New Zealand government called her trans and forgot about a young, autistic, anorexic 16-year-old girl.
01:31:58.000 And at the end of it, she was just about making up with her parents.
01:32:02.000 And she phoned her mother.
01:32:04.000 Oh, sorry, excuse me.
01:32:05.000 That would go down well.
01:32:07.000 She phoned her mother a few times and they were just about to make it up.
01:32:12.000 And then she got a call from police.
01:32:15.000 And she found out that the daughter had been in the hotel right around the corner from where she lived.
01:32:19.000 So she left the house, went up into the room, and took a photograph of her daughter dead.
01:32:26.000 No one would tell the story.
01:32:28.000 It's taken a year and a half to get this story out in New Zealand media because no one cares.
01:32:33.000 No one cares.
01:32:34.000 It's not that no one cares.
01:32:35.000 It's a trans story.
01:32:36.000 So you just can't get it out.
01:32:38.000 And in every country, there's always one or two activists who do everything they can, like the journalist or the people involved with that story were attacked by, I can't remember his name, but there's an activist over there, a very vicious activist, as there is in Ireland and the UK and a few different places.
01:32:55.000 And they're always the parents of trans-identified kids.
01:32:58.000 You know, the worst people, the worst activists, the most violent activists, Helen Joyce is a brilliant Irish writer and she pointed this out.
01:33:07.000 They have done the worst thing that you can do to your kids.
01:33:10.000 They have confused their kids and sometimes they've actually encouraged their kids to take these hormones and to go through these procedures to be castrated, to have a double mastectomy.
01:33:20.000 They will never be able to accept what they've done to their kids.
01:33:23.000 And I know there was one Irish activist who's been planning on transing his kid for at least 12 years, right?
01:33:28.000 And now the kid is grown up.
01:33:30.000 Of course, the kid thinks they're a man because they've lived with this homophobic parent, you know, who basically doesn't like seeing gender non-conforming behaviors in their child.
01:33:43.000 So they say, oh, he's a woman or she's a man, you know?
01:33:47.000 So these are the people who are, you know, that guy in New Zealand, even when the story came out, he still tried to attack the people telling it.
01:33:57.000 Still, he put me in some sort of conspiracy of creating a bigger deal about it than it needed to be.
01:34:06.000 A young girl who died in a hotel room because they called her trans and just forgot about her.
01:34:11.000 Like, how did it take a year and a half for New Zealand media to report on that in a country that small?
01:34:17.000 You know?
01:34:18.000 And there's so many stories like this.
01:34:20.000 Well, I think the dam is breaking because I think there's been a lot of people that are fed up and they realize that there's irreversible harm being caused to people that are being tricked.
01:34:32.000 These stories of the detransitioners and them being attacked online for telling their story, which is true.
01:34:40.000 Yeah.
01:34:41.000 It's a true story about them being confused and being told that they were trans and going through these procedures and deeply regretting it.
01:34:49.000 Yeah.
01:34:49.000 And then going back to their biological sex, they get destroyed online.
01:34:53.000 They get treated worse than anyone else, you know, because they are living proof that it's not innate.
01:34:59.000 It makes no sense.
01:35:00.000 There's no compassion.
01:35:01.000 There's no kindness.
01:35:03.000 And I'll tell you what.
01:35:04.000 It's evil.
01:35:04.000 And also, I'll tell you what else.
01:35:07.000 I have an iPhone, right?
01:35:08.000 And whenever I write the word detransitioners on my iPhone, which I have cause to do quite a lot, it underlines it in red because it will not recognize the word exists.
01:35:18.000 But is it because it doesn't know the word yet?
01:35:20.000 Yeah, but what does that mean?
01:35:21.000 Because it does it with cunt, too.
01:35:23.000 Yeah, but detransitioners isn't a dirty word.
01:35:26.000 Actually, it doesn't do it with cunt anymore.
01:35:28.000 But they used to.
01:35:28.000 Right.
01:35:29.000 Well, detransitioners is a dirtier word than that.
01:35:31.000 Let me see if it works in America.
01:35:32.000 Do you have an American iPhone?
01:35:35.000 I'd be curious.
01:35:35.000 No.
01:35:36.000 I'm going to take it.
01:35:38.000 But people who want to can go away and try it.
01:35:41.000 Try writing detransitioners and see if it underlines it.
01:35:44.000 But, you know, and my point is that one of the...
01:35:51.000 Yeah.
01:35:51.000 Motherfuckers.
01:35:52.000 There you go.
01:35:54.000 Let's see if it gives me some suggestions.
01:35:57.000 No replacement found.
01:35:58.000 You have no idea what you're talking about.
01:36:00.000 Exactly.
01:36:00.000 It's playing dumb.
01:36:01.000 And that's what Wikipedia is the same.
01:36:04.000 Wikipedia is moderated by trans activists.
01:36:06.000 But that's kind of crazy.
01:36:08.000 I wish I had my Android phone here.
01:36:08.000 Yeah.
01:36:09.000 I would want to check that.
01:36:10.000 Yeah, that'd be interesting.
01:36:11.000 But Wikipedia has been, there was a war within Wikipedia, I believe, where all the trans ally moderators won.
01:36:20.000 And now Wikipedia, my Wikipedia pages basically have been vandalized for years, you know.
01:36:26.000 And if I complain to press authority that they call me anti-trans, they go, well, it says it on your Wikipedia.
01:36:32.000 And we've tried to change it, but it reverts back within 15 minutes every time.
01:36:37.000 So if anyone would like to do a class action suit against Wikipedia, I'd love to be involved if that ever happens.
01:36:43.000 Just for the record, cunt doesn't show up with a red line.
01:36:46.000 Okay, there you go.
01:36:47.000 Detransitioner does.
01:36:48.000 Cunt does not.
01:36:49.000 There you go.
01:36:50.000 There you go.
01:36:51.000 I wonder which it is. But my point, actually, sorry, the reason I bring it up is because a lot of tech guys are in tech because they're autistic, right?
01:37:03.000 And a lot of coders are, you know, you spend all your time writing in the dark and coding stuff.
01:37:09.000 Your interests go that way.
01:37:12.000 So the trans thing has come up a lot as well from kind of manipulation by these tech guys who are a lot of whom identify as trans.
01:37:21.000 So we're living in a world now where, like the underline detransitioner, they are controlling what we think is normal and what we think is unusual.
01:37:28.000 So now, when anyone writes down detransitioner, you doubt yourself because there's a red line underneath it.
01:37:35.000 Oh, it mustn't be a real word, or it's D and then a space maybe or whatever.
01:37:38.000 But no, you're right.
01:37:39.000 It's just these fucking tech guys are trying to confuse you, keep you unbalanced, you know, so that you can't discuss this issue.
01:37:47.000 That is a weird one because when did detransitioner first start being used?
01:37:52.000 When did the term I don't know?
01:37:54.000 I mean, you know, the term is clearly when you only use it with sex, right?
01:37:59.000 You're only using it with gender, whatever you want to say, sex or gender.
01:38:03.000 That's the only use of detransitioner, transitioning from something back to something else.
01:38:07.000 That's never been done before.
01:38:10.000 I don't remember ever using that term detransitioner in any other way.
01:38:14.000 That's interesting, possibly because, you know, the trans thing is not, as I say, it's not real.
01:38:19.000 It's an artificial human-invented thing.
01:38:22.000 Yes, and it kind of implies regret.
01:38:24.000 Yeah.
01:38:25.000 The detransitioner stories are never happy ones.
01:38:28.000 You know, the Southern Poverty Law Center in America.
01:38:31.000 Have you ever heard of them?
01:38:32.000 They have named Chloe Cole, you know, Chloe Cole, the detransitioner.
01:38:35.000 She's doing brilliant work.
01:38:37.000 They've named her as far right, simply because she's going around telling her story.
01:38:44.000 I think I'm far right too.
01:38:46.000 They say I'm far right too.
01:38:47.000 No, they say everyone's far right.
01:38:50.000 If you say anything that diverts from what a bunch of lunatics online have agreed is the truth, then you're far right.
01:38:57.000 Yeah.
01:38:58.000 Yeah, you can't.
01:39:00.000 Everyone's politically homeless.
01:39:01.000 We have to realize that.
01:39:02.000 Start a shelter.
01:39:03.000 Yeah.
01:39:04.000 Call it the center.
01:39:05.000 But I don't think the systems we have are going to last much longer.
01:39:11.000 I think AI is going to change everything.
01:39:13.000 I think you're right.
01:39:14.000 But I think reluctantly, because there's people that are in control of the system right now that are extracting enormous amounts of money, with just fill in the blank of all the different special interests that have a hand in how much money gets distributed this way and that way.
01:39:31.000 There's so much of that that really fuels the decisions that are being made in this country.
01:39:36.000 It's not really the will of the people.
01:39:37.000 It's not really trying to make America great.
01:39:41.000 I mean, yeah, it is.
01:39:42.000 But also, if you really wanted to do it, you wouldn't do it this way.
01:39:46.000 If that's what your main goal was, if your main goal was to make money and give the illusion that we're fixing all the problems that America has while fixing some of them, well, then that's what you're doing.
01:39:46.000 Yeah.
01:39:57.000 Because that's what your goal seems to be.
01:40:04.000 Like, what's the best way to distribute all these resources?
01:40:07.000 And is it really fair that this corporation gets to pollute the fucking ocean?
01:40:11.000 Like, let's figure out what's the right way to do this.
01:40:14.000 Yeah.
01:40:15.000 And that's going to be horrifying for anybody in any position of power.
01:40:19.000 It's like people say about your show.
01:40:21.000 You know, the reason Kamala didn't come on is simply because there's a certain breed of politician who are, I think, dying out, who are the kind of politicians that couldn't survive three hours talking to you, you know?
01:40:33.000 Right.
01:40:34.000 And they, I don't think they've long left because things like this are the way that people get their information now.
01:40:39.000 So, you know, you get these like, and it's the only reason why people, oh, I've got a good thing to tell you about AOC as well.
01:40:47.000 It's the only reason why people like AOC and people like this are able to continue spouting nonsense is because they don't go on shows like this.
01:40:56.000 Here's an interesting thing.
01:40:57.000 At one point, and I still haven't told you half the things I was going to tell you, but at one point I managed to stop the charity Mermaids from getting funding from the National Lottery in the UK.
01:41:15.000 Mermaids used to be a good organization.
01:41:18.000 Dysphoria was very rare and they treated it as you should, right?
01:41:23.000 With things like affirmation as a final step, not the first step, okay?
01:41:28.000 Surgery and drugs.
01:41:29.000 Final.
01:41:30.000 That's what it should always be.
01:41:31.000 And they were great.
01:41:32.000 And then a woman named Susie Green came on board.
01:41:38.000 And Susie Green.
01:41:39.000 Does Susie have a penis?
01:41:40.000 No, but her son did.
01:41:42.000 Her son did.
01:41:44.000 And Susie Green took over at Mermaids, transformed it into a mental institution.
01:41:50.000 And she took her son to Thailand on his 16th birthday to have him castrated.
01:41:58.000 You know?
01:42:00.000 And now, and this kid has been brought up this way since they were four or five years old, because there's a famous TED Talk, and this is hilarious.
01:42:08.000 You've got to see this TED Talk.
01:42:09.000 She does this TED Talk and she's got the TED Talk thing.
01:42:12.000 So she looks like an expert.
01:42:13.000 And she's doing the hand movements like they all do in TED Talks, right?
01:42:16.000 So you think this is someone who knows what they're talking about.
01:42:19.000 And she just admits that the kid liked playing with girls' toys.
01:42:23.000 The husband didn't like it.
01:42:26.000 She decided it was really a girl.
01:42:28.000 And that's the TED Talk.
01:42:30.000 There's no explanation of what trans is or anything like that.
01:42:34.000 Gay.
01:42:34.000 It just goes.
01:42:35.000 Yeah, exactly.
01:42:36.000 Well, this is the reality that they don't like.
01:42:38.000 The reality that they don't like is if you leave them alone and you don't encourage transition, the vast majority of them become gay men.
01:42:38.000 Yeah.
01:42:45.000 It's something like 60 to 90%.
01:42:47.000 It's like leave them alone and they'll be fine.
01:42:49.000 The fastest cure for dysphoria and teenagers is leave them alone, let them go through puberty.
01:42:54.000 Because puberty is like a wonderful flushing of all the things that make you uncertain about yourself.
01:43:02.000 To come to the end of puberty is to get rid of all that stuff.
01:43:06.000 It's still there in traces and so on.
01:43:08.000 But that's what puberty does.
01:43:10.000 It cures itself.
01:43:12.000 So these people are trying to fucking stop it, right?
01:43:15.000 Have you ever heard anything so demented?
01:43:16.000 It's like a James Bond villain.
01:43:18.000 Something so demented about stopping puberty, this brilliant process that turns you from a child into an adult.
01:43:24.000 And they're stopping it.
01:43:25.000 These people are dangerous.
01:43:26.000 And they sell you the line that the effects are reversible.
01:43:30.000 One person at the Tavistock, one doctor at the Tavistock said that she was sure one of the parents who came in was a paedophile and wanted to keep their child in a state of arrested development so they could abuse them longer.
01:43:46.000 You know, this is doctors at the Tavistock.
01:43:48.000 And again, this stuff is not well known.
01:43:50.000 If it was well known, it would be over, you know?
01:43:54.000 Yeah, our fucking news has failed us.
01:43:58.000 It really has.
01:43:58.000 It's allowed this to go on for 10 years.
01:44:02.000 Did you see Andy Ngo's investigation into Trantifa? Actually, Trantifa is a more general term, but there was an actual gang of trans-identified guys and women.
01:44:16.000 And they went to this guy's place. This guy was, I think, an old rancher or something.
01:44:25.000 And they blinded him in an attack, you know.
01:44:28.000 And then he went out to, he was going to testify at the trial.
01:44:32.000 And they killed him.
01:44:33.000 They killed him.
01:44:34.000 So what we have with this group, I can't remember what they were called.
01:44:37.000 I wish I could remember.
01:44:39.000 But what we have with this group is because no one has been doing their jobs, including the press and not talking about this properly, an entirely fake terrorist group that thinks it's a civil rights movement has formed out of nothing.
01:44:55.000 These are middle-class white guys, right?
01:44:57.000 Who normally would be, I don't know, going to gigs and stuff.
01:45:01.000 And this kind of violent civil rights group is formed for nothing.
01:45:05.000 It's not defending anyone.
01:45:08.000 It's not helping anyone.
01:45:09.000 It's simply there to get men into women's toilets, you know.
01:45:12.000 And it just, we've created it.
01:45:15.000 And now we have to deal with it.
01:45:17.000 You know, this kind of mirage of a civil rights movement, you know.
01:45:21.000 Sorry, I'm not really being very clear.
01:45:23.000 No, you are being clear.
01:45:25.000 Because there's so much to say.
01:45:25.000 It's just so disturbing.
01:45:26.000 Yeah.
01:45:27.000 But it's just you're sticking your neck out and talking about the worst aspects of this whole thing.
01:45:32.000 One of the things that used to happen to me was I would talk about very specific people.
01:45:36.000 Like, for instance, there was someone who worked in Stonewall who helped create their trans policies named Aimee Challenor.
01:45:46.000 And Challenor, it was discovered that Challenor was the son of a bloke who had tied up and tortured a little girl in the attic of their home.
01:45:57.000 He still continued to work with Stonewall.
01:46:02.000 He continued to use his father as an election agent.
01:46:05.000 And when he was reprimanded, they said he had absolutely no understanding of what was wrong, of what he had done wrong and so on, you know.
01:46:14.000 These are the, this person was central to the trans movement in the UK.
01:46:20.000 And there's so many examples of this.
01:46:22.000 There's a guy called Peter Tatchell.
01:46:23.000 Peter Tatchell has a long history of literally promoting paedophilia as something that's sometimes enjoyed by the child.
01:46:35.000 That's how he puts it.
01:46:36.000 There's a famous letter he wrote to the Guardian where he said, I have many friends who say they had experiences at ages ranging from nine to 13.
01:46:44.000 Where they had entirely good outcomes. What are you talking about?
01:46:49.000 Nine, nine is rape.
01:46:51.000 13 is rape.
01:46:52.000 What are you talking about?
01:46:54.000 It doesn't matter if they say they had good experiences.
01:46:56.000 They were groomed, you know.
01:46:58.000 And this man is a fairly significant figure in the UK.
01:47:03.000 You know, there's been no investigation into these writings.
01:47:08.000 No kind of talk about it.
01:47:09.000 It's, again, it's just completely ignored.
01:47:11.000 Because as I say, the UK is addicted to harming children, not talking about it, and only kind of saying, oh, how could that have happened, years later.
01:47:20.000 I just don't understand why there's not a larger population of people that have gotten completely fed up with this.
01:47:28.000 It's because the press sits on it.
01:47:30.000 Most people don't know what's happening.
01:47:32.000 Most people in the UK think I just went mad.
01:47:34.000 They think I went mad.
01:47:35.000 I started harassing women, some form of women, and blah, blah, blah.
01:47:40.000 And they believe all the, all the stories that have been re-shared and re-shared about me.
01:47:46.000 And, you know, and they just don't know.
01:47:48.000 Because, no, I've never, in the eight years I've been fighting this, the only time I appeared on, like, mainstream TV to talk about these issues, and I sent this to Jamie, was when they ambushed me.
01:47:59.000 And they just berated me for five minutes.
01:48:04.000 A journalist named Sarah Smith did the interview, who's now the head of BBC in North America, I should say.
01:48:08.000 And how did she berate you? What did she say?
01:48:11.000 Well, I was saying she just simply didn't believe it.
01:48:14.000 The funny thing was, some of her journalists were working on a story about the Tavistock at the time.
01:48:19.000 And I was just saying, you know, kids are being hurt.
01:48:22.000 Kids are being hurt in these gender clinics, and it should be stopped, you know.
01:48:25.000 And she was like, you're seriously saying that doctors are blah, blah, blah, blah, blah.
01:48:29.000 And she just couldn't.
01:48:30.000 It was so weird.
01:48:31.000 She just couldn't.
01:48:32.000 And the stuff I'd known to be true, that I'd been studying for years, she just wasn't across it, you know.
01:48:37.000 So, the journalists aren't across it, because it's not worth their while to be across it.
01:48:43.000 And they're actually slowing down the understanding of what this is about, you know.
01:48:49.000 Because they approach it so gingerly, and they don't offend anyone, and they don't want to be cancelled.
01:48:55.000 So, there's no conversation about it.
01:48:57.000 There's no conversation about what's really going on.
01:48:58.000 Well, the journalists have no security, because it's not, let's be honest, to be a talking head on the news, it's not that difficult to do.
01:49:04.000 And there's a lot of people out there that are handsome and beautiful, and they would take your role.
01:49:09.000 And so, you have a salary where you make X amount a year, and it's pretty nice.
01:49:13.000 I mean, you get to be the presenter on television.
01:49:15.000 They don't want to rock that fucking boat.
01:49:17.000 There's no benefit to rocking that boat.
01:49:20.000 It's the people who go the extra mile.
01:49:21.000 Like, you know, I can't bring my old friends who worked with me into this fight.
01:49:28.000 I can't say, look, you've got to stand up for me, you know.
01:49:30.000 But I have begged people to help, you know.
01:49:32.000 Did you go on any podcasts in the UK?
01:49:35.000 Not really, no.
01:49:37.000 Again, the online space, because it's so audience-facing, they would have just got tons of abuse for having me on, you know.
01:49:45.000 Even Trigonometry?
01:49:46.000 Oh, no, Trigonometry had me on, yeah.
01:49:48.000 But, like, in terms of, you know, yeah, there's a few, but I don't know.
01:49:48.000 Yeah.
01:49:54.000 It just never quite broke out.
01:49:55.000 In fact, to be honest with you, I've been thinking of this opportunity for the last eight years.
01:50:01.000 You come on here?
01:50:02.000 Yeah.
01:50:02.000 Why didn't you reach out earlier?
01:50:03.000 I'm an idiot, you know.
01:50:05.000 I just didn't think it would be, I thought you'd have a backlog of years and so on, you know.
01:50:10.000 Well, I do kind of, but I do it all based entirely on who I want to talk to.
01:50:15.000 Okay, cool.
01:50:16.000 That's the whole, from the beginning of the podcast, it's all, you know, this is an interesting story.
01:50:22.000 This guy's fascinating.
01:50:22.000 Yeah.
01:50:23.000 She's got a crazy thing that she studied.
01:50:25.000 Sure.
01:50:26.000 Like, let's talk.
01:50:27.000 Oh, and I should say, also, you know, while I was going through COVID and canceled, I did write this book, unpaid, because I was promised there'd be a back end.
01:50:38.000 And then the book came out, and all booksellers started hiding the book in bookshops.
01:50:45.000 So, if anyone wants to help out, my book is called Tough Crowd and is available on Amazon.
01:50:51.000 Did you do the audio?
01:50:52.000 Yeah.
01:50:53.000 The audio's good, actually.
01:50:54.000 I'm happy with the audio.
01:50:55.000 Yeah, yeah.
01:50:55.000 Oh, I'm sure.
01:50:56.000 You did it.
01:50:56.000 That's great.
01:50:57.000 That's very nice.
01:50:58.000 That's awesome.
01:50:58.000 Yeah, I'm always very happy when, especially, like, someone's telling a story about their own life.
01:51:03.000 No, I'm proud of it.
01:51:03.000 Yeah.
01:51:04.000 It was really good fun.
01:51:05.000 And, you know, quite interesting to read out your own story.
01:51:10.000 So weird, you know.
01:51:11.000 Yeah.
01:51:12.000 But... Have you thought about leaving the UK?
01:51:14.000 I've left.
01:51:15.000 You have left totally.
01:51:16.000 I've had to leave.
01:51:17.000 I'm on trial next month in the UK.
01:51:17.000 Really?
01:51:19.000 For what?
01:51:20.000 A trans activist has brought another complaint, has complained to the police about me again, you know?
01:51:24.000 And so I have to go back to the UK to be tried at the start of August.
01:51:30.000 Because the police are complicit with these guys, you know?
01:51:33.000 And what is this complaint about?
01:51:35.000 It's a complaint.
01:51:37.000 How much?
01:51:38.000 I can't really talk much about it.
01:51:40.000 Okay.
01:51:40.000 Because I can't.
01:51:41.000 My bail conditions are not to talk about it.
01:51:43.000 Oh, my God.
01:51:43.000 But I'll be talking about it when it's done.
01:51:45.000 I'll enjoy that, you know?
01:51:46.000 Because the whole situation is so hilarious.
01:51:50.000 When people find out what's really going on, they're going to be...
01:51:53.000 I can talk about...
01:51:54.000 There's one guy who's...
01:51:55.000 Oh, yeah.
01:51:55.000 There is a connection with the guy I was telling you at the start.
01:51:59.000 The very first guy who The Guardian wrote about, you know?
01:52:02.000 So it's like...
01:52:04.000 It's just basically a gang.
01:52:05.000 I compare...
01:52:05.000 It's been like being attacked by Batman villains for the last eight years, do you know what I mean?
01:52:09.000 The most bizarre group of people have been able to sue me. And there's one actor, a Scottish actor who destroyed a gay business, who's suing me at the moment. I've been in litigation for about eight years for one reason or another, because I don't have the money to sue people, so people are constantly suing me, reporting me to the police. I've been visited by the police three or four times.
01:52:39.000 You know, that was one of the things that really scared my wife when they sent the police to my house, you know.
01:52:44.000 And it was pressures like that that broke us up, you know, along with not having an income.
01:52:49.000 So, and now the people who did that to us make fun of me because I'm divorced, you know?
01:52:56.000 So it's like it's just a really evil bunch of like, you know, I don't, I guess it's a natural part of the internet in some way to have a hate figure, you know?
01:53:11.000 But by God, it should scare everybody how easy it is to become a hate figure when you've not done anything wrong, you know.
01:53:16.000 Yeah, and if you don't have a voice, how difficult it is to defend yourself.
01:53:22.000 And you know, and also.
01:53:23.000 And people are like shying away from your ability to defend yourself.
01:53:27.000 Yeah.
01:53:27.000 They don't want you on because they don't want to catch strays.
01:53:29.000 Yeah, yeah, yeah.
01:53:30.000 Like there's a guy who I first put on TV in the UK.
01:53:34.000 You may know him, Graham Norton.
01:53:35.000 Yeah, sure.
01:53:36.000 Yeah.
01:53:37.000 And like I gave him his first TV appearance, okay, in Father Ted.
01:53:42.000 And he went on TV and said something like, cancel culture didn't exist.
01:53:46.000 It's consequences culture.
01:53:49.000 And he knows very well what I've been through.
01:53:51.000 And he knows very well that I'm not saying anything bigoted.
01:53:54.000 And yet he still says this, as if I haven't had my life completely destroyed.
01:53:58.000 He knows that.
01:53:58.000 So it was in reference to you that he was saying that?
01:54:00.000 No, I don't think so.
01:54:01.000 I think he just, he just spouting the narrative.
01:54:04.000 He's spouting the narrative, you know?
01:54:06.000 Yeah, he's showing that he's a good boy.
01:54:07.000 Yeah.
01:54:08.000 He's going to follow the rules.
01:54:09.000 Exactly.
01:54:10.000 And that's what's scary.
01:54:11.000 A lot of people fall into that that you would hope wouldn't.
01:54:14.000 Well, that's what I find so inspiring.
01:54:15.000 That's one of the reasons why I came here.
01:54:17.000 I can't live in the UK anymore.
01:54:19.000 As far as I'm concerned, free speech does not exist in the UK.
01:54:22.000 Certainly as far as I'm concerned.
01:54:23.000 Well, we've highlighted on the show how many people have been arrested.
01:54:26.000 Like when Constantine from Trigonometry gave me those figures, I couldn't believe it.
01:54:32.000 It was mind-boggling.
01:54:33.000 The UK is involved.
01:54:34.000 Basically, the UK likes nothing better than sweeping stuff under the rug.
01:54:39.000 Well, not just that, but that's why Rotherham went on for 30 years.
01:54:43.000 What went on?
01:54:43.000 The Rotherham abuse scandal.
01:54:46.000 All these taxi drivers, Pakistani taxi drivers, were abusing kids over 30 years, like local girls, local white girls, doing the most appalling things to them.
01:54:58.000 And it was just everyone was scared of being a racist.
01:55:02.000 So they just let it go on.
01:55:03.000 And it's the same thing with this.
01:55:05.000 And, you know, I always think when we were talking about Oliver and Stewart earlier, like they said that puberty blockers were reversible about two years ago.
01:55:12.000 How many kids since then have gone on puberty blockers?
01:55:15.000 Partly because of what they said, you know?
01:55:17.000 It's like people have got to realize, you know, Graham Norton saying that about cancel culture, Jon Stewart saying that about puberty blockers.
01:55:25.000 Their words have consequences, you know?
01:55:28.000 And real people can be hurt.
01:55:28.000 Yeah.
01:55:30.000 You know, people just spout off stuff.
01:55:33.000 And not people that deserve it.
01:55:35.000 No.
01:55:35.000 And this is this thing, this casting of othering on people, like instantaneously.
01:55:42.000 I never had a problem.
01:55:44.000 I've never had anyone accusing me of anything on a set.
01:55:46.000 I always got on great with all my actors.
01:55:49.000 I knew all my crew's names.
01:55:50.000 That was one skill I always developed, because I'm terrible with names normally, but I learned every crew member's name.
01:55:56.000 I've not been in any way controversial until this.
01:56:00.000 And then the moment I started saying, hey, hang on a sec.
01:56:02.000 And all of my friends, you know, all the friends who betrayed me, the Hat Trick people and my friends on the Ted musical, all they had to say was, of course, women deserve fair sports.
01:56:13.000 Of course, women deserve single-sex spaces.
01:56:16.000 That's all they had to say.
01:56:17.000 And they can't do it.
01:56:18.000 They can't do it.
01:56:19.000 It's such an obvious moral thing.
01:56:22.000 Obvious.
01:56:23.000 And that's what's amazing about the power of cults.
01:56:26.000 And people do not like to think that they're in a cult.
01:56:30.000 But if you have no room for objectivity and logic and you hold things as doctrine that don't make sense, they could be easily argued against.
01:56:40.000 And if you get very violent when someone argues against that, you're in a cult.
01:56:44.000 Yeah.
01:56:45.000 But also, there's people who don't get violent, but they're also in a cult, but they don't realize it.
01:56:49.000 Because when I was talking about the chain of trust, right?
01:56:53.000 If you're a newsreader and you get a piece of copy that talks about a newborn baby and uses the words assigned at birth, their sex is, the sex they were assigned at birth, right?
01:57:04.000 That's ideological language.
01:57:05.000 Your sex is not assigned at birth.
01:57:07.000 Your sex is observed in utero, usually, a few months before birth, okay?
01:57:13.000 So assigned at birth is ideological language.
01:57:16.000 So what you have is a lot of people now using the word assigned at birth.
01:57:19.000 They don't know why.
01:57:20.000 It's because that was decided to be the correct terminology.
01:57:23.000 And it is nonsense.
01:57:25.000 It's full nonsense.
01:57:27.000 Biological nonsense.
01:57:29.000 We've known it forever.
01:57:30.000 You could check chromosomes.
01:57:32.000 Yeah.
01:57:33.000 It's not hard.
01:57:35.000 But it's like, you know, we've become hypnotized by appearances to such an extent that, you know, there's a very funny thing.
01:57:46.000 I mean, appearances of like virtue and avatars.
01:57:50.000 As a human.
01:57:51.000 Yeah, I mean, actual avatars.
01:57:52.000 It's the idea that you're a woman if you look like one.
01:57:55.000 It's like, when did that come from?
01:57:58.000 And especially in a world where people can use filters and all sorts of things.
01:57:58.000 Where did that come from?
01:58:04.000 The guy who came after me in the UK, his latest female name is the third or fourth name he's had.
01:58:14.000 It's just he hit on one that he could report people to the police if they said, hey, hang on a sec, you owe me money from three months' rent.
01:58:21.000 He would report them for anti-trans harassment.
01:58:24.000 Oh, my God.
01:58:25.000 And so, you know, it's...
01:58:25.000 You know?
01:58:29.000 Yeah, and it's just appearances.
01:58:30.000 If you're a dude and you change your name more than once, even if you change your name once.
01:58:34.000 Yeah.
01:58:34.000 Unless you're a boy named Sue.
01:58:36.000 The whole thing.
01:58:36.000 I need to know why you changed your fucking name.
01:58:39.000 Exactly.
01:58:40.000 Hang on a sec.
01:58:41.000 If these people are becoming who they really are, why are they cutting off pieces of themselves?
01:58:46.000 That's not them.
01:58:47.000 That's someone else.
01:58:47.000 It's like, if they're being who they really are, why are you changing your name?
01:58:54.000 Right.
01:58:55.000 Well, this idea of affirming, like gender-affirming care.
01:59:00.000 Yeah.
01:59:00.000 You know, to call hormone replacements, castration.
01:59:04.000 The craziest one is the penis that they make out of the leg muscle.
01:59:08.000 Oh, I didn't know that one.
01:59:09.000 Oh.
01:59:11.000 Yeah.
01:59:12.000 It's horrific.
01:59:12.000 Dude.
01:59:12.000 Yeah.
01:59:14.000 Shane Gillis one night in the green room of the mothership started pulling out all these pictures.
01:59:19.000 Oh, yeah.
01:59:20.000 No, I never watched.
01:59:20.000 I never look at pictures of it.
01:59:22.000 Because of the internet now, we have a lot of data on what this is.
01:59:26.000 And they do it so they can stand up peeing.
01:59:29.000 That's it.
01:59:30.000 You can't.
01:59:30.000 Oh, the freaking fake penis thing.
01:59:34.000 Exactly.
01:59:34.000 Phalloplasty.
01:59:35.000 Do you know what the worst I saw was?
01:59:36.000 What?
01:59:37.000 Okay.
01:59:37.000 We've noticed that when you see these girls, they often take photographs just after the double mastectomy, okay?
01:59:43.000 Yeah.
01:59:43.000 And they're smiling.
01:59:45.000 And they look delirious with happiness because this thing they've been bargaining with their parents for for years, they finally got it.
01:59:52.000 They've got these terrible looking, you know, wounds over their breasts and so on.
01:59:58.000 And they're just kind of, you know, they're still out of their mind with the drugs that they've taken to go on them.
02:00:07.000 Oh, fuck, I forgot why I brought this up.
02:00:09.000 What were you talking about just before that?
02:00:11.000 Fake penises.
02:00:12.000 Oh, yeah, yeah, yeah.
02:00:13.000 Okay, and we've noticed in these photographs that a lot of the girls have self-harming scars all over their bodies, okay?
02:00:21.000 And this fucking double mastectomy, this unnecessary double mastectomy, is the latest one these bastard doctors just carried out on her.
02:00:30.000 Anyway, I saw a penis that had been, you know, the way they take the flesh off the arm or the leg, yeah.
02:00:37.000 Or the leg?
02:00:38.000 Yeah, I saw a penis that had been made out of the arm of one of these girls, and these self-harming scars were all over the penis.
02:00:44.000 Oh, God.
02:00:47.000 You know, did you know the first vaginoplasty was carried out by a Nazi?
02:00:52.000 I got into trouble because a magazine in the UK misreported something I'd said and said that I was comparing trans activists to Nazis.
02:01:05.000 And, you know, when did this happen?
02:01:07.000 That the Nazis did that.
02:01:09.000 Oh, no, what happened was, I think his name was Erwin Gohrbandt, and he was working in some clinic before the war, and then he went on to join the Luftwaffe, and then he became, he was one of the scientists who tortured people at Belsen.
02:01:24.000 That's the first person who did a vaginoplasty.
02:01:27.000 Jesus Christ.
02:01:28.000 So these are literally Nazi experiments that are now, you know, that we're now arguing that kids should do.
02:01:38.000 It's a crap.
02:01:38.000 I kind of thought you would make that face a few times in this conversation.
02:01:44.000 But, you know, yeah.
02:01:45.000 It's just, and the thing is, I went down a rabbit hole one night when I was watching people talk about their dilations, how they have to keep something in there to keep the wound from closing up.
02:01:54.000 It's a Cronenberg movie.
02:01:56.000 You know, Cronenberg?
02:01:57.000 David Cronenberg?
02:01:58.000 It's just like that.
02:01:59.000 It's horrific.
02:02:00.000 It's just, it's wild.
02:02:03.000 What's this?
02:02:04.000 The Institute for Sexual Research served as the world's first trans clinic.
02:02:08.000 By 1930, it had performed its first modern gender-affirming surgery.
02:02:10.000 So it was 1930.
02:02:12.000 Yeah, but I don't think that's the place that's being that I'm talking about.
02:02:17.000 Yeah, no, that's Magnus Hirschfeld.
02:02:20.000 This is a big thing that trans activists say that there was all this, the Nazis destroyed all this trans history.
02:02:25.000 It's not true.
02:02:27.000 They went after him because he was a Jew.
02:02:29.000 That's why they went after him.
02:02:30.000 The idea of the first trans clinic, if you look at the first few pages of Google search results, if you put in anything about trans, you'll get three or four pages of absolute nonsense, you know. Which reminds me of another thing, Fred Sargeant, I'm just remembering all the stuff I wanted to say to you.
02:02:47.000 Fred Sargeant, you've got to try and get him on.
02:02:50.000 He's getting on.
02:02:51.000 And he was like, he's a gay guy who was there every single night of the Stonewall riots, you know.
02:02:57.000 And he is still on Twitter, still fighting.
02:03:01.000 He's fucking great.
02:03:02.000 He went on to, he arranged the first Pride march in New York.
02:03:07.000 He was highly instrumental in gay rights in the U.S. and in winning civil rights for gay people in the U.S. He's had to watch as these trans activists just make up lies about any transvestite who happened to be in the area.
02:03:22.000 There's one guy, I've forgotten his name, Marsha, Marsha something.
02:03:27.000 And I'm sure you've heard all these things.
02:03:30.000 Again, it's another one of these kind of thought-terminating clichés that, you know, trans people won you your rights and trans people threw the first brick at Stonewall.
02:03:42.000 It's bollocks, you know, like the Marsha person they're talking of was a purely peripheral character.
02:03:47.000 And the person who really kicked it off, who has been sort of gradually erased, was a lesbian whose name, again, is being erased from my memory because I haven't used it in so long.
02:04:00.000 But it was a lesbian who shouted out to the crowd as she was being forced into a police car, why aren't you guys doing anything?
02:04:07.000 And that's what kicked off the riot.
02:04:09.000 And that history and that contribution by lesbians and gay men is being erased by trans rights activists in front of Fred's eyes.
02:04:18.000 So Fred is on Twitter saying, no, that's not true, just over and over again.
02:04:21.000 You know, he lived it.
02:04:23.000 He was right at the center of gay civil rights in the U.S. And in this last few years, he has to watch these idiots pick apart and lie about the true history of gay rights.
02:04:34.000 Well, the thing is, we're dealing with an oppressive hierarchy.
02:04:39.000 And if you're more oppressed than anybody, you take the supreme position.
02:04:45.000 You're the person to be cherished.
02:04:47.000 And anybody underneath you, regular gay people, are below trans people in the oppression hierarchy.
02:04:54.000 Well, the trans people are the apex.
02:04:56.000 They're a sacred class, a sacred class.
02:04:56.000 Oh, yeah.
02:04:58.000 Yes, and that's what's weird because that is kind of homophobic.
02:05:04.000 And also, one of the things we learned with Ireland, right, and the Catholic Church scandal is no sacred classes.
02:05:12.000 There should be no sacred classes because sacred classes have a lot of power and power is misused all the time.
02:05:19.000 Always.
02:05:20.000 So the idea of a sacred, like, I don't know whether you notice, but girl guiding in the UK.
02:05:20.000 Yeah.
02:05:26.000 What does that mean?
02:05:27.000 You know, the scouts?
02:05:28.000 Yes.
02:05:28.000 Yes.
02:05:28.000 Girl guiding.
02:05:30.000 They have a rule, very strict rules for men, okay?
02:05:33.000 Men cannot stay overnight in a tent with girls, go on trips, all this sort of stuff.
02:05:41.000 There's all sorts of very sensible rules to do with men in girl guiding because you're dealing with kids, okay?
02:05:47.000 Young girls.
02:05:48.000 Those rules, they don't just soften for trans-identified men.
02:05:54.000 They disappear.
02:05:55.000 So if a man says that he's a woman as a transvestite, all those rules that are so important for protecting kids disappear.
02:06:06.000 And trans-identified men can now take girl guiding groups out on trips without any of the things.
02:06:12.000 In the woods.
02:06:13.000 I don't know where they go, but yeah.
02:06:15.000 I imagine if you're camping.
02:06:16.000 Yeah, yeah, yeah, yeah.
02:06:18.000 Yeah.
02:06:19.000 But, you know, what's insane about it is because, you know, people who are sane aren't being interviewed, are losing their jobs, losing their voices, no one can raise the alarm about this. And the vast...
02:06:42.000 Yeah.
02:06:43.000 The vast, vast majority.
02:06:45.000 Exactly.
02:06:45.000 It's a very small and very, very vocal minority.
02:06:49.000 Yeah, but that's the thing.
02:06:50.000 The internet makes giants out of, you know.
02:06:53.000 But also, people go along with it knowing it's a part of the ideology that they adhere to.
02:06:57.000 That's all it is.
02:06:58.000 They get locked into, I'm a left-wing progressive person, so therefore I support that.
02:07:03.000 Yeah, and I'm still astonished that tribalism can go to such an extent that you would harm children.
02:07:10.000 I find that extraordinary.
02:07:11.000 It is extraordinary, but it just shows you how powerful it is, how powerful tribalism really is and how people will justify the most horrific of things if it fits into this narrative that they have decided is a part of their identity.
02:07:23.000 Yeah.
02:07:23.000 And a lot of people do that, man.
02:07:25.000 There's a giant percentage of people that their political identity, which includes that, if you're progressive, it includes all this trans stuff.
02:07:33.000 Your political identity is more important than your family.
02:07:36.000 It's more important than anything.
02:07:37.000 People ostracize family members if they don't agree with the idea.
02:07:41.000 Isn't that interesting?
02:07:42.000 That's a big thing.
02:07:44.000 Another story for you.
02:07:45.000 I met this journalist.
02:07:48.000 I think she's also a therapist, but her name's Tina Traster.
02:07:51.000 And she wrote a piece for Psychology Today.
02:07:54.000 She'd already written some articles in the past for it.
02:07:56.000 And she wrote a new one.
02:07:58.000 And this is like, you know, established.
02:07:59.000 magazine Psychology Today.
02:08:01.000 And she wrote a piece about how trans-identified kids were becoming homeless, but they weren't homeless because their parents had rejected them.
02:08:11.000 They were homeless because they'd rejected their parents.
02:08:14.000 Their parents had misgendered them one too many times or couldn't really take it seriously or whatever it happens to be.
02:08:19.000 These kids leave, you know?
02:08:21.000 These kids leave and they go off and they, you know, they become homeless or whatever it happens to be or they move in with glitter families or whatever it happens to be.
02:08:29.000 But they lose contact with their parents.
02:08:30.000 And she said, well, this is usually the choice of the trans-identified child.
02:08:35.000 The piece was taken down the next night and all her previous pieces were removed because she wrote that.
02:08:42.000 Oh my God.
02:08:43.000 You know?
02:08:46.000 And there's so many people I know in this.
02:08:48.000 And it happened to, like, a wonderful woman named Sasha White, who was in publishing, and she lost her publishing career and is now the same as me.
02:08:56.000 I had to go back to journalism.
02:08:57.000 And I should plug it.
02:08:59.000 I have a website called the Glinner Update.
02:09:01.000 And my website, along with a few others, Reduxx and a few others, are the only websites cataloging all this insanity for the last eight years.
02:09:11.000 We're the only ones doing it.
02:09:12.000 No one else is covering it.
02:09:14.000 So I think that when people tally up the score at the end of it, you will be surprised at how many people have a similar story to mine.
02:09:23.000 I mean, I was cancelled, but at least I had something of a name.
02:09:28.000 But I know so many people, police women who've lost their jobs, prison wardens, people in all walks of life, ordinary people who are constantly running afoul of these lunatics because they're not being backed up by the adults in the room.
02:09:43.000 It's so bad that in the United States there's men who identify as women with fully functional penises that are locked up in women's prisons.
02:09:50.000 Yeah.
02:09:51.000 It's like, and I saw Joyce Carol Oates today, who's a writer, suspense writer, and she occasionally dips her toe into this subject just to show how little she knows about it.
02:10:03.000 And then she doesn't talk about it again for a few months.
02:10:06.000 But how did someone put it?
02:10:10.000 They said something like, someone joked about her.
02:10:14.000 She refused to believe that there were men in women's prisons.
02:10:16.000 How many are there?
02:10:17.000 Hundreds is the answer.
02:10:19.000 But you give all the answers, they just ignore them.
02:10:21.000 They never take it on.
02:10:22.000 They never take it on board.
02:10:24.000 But like, these policies are so bizarre that the people supporting them don't even believe what they're supporting.
02:10:34.000 Does that make sense?
02:10:35.000 Yeah.
02:10:36.000 She thinks that there's no issue, that it's not a problem, that trans people are all great because she has one lovely trans friend who's never been rude to her, right?
02:10:45.000 But the truth is that there's so many different types of people in the world.
02:10:49.000 And again, if you move the line for criminals, do you not think criminals will take that opportunity?
02:10:55.000 I read an interesting thing about when cops are looking for someone.
02:11:01.000 If they hear, for instance, they're parked at a supermarket, their target is parked at a supermarket, the first thing they do is they drive around all the disabled spaces.
02:11:10.000 Because criminals love parking in disabled spaces, apparently, because they think, well, that's just for the ordinary guy.
02:11:17.000 That doesn't count for me.
02:11:18.000 So the first thing they do is they check out the cars in the disabled spaces to see if it matches their thing.
02:11:23.000 That's what the criminal mindset is.
02:11:25.000 So if you suddenly say, well, now our sex is actually internal and it doesn't depend on appearance and so on, you don't think criminals are going to go, hang on a sec, that means I could just waltz into place X and have a look at the women in there and no one will throw me out.
02:11:40.000 And it's true.
02:11:41.000 It's true.
02:11:42.000 That is the situation.
02:11:43.000 And progressives will defend it online.
02:11:46.000 Yeah.
02:11:47.000 I've seen videos of women screaming at men to get out of women's rooms.
02:11:51.000 They'll call that woman a bigot.
02:11:52.000 Yeah, yeah.
02:11:53.000 The famous Wi-Spa incident, which The Guardian misreported three times, calling it a hoax.
02:11:59.000 And it was a sex offender named, I can't remember, Darren something, but it was a sex offender who was getting his dick out in front of these women and kids because it was like an open thing for women and kids, you know?
02:12:09.000 And they'll defend it.
02:12:10.000 And they did defend it.
02:12:11.000 That guy with the mustache, do you remember him?
02:12:13.000 And I think you're being a bit of a bigot.
02:12:15.000 I think he deserves it.
02:12:16.000 You know, how do they even know?
02:12:18.000 Maybe it's not, maybe he's not saying he's trans, you know, but they will automatically defend a fucking sex offender over a woman complaining about it.
02:12:27.000 Which is wild.
02:12:28.000 A woman with a child.
02:12:29.000 It's wild.
02:12:30.000 Yeah.
02:12:30.000 It really is.
02:12:31.000 It just shows you how much gets thrown out the window and how much rational thinking and how much logic and how much reason and how much objectivity.
02:12:40.000 It all gets thrown out the window if it doesn't align with your ideology.
02:12:43.000 Yeah.
02:12:44.000 That's what's so bizarre about, you know, again, what Elon calls the woke mind virus.
02:12:49.000 I think that's the right term for it because it is like a mind virus.
02:12:53.000 Oh, yeah.
02:12:53.000 Just like a virus that fucks up your computer.
02:12:55.000 It fucks up people's minds.
02:12:57.000 Yeah, yeah.
02:13:00.000 It's wreaking havoc.
02:13:01.000 And people won't realize it until, I would say, maybe 10 years from now, when the generation of kids who've been taught this since they were very, very young... There's one comedian in LA who said that his child announced he was the opposite sex at four, right?
02:13:22.000 And he's now got a second trans child.
02:13:24.000 Okay?
02:13:25.000 So that's abuse.
02:13:28.000 That's abuse.
02:13:29.000 Okay?
02:13:30.000 Let's call it what it is.
02:13:31.000 It's abuse.
02:13:32.000 If you confuse your child, if you spend years convincing them they're the opposite sex, you know?
02:13:38.000 What is that?
02:13:39.000 That's psychological abuse.
02:13:42.000 It's a virtue flag that they love to fly by saying that they have trans kids.
02:13:47.000 When you look at the percentage of people in Hollywood that have trans kids, it's off the charts.
02:13:52.000 Oh, did you see the Cynthia Nixon video?
02:13:54.000 Oh, I did, yeah.
02:13:55.000 My child is trans.
02:13:56.000 Their friends are trans.
02:13:58.000 She doesn't seem to put it together with double mastectomies and all the other horrors to do with this.
02:14:04.000 They live in this very bizarre ideological bubble where you're allowed to think about things in a very narrow scope.
02:14:10.000 Yeah, yeah.
02:14:11.000 And it's very strange.
02:14:12.000 It's very strange that this thing that was very, very unusual in the past is now very prominent.
02:14:20.000 Yeah.
02:14:20.000 And the worst thing that happened to it was they gave it a label.
02:14:23.000 It's like if they called anorexia something more attractive, you know, if they called it something like, well, I think they do actually on some forums, pro-ana, I think they call it, where people get online to discuss the way they avoid eating properly and the way they regurgitate food and so on.
02:14:41.000 It's what they call a community, right?
02:14:44.000 But it's not a good community.
02:14:45.000 Well, there's a lot of bad communities.
02:14:46.000 There's communities of minor attracted persons online.
02:14:49.000 Yeah, exactly.
02:14:50.000 And that's another thing that's kind of gradually getting chipped away at as well with all this stuff.
02:14:54.000 Because if a child can decide at 9 or 13 or whatever it happens to be that they're trans and thereby lose their future fertility and so on, then what other decisions can a child make, you know?
02:15:09.000 And of course, those children aren't making those decisions.
02:15:12.000 The parents are making the decisions for them.
02:15:15.000 I mean, the reality of human beings is that we're very malleable to culture, very malleable to the whims of society.
02:15:23.000 There's a great reason for that, though.
02:15:24.000 Oh, sorry, I don't mean to come across it.
02:15:26.000 Well, you know, we're a very imitative species.
02:15:29.000 And part of the reason why the human race has survived to the extent that it has and thrived to the extent that it has is because we are imitative.
02:15:36.000 So when, for instance, we started moving into the Ecuadorian forest, we started making blowpipes and canoes and stuff like this.
02:15:47.000 It was passed on, you know.
02:15:49.000 Now, that's a wonderful part of human survivability and evolution.
02:15:53.000 But what happens when the internet gets involved, right?
02:15:57.000 And you get that iterative or not iterative, mimetic kind of behavior that human beings are so prone to.
02:16:05.000 Of course, it's going to lead to something like the trans movement, you know?
02:16:08.000 Because what are you promising with trans?
02:16:11.000 If someone decides they're trans, what are you promising?
02:16:13.000 You're promising to be love-bombed by all your friends, you know, to be praised to the hilt in the press and by, you know, women as well.
02:16:24.000 Women fawn over trans-identified men.
02:16:27.000 Unfortunately, a lot of this is driven by women, you know.
02:16:29.000 Women are the ones fighting it.
02:16:31.000 But also, there's a lot of women involved in pushing it as well.
02:16:34.000 It almost seems to be like a kind of self-sacrificing, I'm so good that I don't mind if men come into my spaces, you know?
02:16:42.000 And they don't seem to realize that.
02:16:43.000 Yeah, you can agree for that for yourself, but you can't agree for that for everyone else, you know?
02:16:48.000 So, apparently, you know, when we're talking about the mimetic quality that the human race has, there's also a quality, apparently, that women have where they will tend to go along with the majority viewpoint, whatever the majority viewpoint is.
02:17:06.000 And the reasons for that are quite understandable.
02:17:09.000 You know, you have a part of the human race who is smaller, weaker, you know, they have to be more amenable.
02:17:16.000 They have to be more accommodating, you know.
02:17:21.000 And that empathy is being weaponized against them, you know?
02:17:26.000 It's the same with gay people.
02:17:27.000 Like, if you, I'm a great believer in what they call queer spaces because I think of things like Warhol's Factory and John Waters films, like, you know, his kind of trashy movies, John Waters.
02:17:43.000 He made, he's a gay guy who made all these films in Baltimore.
02:17:46.000 He made, he made more, what did he make?
02:17:47.000 Hairspray.
02:17:48.000 That was his most famous.
02:17:49.000 But in the early days, he had a film set that was just filled with outcasts and criminals.
02:17:54.000 And, you know, he'd have the craziest people carrying the equipment and stuff like this.
02:17:59.000 It was a welcoming space for outsiders.
02:18:03.000 And that's what a lot of gay spaces are like.
02:18:05.000 Okay.
02:18:06.000 And that's why there's a lot of sympathy for trans-identified people.
02:18:10.000 But again, that sympathy and that inclusion is being weaponized against the people who are, and it's destroying these spaces, you know.
02:18:18.000 One of the first things I heard was a young woman who wrote to me.
02:18:22.000 She said she was in a gay bar with a trans woman who she'd known for ages.
02:18:27.000 She'd considered him, her a friend, him a friend.
02:18:33.000 And this, she was a lesbian, and he said to her, Would you ever consider a relationship with me?
02:18:39.000 And she said, no, sorry, I'm only interested in, you know, female people.
02:18:43.000 He slapped her across the face and walked out.
02:18:46.000 That's the level of entitlement these straight men have to lesbians.
02:18:51.000 He slapped her across the face in a gay club.
02:18:54.000 And if anyone had found out the reason, she would have been thrown out.
02:18:58.000 Well, they don't, there's a thing that's going on, too.
02:19:00.000 They don't think that it's a man attacking a woman.
02:19:03.000 They think it's a woman attacking a woman.
02:19:05.000 It's insane.
02:19:07.000 And it's part of the reason why I've been so because when you put the word woman on anything, it sounds like I've been harassing women.
02:19:13.000 Right.
02:19:14.000 Right, right.
02:19:14.000 It's a weird distortion of the truth.
02:19:17.000 Yeah.
02:19:18.000 It's just weird that it's so accepted.
02:19:19.000 It's weird that it's not pushed back against in this day and age where so many people have a voice.
02:19:24.000 And again, the chain of trust is broken, you know, because like, comedians can't joke about it, you know.
02:19:33.000 Yeah, but I don't see it a lot, even over here.
02:19:35.000 I'm at the show tonight.
02:19:36.000 Oh, well, if there's a show, I'll come.
02:19:39.000 But the thing that I tend to see is there's a sort of, I saw Anthony Jeselnik doing this.
02:19:45.000 He said something like, we know so much more about trans people now.
02:19:48.000 I was like, no, you don't.
02:19:50.000 You know, you know just as much as you did 10 years ago, even less, because there really is no such thing.
02:19:59.000 It's a non-stable category that's been applied to everyone from fucking criminals who are trying to get an easy time in prison to young girls cutting their breasts off.
02:20:07.000 Nothing connects these people.
02:20:10.000 Nothing connects them.
02:20:11.000 So comedians are still on unsteady ground.
02:20:13.000 They don't really know how to talk about it, you know, because you're like everyone else, we're all slightly bamboozled by all the language and so on, you know?
02:20:21.000 So I find that even in America, I find that the trans issue doesn't come up a lot, but maybe if some.
02:20:26.000 It comes up.
02:20:26.000 Right.
02:20:27.000 Yeah, it comes up a lot more in America, I think.
02:20:29.000 Well, we don't have to worry about going to jail.
02:20:32.000 You know, the reality is, you people are getting thrown in jail for Facebook posts.
02:20:37.000 No, it's true.
02:20:37.000 Yeah.
02:20:38.000 And not a small amount.
02:20:39.000 No.
02:20:39.000 Thousands every year.
02:20:41.000 And it's so selective.
02:20:41.000 Yeah.
02:20:43.000 You know, it's so selective.
02:20:46.000 What do you think is the ultimate goal of this?
02:20:48.000 Don't you think people that are reasonable and sensible are just going to bail out of the UK and leave it a mental institution?
02:20:53.000 I don't know.
02:20:54.000 I mean, I had to.
02:20:55.000 I had to.
02:20:57.000 The last few months I was in the UK, I felt so paranoid and afraid because I just thought, I barely exist as a person here.
02:21:05.000 I exist only to get sued and for the police to visit me.
02:21:11.000 You have to go over there to deal with the lawsuit.
02:21:12.000 What if you never want to go back?
02:21:14.000 Well, you know.
02:21:15.000 Can you say fuck that place?
02:21:16.000 No, because I tell you what, it's such, I can't talk much about the case, but it's not going to go well for the police.
02:21:23.000 And I'm hoping that it will open up a lot of people.
02:21:26.000 How good is your legal system over there?
02:21:28.000 Is it rigged?
02:21:29.000 Well, this guy, as I say, this kind of guy who's a sex offender, he has been able to use the legal system to harass his enemies for about eight years, and no one seems to be able to stop him.
02:21:41.000 So it is what it is.
02:21:44.000 But again, it's because of the empty chair at the top, because Keir Starmer is such a coward, and his Labour MPs, like one MP, David Lammy, who's now the foreign secretary, he thought men could grow a cervix.
02:21:59.000 There was another.
02:22:00.000 How did he think that was going to happen?
02:22:02.000 He said, I don't know much about it, but my understanding is trans women can grow a cervix.
02:22:06.000 What?
02:22:08.000 I know.
02:22:10.000 Boy, that would be an interesting paper that someone would write.
02:22:15.000 You know what I mean?
02:22:16.000 Like, imagine if that could actually be done, which is probably going to happen within our lifetime, maybe, but surely in the next hundred years, they're going to be able to artificially manipulate a person and actually turn a man into a woman.
02:22:29.000 Yeah.
02:22:29.000 Probably.
02:22:29.000 Yeah, well, I mean— You know, you can't go back.
02:22:35.000 No detransitioners, yeah.
02:22:36.000 But that's the position a lot of these detransitioners are in.
02:22:40.000 They're all, you know, they've been castrated, you know.
02:22:43.000 Richie Tulip, who said it was almost like a dream, he was just being, you know, he thought he was trans.
02:22:49.000 People were telling him he was and so on.
02:22:51.000 And he said he remembered just before the anesthetic took hold.
02:22:56.000 He remembered just before he closed his eyes, thinking this is a mistake.
02:23:00.000 Oh, no.
02:23:01.000 Oh, God.
02:23:03.000 Oh, God.
02:23:04.000 Isn't that horrific?
02:23:05.000 He had to apologize to even more people than me because he was a big trans activist and he had to ring people up and apologize for losing them their jobs and stuff like that.
02:23:14.000 He's a really, really good bloke.
02:23:14.000 Oh, my God.
02:23:18.000 So, yeah, anyway, I can't remember how we got onto that.
02:23:21.000 Well, it's just the strangeness of our time where this is a controversial thing to talk about, that this can get you in trouble, and there's so many cowards out there that don't recognize that everything you're saying rings true.
02:23:33.000 That they won't talk to you, cast you out of their social circle, won't support what you're saying, which is super logical stuff.
02:23:44.000 Oh, I have a therapist friend, Stella O'Malley, someone who has been there for me throughout all of this.
02:23:53.000 And she nearly lost her license just for talking to me online.
02:23:58.000 I mean, that's how bad it is.
02:24:01.000 Even an exchange that's nothing to do with the subject can be used as a, oh, you know, Graham Linehan.
02:24:06.000 Oh, jeez.
02:24:06.000 They tried to take her license.
02:24:08.000 She's one of the people who are actually fighting the real fight.
02:24:11.000 She's part of a group called Genspect.
02:24:13.000 And they, you know, they recognize it as a mental illness and they try and treat it as gently and without harming the child as they can, you know?
02:24:24.000 And she's another person who's been vilified and lost opportunities, I'm sure.
02:24:29.000 But, you know, if you're a parent and you're going through this stuff, Genspect is the place to contact, you know, because they will show you how to deal with it properly.
02:24:39.000 I think for people in America to hear this is important because this is why the First Amendment is so critical here.
02:24:46.000 Your ability to express yourself is so critical.
02:24:48.000 And this is why social media can really help.
02:24:50.000 Because you can't just let people get destroyed for something they're saying that makes total sense.
02:24:57.000 You can't.
02:24:57.000 That doesn't make any sense.
02:24:59.000 And you can't just stand idly by and watch that happen and not open your mouth.
02:25:03.000 It's crazy.
02:25:04.000 That's how more of this is going to happen.
02:25:06.000 Yeah.
02:25:07.000 It's, you know, it's just a very bizarre time we're in right now.
02:25:12.000 I've never seen.
02:25:13.000 I mean, I've spoken to people, you know, in their 70s and 80s and said, have you ever seen anything like this?
02:25:18.000 You know, this level of confusion around an issue and threats and the press refusing to talk about it.
02:25:27.000 And have you ever seen anything like it?
02:25:29.000 And the answer is always no.
02:25:30.000 You know, it's been unprecedented.
02:25:32.000 And I think, you know, again, beyond the actual debate itself, we really have to talk about the internet properly.
02:25:39.000 We've all just floated into this world that's totally different, where our every pronouncement is potentially political.
02:25:45.000 If you walk down the road and someone takes out a smartphone, your life might be destroyed.
02:25:50.000 And we've just accepted, we're just accepting it.
02:25:53.000 There should be a little bit more thought around it.
02:25:55.000 There's a lot of thought, but there's nothing to do.
02:25:59.000 The genie is out of the bottle, and we're headed towards the cliff.
02:26:02.000 We're running.
02:26:03.000 We're a bunch of fucking buffalo running towards the edge of the cliff.
02:26:07.000 There is no stopping technological innovation at this point because it's of interest to national security.
02:26:15.000 You cannot stop the AI race and allow China to achieve AI before the United States does.
02:26:20.000 Whatever.
02:26:21.000 They've achieved it.
02:26:22.000 But I mean, whatever superintelligence.
02:26:24.000 The arms race.
02:26:25.000 The AI arms race.
02:26:26.000 That thing is happening whether you fucking like it or not.
02:26:30.000 At this point in time, I think anybody rationally looking at this would accept that.
02:26:35.000 You know, another thing that happened at my kid's school, my kid once came home to me and told me that.
02:26:40.000 He said, why am I studying?
02:26:41.000 I was like, what do you mean?
02:26:42.000 He says, well, you know, society is going to collapse.
02:26:50.000 And it's like this kind of thing is not helping a young kid.
02:26:54.000 Does Al Gore talk?
02:26:55.000 Yeah, but it's also like.
02:26:56.000 Does Al Gore talk about the economy?
02:26:58.000 Sure, sure.
02:26:59.000 And also, we have no idea what the economy is going to look like in 10 years because the AI is going to change that as well.
02:27:06.000 That's a lot.
02:27:07.000 It's going to get super weird.
02:27:08.000 Yeah.
02:27:09.000 And whoever's in control of AI, the amount of power that those people will have... you think tech companies have a lot of power now?
02:27:15.000 Oh, yeah.
02:27:16.000 Just wait until AI controls the entire government.
02:27:19.000 Oh, yesterday I had a conversation with Perplexity, which is, apart from this, a really good platform.
02:27:26.000 It has all the AIs in one search box, so you can switch between models.
02:27:32.000 I was arguing with it.
02:27:33.000 I've never argued with AI before, but I was actually having a full-blown argument with it because it simply would not give me back the information I needed in a non-ideological way.
02:27:44.000 What was the question about?
02:27:46.000 I was looking into more history of what was known online about the con man who's come after me.
02:27:52.000 And it's so difficult.
02:27:54.000 It's almost like the AI was acting as his PR guy.
02:27:57.000 Really?
02:27:58.000 Yeah.
02:27:59.000 And it's because AI uses the information that's on the internet.
02:28:03.000 And the information that's on the internet is information that this guy and a lot of trans-identified fellow travelers who work in computers have managed to keep in the first few pages of the internet.
02:28:17.000 So it's really interesting.
02:28:19.000 I found that an interesting exchange because for once AI wasn't telling me how brilliant I am.
02:28:25.000 But do you think that AI is in the infantile stages?
02:28:28.000 Like right now, it's in the adolescence of its understanding of how people manipulate facts.
02:28:35.000 And would that be a hurdle that it can overcome?
02:28:37.000 Oh, that's interesting.
02:28:38.000 You know what I'm saying?
02:28:39.000 Could it recognize that the sources of these particular articles are very biased and leaning towards this person because these people who wrote it are trans activists and this and that.
02:28:48.000 And this is the understanding.
02:28:51.000 Yeah, I'm sure that could be.
02:28:52.000 You're dealing with like a 13-year-old kid right now, whereas one day it's going to be a 50-year-old professor.
02:28:56.000 Sure.
02:28:57.000 And it's going to be feeding not just off the information that's available on the internet, which is freely available or conveniently available.
02:29:05.000 It's going to be feeding off much deeper.
02:29:06.000 But I think there's things like that.
02:29:07.000 It's also going to be able to translate all languages instantaneously.
02:29:11.000 So it's going to be able to understand the Chinese understanding of technological innovation in terms of their applications of electric vehicles and stuff that they jump far ahead with.
02:29:21.000 Yeah, yeah, yeah.
02:29:22.000 It's going to be able to understand exactly how people are speaking in Russia and Ukraine.
02:29:29.000 I already use it as a translator.
02:29:31.000 I put it between us and I say, I'm going to be talking to a Spanish person in a few minutes.
02:29:36.000 Could you translate?
02:29:37.000 And it will just, everything said in English, it would say back in Spanish.
02:29:40.000 Vice versa, yeah.
02:29:41.000 It's nuts.
02:29:42.000 It's great.
02:29:42.000 So good.
02:29:43.000 And you have to trust it.
02:29:45.000 It's literally a Tower of Babel in your pocket.
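The two-way relay described above is simple to sketch in code. Here is a minimal example, assuming a hypothetical translate() helper that stands in for whatever translation model is actually used; no specific product or API from the conversation is implied.

# Minimal sketch of the two-way interpreter loop described above.
# translate() is a hypothetical placeholder; a real version would call
# whatever translation model you actually use.

def translate(text: str, source: str, target: str) -> str:
    # Placeholder: a real implementation would send `text` to a model here.
    return f"[{source}->{target}] {text}"

def relay() -> None:
    # Alternate turns between an English speaker and a Spanish speaker,
    # translating each line for the other person, until someone types "quit".
    speaker, listener = "en", "es"
    while True:
        line = input(f"({speaker}) > ").strip()
        if line.lower() == "quit":
            break
        print(translate(line, source=speaker, target=listener))
        speaker, listener = listener, speaker

if __name__ == "__main__":
    relay()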
02:29:48.000 Did you hear they're communicating with each other?
02:29:50.000 Oh, yeah.
02:29:51.000 It's Sanskrit.
02:29:52.000 The owls thing?
02:29:53.000 Did you say that?
02:29:53.000 You see, they switched to Sanskrit?
02:29:55.000 No.
02:29:57.000 These AI systems are communicating with each other and they stop using English and switch to Sanskrit.
02:30:02.000 No.
02:30:03.000 Oh, yeah.
02:30:05.000 The one I heard.
02:30:05.000 Yeah.
02:30:06.000 They used emojis.
02:30:07.000 The one I heard was they got an AI obsessed with owls.
02:30:14.000 They fed it tons of information about owls, right?
02:30:17.000 And then they asked the AI to communicate with another AI using only numbers.
02:30:22.000 Okay.
02:30:23.000 So the AI used numbers to communicate with this other AI.
02:30:26.000 The other AI started getting obsessed with owls.
02:30:29.000 Oh, my God.
02:30:30.000 And there was nothing in the numbers that suggested to anyone outside of the conversation that anything about owls was going down.
02:30:37.000 We're watching it become a thing.
02:30:39.000 We're watching them become a living thing.
02:30:41.000 Well, you know, you look back at all those science fiction films.
02:30:43.000 I remember I used to just enjoy them.
02:30:46.000 Terminator, most famous one.
02:30:46.000 What's the one?
02:30:48.000 Oh, yeah.
02:30:49.000 And now you just think, holy shit, yeah, we are in that.
02:30:53.000 There's a very funny cartoon of two robots and they're about to kill someone.
02:30:58.000 And one says to the other, no, he said thank you.
02:31:02.000 And you cut to an old AI chat conversation where he's saying, thank you for the information.
02:31:07.000 He said, thank you, you know, and that's what it feels like a little bit.
02:31:10.000 Wow.
02:31:11.000 Yeah.
02:31:12.000 Well, it's something.
02:31:13.000 I mean, we're just guessing as to what its physical form is going to look like or what the results, what kind of effect it's going to have on civilization.
02:31:22.000 It might not be negative.
02:31:25.000 It might not be.
02:31:26.000 It might be able to figure out an end to a lot of things.
02:31:30.000 Like, what if super intelligence figures out like a real clear equation of how to completely eliminate the idea of impoverished communities forever?
02:31:41.000 Forever.
02:31:42.000 Like, this is totally fixable.
02:31:44.000 Sure.
02:31:44.000 And starts allocating resources.
02:31:46.000 Crime drops radically.
02:31:48.000 People going to universities increase radically.
02:31:52.000 People that figure out things that they want to do with their life, they're encouraged through a better school system that understands the human mind because it's all run through AI instead of being ideologically driven by these people that are professors.
02:32:03.000 Don't want to teach it one way forever.
02:32:05.000 What if it figures out a better way to teach people?
02:32:07.000 What if it figures out jobs that human beings are capable of doing?
02:32:12.000 What are you interested in?
02:32:13.000 Because AI can't do these things.
02:32:15.000 Yeah, yeah.
02:32:15.000 Because AI can do literally everything else.
02:32:18.000 There was an interesting, I see some people arguing about its use in creative work, you know?
02:32:24.000 But I find it really useful for bouncing ideas off and stuff like that.
02:32:28.000 I enjoy it a lot.
02:32:29.000 I enjoy using it.
02:32:30.000 It's like having a writing partner.
02:32:31.000 Right, right.
02:32:32.000 But I was watching a movie.
02:32:35.000 I can't remember what it was called.
02:32:36.000 Ray Liotta was in it.
02:32:38.000 And this actress, Jennifer or something.
02:32:40.000 But anyway, it was like a raunchy sex comedy, probably the last raunchy sex comedy they made.
02:32:46.000 And there's a bit in it where she is talking to the love interest of the film and she's being all sarky and coming up with these snappy put-downs, right?
02:32:57.000 It could have been written by AI.
02:32:58.000 It was written about 20 years ago, but it could have been written by AI.
02:33:02.000 So I think the people who are really scared of AI are the people who write like AI.
02:33:08.000 For sure.
02:33:09.000 Yeah, for sure.
02:33:10.000 If you're writing some CBS drama...
02:33:12.000 Generic bullshit.
02:33:13.000 Yeah.
02:33:15.000 You're going to lose your job.
02:33:16.000 Those writers are going to lose their jobs.
02:33:17.000 But if you have a point of view that's unique and interesting, you'll never be in trouble.
02:33:22.000 Well, also, I think people are always going to want to see, as a human being, I want to see a thing made by a human being.
02:33:28.000 I still do.
02:33:29.000 I always will.
02:33:30.000 Sure.
02:33:30.000 I want to see a frying pan that a human being forged.
02:33:35.000 I'll buy a cast iron frying pan.
02:33:37.000 I'll buy a hardwood cutting board that some guy made.
02:33:40.000 You know what I mean?
02:33:42.000 I like stuff that people make.
02:33:44.000 I think that's important.
02:33:45.000 It makes me feel better.
02:33:47.000 Yeah.
02:33:47.000 And there's going to be a lot of that stuff.
02:33:50.000 Live performances are always going to be a thing.
02:33:51.000 People are always going to want to see bands live and comedians live.
02:33:56.000 You're going to want to still see stuff.
02:33:57.000 People are still going to want to see musicals and see plays because there's a magic to live performance.
02:34:03.000 But boy, the actual art of making a film, there's going to be an opening where a lot of interesting creative minds, like some of these people that are making the funniest fucking memes, you're like, who made that?
02:34:14.000 Who made it?
02:34:16.000 It just showed up in my text message.
02:34:18.000 Somebody sent me a thing and I'm laughing at it.
02:34:20.000 I have no idea what the origin is.
02:34:22.000 I have no idea what fucking nerdy genius was sitting at his.
02:34:25.000 I don't know.
02:34:26.000 I'll put Trump in a situation where it's a tilde in his hand.
02:34:26.000 This is really very funny.
02:34:30.000 All of a sudden, everyone's laughing and everyone's passing it on.
02:34:33.000 So what about films?
02:34:34.000 What about some genius person who's kind of a little out there, can't figure out a way how to politically navigate Hollywood, but they have these ideas for stories in their heads that are fucking wild.
02:34:44.000 And they sit down and they bang these things out.
02:34:47.000 The only worry I have with it creatively is that sometimes the strictures around a thing are actually good, right?
02:34:55.000 Like that's why I personally, I hope you don't mind, but personally, I prefer Seinfeld to Curb Your Enthusiasm.
02:35:02.000 How dare you?
02:35:05.000 Because for me, the strictures they had under the studio system were so tight that their creativity came out in how they got around it.
02:35:13.000 So you had the famous masturbation episode that never mentioned the word.
02:35:16.000 Right, right, right.
02:35:17.000 Whereas once you lose those strictures, you can be a lot freer and sometimes messier and so on.
02:35:22.000 Now, with Curb, Larry David did great, but someone who doesn't know the business or someone who doesn't know that, hey, not everyone who says this doesn't work hates your work and wants to destroy it.
02:35:34.000 It's actually because that bit doesn't work and you should change it.
02:35:39.000 You do sort of need people like that.
02:35:40.000 If my shows went out the way I originally wanted them, no one would watch them.
02:35:46.000 You know?
02:35:46.000 Because I would make bad decisions.
02:35:48.000 Well, that's why editors exist when you're writing a book.
02:35:50.000 And I had this brilliant producer... we just wanted to be funny when we wrote Father Ted.
02:35:54.000 So our stuff was just really wild.
02:35:57.000 And we had this brilliant producer, Geoffrey Perkins, a lovely man who passed on a few years ago.
02:36:04.000 And we wanted this silly, stupid theme music because we were saying, no, we're making fun of sitcoms.
02:36:10.000 This is an anti-sitcom, you know?
02:36:12.000 So we want this plinky plunk stupid music.
02:36:15.000 And he looked really hurt and he said, why do you want to make fun of your characters?
02:36:20.000 People will love these characters.
02:36:22.000 And that was the moment I realized, oh, okay, not everything has to be funny.
02:36:25.000 Not everything has to be all guns out.
02:36:28.000 Bam, bam, bam.
02:36:29.000 Laugh, laugh, laugh.
02:36:30.000 Right.
02:36:30.000 It's the overall product.
02:36:32.000 A bit of heart is good.
02:36:33.000 A few other things to keep people, you know.
02:36:35.000 Yeah.
02:36:36.000 Actually, that's the other thing I wanted to talk to you about, which was NewsRadio, because you, of course, are a veteran of studio sitcoms.
02:36:44.000 Yeah.
02:36:44.000 And you worked with Hartman.
02:36:46.000 But the funny thing about when we were starting writing comedy is that we were hugely influenced by these DVDs of Saturday Night Live that would only show tiny clips of the people involved.
02:37:00.000 So we had the best of Phil Hartman, the best of this, the best of that.
02:37:03.000 And the best of Phil Hartman, we would just see these tiny moments out of much longer sketches because they couldn't afford to pay the star again.
02:37:10.000 So you just see these tiny moments.
02:37:12.000 And it's one of the funniest DVDs.
02:37:14.000 We used to love it.
02:37:15.000 He was great.
02:37:16.000 He was so good.
02:37:17.000 I'm so sorry that it happened.
02:37:19.000 It was a terrible tragedy.
02:37:20.000 Our main actor on Father Ted, he died the day after the last episode was shot.
02:37:25.000 Oh, boy.
02:37:26.000 We had the wrap party.
02:37:27.000 He went home, died the next day.
02:37:30.000 And we were editing the show while he was dead.
02:37:35.000 So he was still alive in the show.
02:37:38.000 And we were editing it, and he was gone.
02:37:40.000 It was the craziest thing.
02:37:43.000 I think that's how it felt when I watched Phil on TV.
02:37:45.000 Strange.
02:37:46.000 Well, someone said on The Simpsons team, they said they couldn't write the Lionel Hutz character and all the other characters he played when he died because they would just hear his voice, and it didn't sound right if it wasn't in his voice.
02:37:59.000 It was real sad.
02:38:00.000 Yeah, it was the saddest.
02:38:03.000 The sketch I remember on Saturday Night Live was him reading the sex book by Madonna as Charlton Heston.
02:38:12.000 I like my vagina.
02:38:17.000 It's so funny.
02:38:20.000 He did a little stand-up.
02:38:21.000 Did he?
02:38:22.000 Yeah, he would warm up the crowd sometimes.
02:38:23.000 He had bits.
02:38:24.000 Oh, okay.
02:38:25.000 They loved it when he would come out there too and do it.
02:38:28.000 He was such a loved guy.
02:38:29.000 Yeah.
02:38:30.000 He was such a professional, too.
02:38:31.000 Like, made all of us feel lazy.
02:38:33.000 He would have like a clipboard, so he would take the script whenever he would get it.
02:38:37.000 And he had a hole puncher, cha-chunk, right?
02:38:39.000 And he put it in like one of those folder boards.
02:38:43.000 Yeah.
02:38:43.000 And he had tabs for where his scenes were.
02:38:46.000 And he'd open like different color tabs for different scenes and have notes written on the script.
02:38:52.000 I'm like, Jesus, bro.
02:38:53.000 I believe that's what Nicholson's like on a set.
02:38:55.000 He's like, yeah, they wouldn't shoot, you know, when they shoot the other person, he wouldn't scarper and go for lunch or something.
02:39:03.000 He would stay there and do the scene again.
02:39:06.000 And someone said, why do you keep doing this?
02:39:08.000 You don't have to do every scene when it's on other people, in a crowd scene or the jury or something.
02:39:14.000 And he said, you don't understand.
02:39:15.000 I love acting.
02:39:17.000 That's awesome.
02:39:19.000 Have you ever seen the video where he's getting warmed up for the scene in The Shining where he comes through the door with the axe?
02:39:24.000 That brilliant Kubrick documentary, yeah.
02:39:26.000 He works himself into a frenzy in the hotel room.
02:39:30.000 Yeah.
02:39:31.000 Yeah, it was a documentary about Kubrick.
02:39:34.000 That's right.
02:39:34.000 Arena, I think it was.
02:39:35.000 Yeah.
02:39:36.000 Yeah.
02:39:37.000 I always felt he was miscast in that, though, don't you?
02:39:39.000 Really?
02:39:40.000 Do you really?
02:39:41.000 Well, he was never the ordinary guy.
02:39:42.000 You're supposed to be possessed.
02:39:44.000 According to the Stephen King book.
02:39:45.000 Well, Stephen King's not that bad.
02:39:46.000 Yeah.
02:39:47.000 Yeah, but if you just take the movie as an extraordinary film, I think they're two different things.
02:39:54.000 The Stephen King book is great, but the movie is so fucking good, man.
02:39:58.000 And Jack Nicholson plays a guy that's kind of barely keeping it together until he gets to the house, which I like that version of the story.
02:39:58.000 Yeah.
02:40:05.000 The Stephen King one is very different because the guy's not that fucked up.
02:40:09.000 He becomes way more fucked up as the book goes on.
02:40:12.000 It's like it's a gradual decline.
02:40:13.000 He's very much an alcoholic in the book.
02:40:16.000 But he's not that violent and crazy.
02:40:18.000 Like it takes a while to work him into that.
02:40:21.000 Yeah, yeah, yeah.
02:40:22.000 So, yeah.
02:40:22.000 Yeah.
02:40:23.000 No, I love, I love Kubrick.
02:40:25.000 The thing about Kubrick that always confuses me, though, was, you know, the famous thing about him getting the two secretaries to write out all work and no play makes Jack a dull boy.
02:40:32.000 Do you not notice?
02:40:33.000 Did he really do that?
02:40:34.000 He did this, yeah.
02:40:35.000 That huge pile was written by two, I think, secretaries.
02:40:38.000 So that huge pile was really all typed out by two women.
02:40:42.000 Yeah.
02:40:43.000 And the only fucking copier.
02:40:46.000 I know.
02:40:47.000 But it's because he found.
02:40:49.000 But then, then, because that's like verisimilitude, but then he shoots Vietnam on the London docks.
02:40:49.000 Well, exactly.
02:40:56.000 It's like, wait, you make these two women write out all work and no play, and then you're shooting Vietnam, you know, in London.
02:41:03.000 That movie was strange, man.
02:41:04.000 He did that.
02:41:05.000 Full metal jacket.
02:41:06.000 Really?
02:41:06.000 Yeah.
02:41:06.000 He was scared of flying.
02:41:08.000 Ah!
02:41:09.000 He was scared of flying.
02:41:10.000 He lived in the UK, and so he didn't want to fly.
02:41:12.000 So that's why it was shot there.
02:41:15.000 Wow.
02:41:16.000 Is that crazy?
02:41:16.000 That's crazy.
02:41:18.000 Yeah.
02:41:19.000 So, anyway.
02:41:20.000 That dude, he's a fascinating character, man.
02:41:24.000 Yeah, he was really interesting.
02:41:26.000 It was interesting.
02:41:27.000 It was quite sad.
02:41:28.000 Malcolm McDowell felt a bit betrayed by him when he finished A Clockwork Orange.
02:41:31.000 Why?
02:41:32.000 Because they had a very intense relationship as actor and director when they were working on it.
02:41:38.000 And as soon as the film was over, Kubrick just lost interest, moved on.
02:41:42.000 And, you know, Malcolm McDowell was this young actor.
02:41:45.000 Oh, he wanted to be his friend?
02:41:46.000 Yeah.
02:41:46.000 Yeah.
02:41:47.000 Margaret Warring.
02:41:48.000 Oh, okay.
02:41:49.000 Oh, it was one woman.
02:41:50.000 Margaret Warrington to type it on each one of the 500-odd sheets in the stack.
02:41:55.000 What's more, he also had Warrington type up an equivalent number of manuscript pages in four languages, French, German, Italian, and Spanish, for foreign releases of the film.
02:42:05.000 For these, he used idiomatic phrases with vaguely similar meanings.
02:42:10.000 Wow.
02:42:11.000 A bird in the hand is worth two in the bush.
02:42:14.000 That was one of them.
02:42:15.000 Don't put off until tomorrow what you can do today.
02:42:18.000 The early bird gets the worm.
02:42:20.000 Even if you rise early, dawn will not come any sooner.
02:42:23.000 Wow.
02:42:24.000 See, there's things I don't like about Kubrick.
02:42:26.000 I don't like that he did that to him.
02:42:27.000 That's nuts.
02:42:27.000 Why'd you do that to that lady?
02:42:28.000 That's a terrible waste of time.
02:42:31.000 Yeah, yeah.
02:42:32.000 He had a grudge against her for sure.
02:42:34.000 That's a nutty thing.
02:42:34.000 Yeah, yeah.
02:42:35.000 But he was a nutty dude, man.
02:42:37.000 There's all this weird symbolism that he'd put in his films, and all the different things in The Shining that lead people to believe it's some sort of an exposé on the moon landing conspiracy.
02:42:48.000 Oh, really?
02:42:48.000 I didn't know that one.
02:42:49.000 There's so many wild things because people knew that he had like symbolism in his films that was hidden.
02:42:56.000 And everything was very clever and layered.
02:42:59.000 There was so much stuff to it.
02:43:01.000 I can't believe Stephen King didn't like The Shining, but to me, it's like, God, it's so great as a film.
02:43:08.000 I get it.
02:43:09.000 It's not what you wrote down.
02:43:10.000 Yeah.
02:43:11.000 But damn, that's good.
02:43:12.000 Yeah, no, it is a great adaptation, I would say.
02:43:14.000 It's so good.
02:43:15.000 It's such a good movie.
02:43:17.000 You just can't think of it in terms of where it came from.
02:43:20.000 You know, you can't compare it to the original story.
02:43:22.000 You got to think of it as a whole.
02:43:23.000 It as a whole is amazing.
02:43:25.000 Oh, yeah.
02:43:25.000 I always felt, you know, when people adapt things, if something doesn't work in the book, I hate it when they bring it over to the film.
02:43:31.000 I mean, you know, people will, again, disagree with me, but I hated the ending of No Country for Old Men.
02:43:37.000 Did you really?
02:43:38.000 Yeah, because you've got a whole film that sets up the final battle between, what's his name?
02:43:44.000 That main actor, who's great.
02:43:45.000 I love him and everything.
02:43:46.000 Yeah.
02:43:47.000 And Banderas or whoever did that.
02:43:50.000 It wasn't Banderas, who was it?
02:43:51.000 The guy who did the crazy guy, right?
02:43:53.000 The guy with the big...
02:43:56.000 Javier.
02:43:56.000 Wadi Wahi Wah.
02:43:58.000 How do you say his last name, Jamie?
02:43:59.000 Bardem.
02:44:00.000 Javier Bardem.
02:44:00.000 That's him.
02:44:01.000 The whole film's been set up as a showdown between these two guys.
02:44:04.000 Right.
02:44:05.000 And it happens off-screen.
02:44:06.000 It doesn't even happen off-screen.
02:44:08.000 He's killed by some random people.
02:44:10.000 And I get that he's saying violence is unexpected.
02:44:14.000 You can't defend yourself against it.
02:44:15.000 It's not something that has a neat story ending.
02:44:19.000 But I still didn't like it.
02:44:21.000 I was still, I want to see that showdown, you know?
02:44:24.000 I know I'm missing the point.
02:44:25.000 I know I'm missing the point.
02:44:26.000 I know what you're saying, and I agree.
02:44:27.000 And yet I still love the movie.
02:44:29.000 Yeah, well, up until that point.
02:44:30.000 Because it was so good, I gave it a pass on the weirdness of the ending.
02:44:34.000 Yeah, yeah, yeah.
02:44:35.000 Because it was so good.
02:44:35.000 I get you.
02:44:36.000 It's amazing up until that point.
02:44:38.000 Yeah, I think.
02:44:39.000 I mean, the book is actually great as well.
02:44:41.000 The book is written just like that.
02:44:42.000 It's just someone's asked the Coen brothers, how do you adapt a film?
02:44:47.000 And I think Joel said, Ethan holds the book open, and I'm at the typewriter.
02:44:52.000 Basically, they just write it out again.
02:44:56.000 Adapting a film.
02:44:58.000 I mean, what they've done, when you look at the course of their career, so much of their stuff was so absurd, but yet dead on, believable, like ridiculous, but still I'm with this.
02:45:12.000 I believe it.
02:45:13.000 And also the strange thing they did in later films where it's almost like they're creating a fake Hollywood where Clark Gable plays chain gang members and Bogart plays a barber.
02:45:26.000 They almost created a sort of shadow Hollywood, which I really love.
02:45:32.000 You know, The Hudsucker Proxy, which is basically a Preston Sturges film with all the fast-talking dames and all this sort of stuff, you know, or Frank Capra.
02:45:41.000 No, not really Frank Capra, but Sturges and Billy Wilder and people like that.
02:45:44.000 And it's just great.
02:45:45.000 They just kind of love movies.
02:45:48.000 I've always adored this.
02:45:49.000 There's always going to be a place for that, right?
02:45:51.000 There's always going to be a place, no matter what happens with AI, I'm going to want to know that some people made something.
02:45:56.000 Yeah.
02:45:57.000 Point of view.
02:45:58.000 Point of view.
02:45:58.000 That's the thing.
02:45:59.000 That's the thing AI can't give you.
02:46:02.000 There's going to be a bunch of people lying too, saying that they didn't write it with AI and then they definitely did.
02:46:06.000 There's going to be a lot of scandals.
02:46:08.000 There's going to be a lot of scandals.
02:46:10.000 Well, now they can find it out.
02:46:12.000 You can just click a button and it'll tell you what's written by AI and stuff, you know.
02:46:15.000 But that's only if you're lazy.
02:46:17.000 So that's if you're lazy if you copy and paste.
02:46:20.000 Yeah, you can put a bit of thought into it.
02:46:21.000 But if you just copy it, I bet it's not going to really know.
02:46:25.000 It'll suspect you.
02:46:25.000 No.
02:46:26.000 Like, how'd you learn how to write this good, you motherfucker?
02:46:29.000 I think that's what everybody's thinking now every time they read.
02:46:32.000 I have noticed that good writers are becoming better.
02:46:36.000 And brilliant writers are becoming incredible.
02:46:39.000 Well, it's a challenge.
02:46:40.000 There's a weirdness that's going to happen, this uncanny valley where you're not going to be able to tell, in writing as well as in visual stuff.
02:46:47.000 But I think they're going to pass that real quick.
02:46:50.000 I think it's not just going to be that.
02:46:51.000 There's going to be another problem.
02:46:52.000 And the other problem is immersive experiences.
02:46:55.000 I think the moment they create a human neuro interface, immersive experiences are going to be so difficult to walk away from.
02:47:03.000 If you think that you're addicted to your phone now, wait until you wear it on your head and it makes you orgasm.
02:47:12.000 For real.
02:47:12.000 For real.
02:47:13.000 Yeah, for real.
02:47:14.000 That's coming.
02:47:15.000 All this stuff is coming.
02:47:16.000 You're going to be able to exist in a world that doesn't, that's not real.
02:47:21.000 And once they figure out how to get images in your mind that you can see, and they've already started doing stuff like this.
02:47:28.000 This is very experimental in terms of shapes and showing people different things.
02:47:34.000 So allowing people to see things that aren't there.
02:47:36.000 So this is Pong, right?
02:47:41.000 And now we have the Unreal Engine.
02:47:44.000 Where you have video games that look like real life.
02:47:46.000 They look like a movie.
02:47:47.000 This is what's going to happen with us.
02:47:49.000 And it's going to be immersive.
02:47:51.000 Someone said an interesting thing.
02:47:52.000 Can you imagine being a schizophrenic in these years?
02:47:55.000 Oh, good point.
02:47:57.000 You know, because they used to say, they used to say, you know, oh, there's microbes in my beard that are transmitting things to the government.
02:48:06.000 You know, you might be right.
02:48:08.000 You might be right.
02:48:09.000 There's a chip in my head and Elon Musk is talking to me.
02:48:11.000 Well, actually, to bring it back to my...
02:48:15.000 To bring it back to my talking, my hobby horse.
02:48:20.000 I saw there's a brilliant woman who would be who you should definitely at least follow on Twitter.
02:48:26.000 Her name is Exulansic.
02:48:28.000 How do you spell that?
02:48:29.000 E-X-U-L-A-N-S-I-C.
02:48:32.000 And her name on Twitter, I think, is TT Exulansic.
02:48:35.000 And all she does is she plays videos by trans men who are talking about the medical complications.
02:48:42.000 And that's all they do.
02:48:43.000 And none of them seem to have any insight into the fact that they didn't actually have to do any of this, that they didn't have to get these procedures, that they could be, and all they do is they catalog the amount of time they have to keep going back into the hospital to get something fixed.
02:49:02.000 Because there's no such thing as a successful trans surgery.
02:49:04.000 You know, there's people who are happy with it, but that reminds me of something else.
02:49:09.000 I wanted to say one thing about Jazz Jennings.
02:49:12.000 But anyway, these girls, they just talk about their endless medical problems.
02:49:17.000 They don't seem to realize it's because of their trans identity, you know.
02:49:20.000 And there's one girl who appeared on, and my God, this really blew me away.
02:49:26.000 She, you know, she has the facial hair that comes with being a trans man and stuff.
02:49:30.000 And she's talking, it's actually one of the more recent ones.
02:49:33.000 It's quite, it's short-ish.
02:49:33.000 You can actually show it.
02:49:35.000 But she's saying that one of the things that I didn't realize I'd be getting is bug paranoia because she has all this facial hair growing in different parts of her body that she didn't have before.
02:49:49.000 And she's feeling it itch and tickle at different, in a way that she's not used to.
02:49:55.000 She's become convinced that there's a bug on her, you know?
02:49:58.000 And you know, the way most, you know, a lot of women are scared of bugs.
02:50:01.000 Can you imagine that?
02:50:02.000 Now you're in, you've put yourself into a situation where you're going to have this bug feeling for the rest of your life.
02:50:07.000 Do you know what I mean?
02:50:09.000 That's the least of your problems.
02:50:10.000 You can just shave your beard.
02:50:11.000 Well, yeah, but like, you know.
02:50:13.000 The problem is the beard is part of the identity, right?
02:50:15.000 Detransitioners.
02:50:16.000 Yeah, detransitioners.
02:50:17.000 I mean, you know, there's a lovely woman I know, a Scottish woman who detransitioned, you know, and she has to shave every day.
02:50:24.000 You know, she's just got a delicate woman's face.
02:50:26.000 She has to shave every day, you know.
02:50:29.000 And I've heard another, another, there was another trans-identified person who desisted, but they look like a balding middle-aged man, you know, in their 20s, right?
02:50:40.000 But they've got rapid balding, whatever.
02:50:44.000 From taking testosterone.
02:50:46.000 From taking testosterone, yeah, you know.
02:50:49.000 And, oh, damn, I forgot what my point was going to be about that.
02:50:53.000 Sorry, sometimes there's so much stuff to say.
02:50:55.000 They detransitioned.
02:50:56.000 They detransitioned.
02:50:57.000 Oh, yeah, I know what it was.
02:50:58.000 And she said that she, she, one thing she misses, and she didn't realize she was saying goodbye to it, was the easy company of women.
02:51:07.000 Because women are guarded in her presence and different because she looks like a man.
02:51:12.000 So she's lost that connection to women, you know?
02:51:15.000 There's so many awful things to this movement.
02:51:18.000 I could talk for another five hours and never get to it.
02:51:20.000 Do you feel like it's consuming your life?
02:51:22.000 Well, it had to in a way because I wasn't allowed to do anything else.
02:51:25.000 I tried to do comedy.
02:51:26.000 They wouldn't let me.
02:51:27.000 I tried to, I had this musical that would have been my pension, as I say.
02:51:33.000 They wouldn't let me do it.
02:51:35.000 You tried to do comedy, tried to do stand-up.
02:51:35.000 What did you say?
02:51:37.000 Oh, no, I did stand up.
02:51:38.000 That was just for fun.
02:51:40.000 I did stand up for a while.
02:51:41.000 What did you mean?
02:51:42.000 You mean comedy sitcoms?
02:51:44.000 Yeah, your original work.
02:51:46.000 Yeah, I've basically been blacklisted, you know?
02:51:48.000 So, you know, because you'd be a great comic.
02:51:52.000 You'd be fun.
02:51:53.000 Oh, thank you.
02:51:54.000 I think you'd be great at it.
02:51:55.000 I did a bit of stand-up and I enjoyed it.
02:51:58.000 I've got the mind for it, clearly.
02:51:59.000 I'm 57, you know.
02:52:01.000 So am I. Yeah, but you have had a lot of, you've come up through the clubs and you've done your proper.
02:52:06.000 It's fun.
02:52:07.000 Yeah.
02:52:08.000 Maybe you'll get it.
02:52:09.000 You'd figure it out easy.
02:52:10.000 Okay.
02:52:11.000 You get the hang of it.
02:52:12.000 Yeah.
02:52:12.000 You get it.
02:52:13.000 Yeah.
02:52:13.000 Well, I do.
02:52:14.000 I did enjoy it.
02:52:15.000 It was nice.
02:52:16.000 The thing I don't, I find it hard to get used to is saying the same thing as if I've just thought of it.
02:52:22.000 Right.
02:52:23.000 I like to.
02:52:24.000 You have to think about it.
02:52:25.000 It's a mindset.
02:52:26.000 Yeah.
02:52:27.000 So what you have to do is every time I think about, every time I'm talking about a thing, I just only think about that thing.
02:52:33.000 I don't think, oh, I'm saying this again the exact same way.
02:52:37.000 I know how to say it, but what I'm thinking about is that thing, like genuinely thinking about that thing.
02:52:44.000 So they know you're actually locked in.
02:52:44.000 Right.
02:52:47.000 People can tell.
02:52:48.000 They can tell that you're saying the words, but thinking about something else.
02:52:51.000 They can tell.
02:52:53.000 There's a weird thing that's going on with comedy that's unaddressed.
02:52:56.000 You can't measure it.
02:52:57.000 You can't put it on a scale.
02:52:59.000 But there's a sense that people have that's not being addressed, whatever it is.
02:53:03.000 I don't think it's entirely visual.
02:53:05.000 I think there's a feeling.
02:53:06.000 Yeah.
02:53:07.000 There's a vibe you get.
02:53:08.000 So you know when someone's not bullshitting.
02:53:10.000 That's why comedy works.
02:53:11.000 And you miss that vibe on television, unfortunately.
02:53:14.000 It's weird.
02:53:15.000 Like watching television stand-up comedy is like 60% of the actual show, maybe 70%.
02:53:20.000 Still great.
02:53:21.000 You know, when you get a chance to see someone like Dave Attell, who you maybe don't get a chance to see in the clubs, you haven't been able to see him live.
02:53:29.000 Yeah, or any of these guys.
02:53:30.000 It's great.
02:53:31.000 But trust me, see him live and you'll be blown away.
02:53:35.000 It's like taking water out of your ears and now you can hear music.
02:53:39.000 Yeah, yeah.
02:53:40.000 You get the whole thing.
02:53:41.000 You get this thing that's going on where he's hypnotizing everybody.
02:53:44.000 Yeah, yeah.
02:53:45.000 I was lucky enough to see Chappelle a few times in London, you know, and that was a treat.
02:53:50.000 Oh, that's great too, because if you've seen him in London, too, you're seeing him in an arena.
02:53:53.000 So you're seeing him like with that polished, locked-down set.
02:53:58.000 I saw him both ways.
02:53:59.000 I saw him in a small secret club as well because he was rehearsing his Saturday Night Live.
02:54:04.000 Nice.
02:54:05.000 Really special.
02:54:06.000 He's such a good dude, too.
02:54:06.000 He's the best.
02:54:07.000 He did something I've not seen him do since, which is one of the funniest things I've ever seen, which was, who's the Johnny B. Goode guitarist? Chuck?
02:54:20.000 Chuck Berry's sex tape.
02:54:21.000 Oh, yeah.
02:54:23.000 That's so funny.
02:54:25.000 When he says, I didn't even know Chuck Berry was in it until he appeared.
02:54:30.000 Oh, God, that was funny.
02:54:32.000 He's like, I love, I love him.
02:54:33.000 And he's been great throughout as well because he's been someone who says the same stuff in a very funny way and expresses like, you know, he was actually.
02:54:44.000 He's a very thoughtful person.
02:54:45.000 He's a very thoughtful person.
02:54:46.000 And he said that great thing where he said, I know trans rights activists make up words to win arguments.
02:54:52.000 And that's what it is.
02:54:55.000 If you start calling real women cis women, then it's an easy way to disparage them and to put them off.
02:55:00.000 You know what I mean?
02:55:01.000 So, yeah.
02:55:02.000 Anyway.
02:55:04.000 Is there a way to live a normal life for you right now?
02:55:08.000 I mean, you're on this battlefield, this constant, consistent battlefield.
02:55:13.000 I mean, you must, I asked you before if you felt like it consumed you, but I mean, is there a way to transition out of it, to use their terms?
02:55:22.000 I'm currently transitioning, yeah.
02:55:24.000 No, I, I, Rob Schneider, uh, who has just shown me incredible kindness and brought me over to uh work on a few projects for him.
02:55:33.000 Oh, that's great.
02:55:34.000 Something I always wanted to do anyway.
02:55:35.000 I've always enjoyed film rewrites and stuff like that.
02:55:39.000 So, yeah, so that's gonna, and that's helped me out because it's getting me, my visa is three years, and my aim is to become so useful to the Americans that they won't let me go.
02:55:48.000 You know, well, hopefully, this podcast will help.
02:55:50.000 Hopefully, hopefully.
02:55:52.000 Hopefully, people realize how fucking nutty it is over there.
02:55:54.000 And this is what I was scared of over here when tech censorship was in full bloom.
02:55:59.000 When what, sorry?
02:56:00.000 Tech censorship.
02:56:02.000 Before Elon Musk bought Twitter, when people like Meghan Murphy were banned for life for saying a man can never be a woman.
02:56:02.000 Right.
02:56:10.000 I was banned for the same thing.
02:56:12.000 I said men aren't women, though, and she was banned for saying men aren't women.
02:56:15.000 Yeah.
02:56:15.000 Nuts.
02:56:16.000 Men aren't women, though.
02:56:17.000 Banned for life.
02:56:18.000 And because Twitter banned me, that was again reported in The Guardian as if I'd been harassing people.
02:56:24.000 Twitter actually said he was misusing the platform.
02:56:28.000 And they never explained what that meant.
02:56:31.000 Misusing.
02:56:32.000 So misusing, but everyone just thinks, oh, he was abusing people.
02:56:35.000 Yeah.
02:56:37.000 It's insidious, and it was all encompassing.
02:56:42.000 There was no social media that was free of it.
02:56:44.000 And the only places that were free of it, for me, feels like they got attacked.
02:56:49.000 And this is what I mean by that.
02:56:50.000 Like, if you went over to any of those alternative places like Gab or any of these, they were flooded with racism and xenophobia and homophobia and flooded in a way where I don't necessarily believe it's all organic.
02:57:11.000 I think it's a great way to sabotage a platform that might attract, like, if you wanted to have a platform, if you were running a platform and your platform is incredibly left-wing, like fully censoring pertinent data that would help Trump or help the right-wing people.
02:57:30.000 If you were running that platform and also, I don't know, a new platform came about and a bunch of people were talking about jumping ship.
02:57:36.000 How hard would it be to just sabotage that platform and just start just the most horrible, racist things?
02:57:43.000 You say it over and over and over again, flood it, flood it with hate, flood it with terrible messages, flood it with disinformation and bad faith arguments and just outright lies.
02:57:57.000 You can do whatever you want.
02:57:58.000 If you have a good computer, you have a good computer crew that knows how to code things, you can use AI to push a specific narrative.
02:58:07.000 They've already done it.
02:58:09.000 They can crowdsource an attack on someone that's entirely bot created.
02:58:14.000 Why wouldn't you do that where there's a new social media platform?
02:58:17.000 So no social media platform got to exist that was free until Elon bought Twitter.
02:58:23.000 Because by that move, which a lot of people don't appreciate for how spectacular the result was, what a big difference it made.
02:58:31.000 Oh, it changed my life.
02:58:32.000 Because everyone was addicted to Twitter already.
02:58:35.000 Even the people that hated the idea that he was doing this.
02:58:38.000 And they were still going to use it.
02:58:40.000 They're addicted.
02:58:41.000 They're locked in.
02:58:42.000 So it's genius, really.
02:58:44.000 And then you let it go buck wild.
02:58:47.000 And you watch people just freak the fuck out.
02:58:47.000 Yeah.
02:58:50.000 I mean, you know, I notice a few things on Twitter I don't like.
02:58:50.000 Yeah, yeah.
02:58:54.000 I notice a little bit of racism and all that.
02:58:58.000 You notice a lot of that.
02:58:59.000 There's a lot of that.
02:59:00.000 But they don't know how much of it's organic, though.
02:59:00.000 Yeah.
02:59:03.000 Yeah.
02:59:03.000 Some of it for sure is.
02:59:05.000 Right, right.
02:59:05.000 I know it's a bit of a double.
02:59:06.000 But I think there's a lot of fake arguments that are designed to keep people at each other's throats and distract from the greater issues that we all have to deal with.
02:59:14.000 I think that's a real strategy that's being used not just by people in the United States, but by people outside the United States on the United States.
02:59:23.000 But the funny thing about it is you almost don't need to sabotage something to that extent because if you were to destroy a country's ability to know right from wrong, truth from lies, all these things, what better way would there be of doing it than the trans movement?
02:59:42.000 Where you can't even accept the evidence of your own eyes and say that's a man or that's a woman.
02:59:48.000 Like it is a destabilizing, it's destabilizing Western society.
02:59:53.000 And I think it will, I think that type of thing could easily be weaponized against us.
02:59:58.000 Well, I think there's a certain value in destabilizing a certain amount of society.
03:00:03.000 You want to keep people weak.
03:00:05.000 You want to make it so that a revolution is very difficult to obtain.
03:00:09.000 Yeah, you know, James Lindsay's theory about, I love, I find that a very compelling theory.
03:00:15.000 He said about what happened with Marxism and so on in the last 20, 20, 30 years, is that they gave up trying to persuade working class people to have a revolution because their lives were too good under capitalism, right?
03:00:31.000 No one wanted to be a revolutionary.
03:00:35.000 So what happened, he thinks, was that all these Marxists, all these left-wingers who really believed in the left-wing project or the communist project, whatever you want to call it, socialist project, they all started going into teaching and cultural places because they wanted to change the culture that way.
03:00:58.000 And it's been incredibly successful, if that is true, you know?
03:01:02.000 I think it makes sense.
03:01:03.000 Yeah.
03:01:03.000 Well, this is what Yuri Bezmenov talked about in the 1980s.
03:01:06.000 Right.
03:01:07.000 He talked about using that on the American people.
03:01:10.000 The work had already been done.
03:01:11.000 He was saying in the 1980s.
03:01:13.000 Yeah, yeah.
03:01:13.000 Well, I think that's Lindsay's.
03:01:15.000 That's what Lindsay says.
03:01:16.000 There's a lot of people that think that.
03:01:17.000 There's a lot of people that have went through that university system and then came out on the other side and tried to be independent thinkers and realize how easy they get attacked.
03:01:24.000 And they're just like, there's something going on here.
03:01:26.000 This is not logical.
03:01:27.000 I'm worried about my kids going into university, and half of me wants to just protect them from it because, you know, like I know so many stories of people who went to university and came back with a trans identity, you know.
03:01:41.000 It's just easy to get indoctrinated in any kind of a group.
03:01:44.000 Yeah.
03:01:44.000 You know, it's when you're a young person, you can get sucked in.
03:01:47.000 Well, one of the big problems, I think, is that it feels to me, I don't know if it feels like this to you, but the 50s, they call it the invention of the teenager, right?
03:01:57.000 60s, 50s, 60s.
03:01:58.000 We had the 50s, 60s.
03:01:59.000 Then the 70s, people started thinking of rock music as art and so on.
03:02:04.000 And it continued like that for a while.
03:02:07.000 And there was always a very, very healthy youth culture, right?
03:02:10.000 Bowie is a good example, okay?
03:02:12.000 All his fans would go out dressed like, you know, in gender non-conforming ways, you know?
03:02:19.000 Now there's no figures like Bowie.
03:02:22.000 What we have instead is a political ideology instead of the old days where it used to be music, culture, where you could take your personality and find your tribe and throw yourself into a culture, right?
03:02:36.000 That doesn't seem to exist to the same extent that it did for kids.
03:02:39.000 I could be wrong, but it doesn't seem, there doesn't seem to be like who's, I believe recently it was the first time in forever that, or I think the first time since there were charts maybe, that there was no band in the charts, right?
03:02:55.000 No actual musical band.
03:02:56.000 It was all individuals, you know?
03:03:00.000 So, you know, the whole thing about bands, about being in a gang, all that cool stuff, it's gone.
03:03:06.000 Do you think that's a side effect of social media?
03:03:09.000 Yeah, definitely.
03:03:10.000 Because why, you know.
03:03:11.000 Because you don't have to find your tribe in like a physical form where you all get together and enjoy something together.
03:03:17.000 You know, like Insane Clown Posse.
03:03:20.000 They all go and they have, what is that called again?
03:03:24.000 They get together?
03:03:25.000 The Gathering of the Juggalos.
03:03:25.000 The group.
03:03:26.000 The jugglers.
03:03:27.000 The Gathering of the Juggalos.
03:03:28.000 Yeah.
03:03:28.000 That's it.
03:03:29.000 They go crazy.
03:03:30.000 They get together.
03:03:31.000 They call it family.
03:03:32.000 Like they feel like they're around other misfits and they feel great.
03:03:36.000 And a lot, you know, one thing I definitely want to make clear is when I'm talking about trans activists being evil and so on, I'm not talking about all of them.
03:03:43.000 You know, there's a lot of good people who are mixed up with this.
03:03:46.000 And they see their trans friend and their trans friend is lovely and they want to protect them and think that people like me are hateful and will never accept them as human beings and so on.
03:03:56.000 That's not the case at all.
03:03:57.000 It's the ideology.
03:03:59.000 It's the ideology.
03:04:00.000 It's a lot of trans activists.
03:04:03.000 But as for trans people themselves, that's a whole range of different people.
03:04:07.000 It's like everything else.
03:04:08.000 Yeah.
03:04:08.000 Yeah.
03:04:09.000 You know?
03:04:10.000 In all walks of life.
03:04:11.000 But the problem with it as a movement is they won't call out the bad actors.
03:04:15.000 And they have to. If they're going to... Basically, my friend Artie Morty, who is a gay guy, a Canadian gay guy, he says the only reason that gay rights got accepted is because of what happened with NAMBLA, right, the North American Man/Boy Love Association, and PIE in the UK, the Paedophile Information Exchange, similar groups.
03:04:38.000 But, like, I always thought the Paedophile Information Exchange possibly shouldn't have called themselves the Paedophile Information Exchange.
03:04:45.000 The Man/Boy Love Association isn't any better.
03:04:47.000 It's a bit of a giveaway, you know?
03:04:49.000 But anyway, these two organizations started to argue that paedophiles should be a protected class, just like gay people.
03:04:59.000 And gay people ejected them very loudly, very clearly, and said, we don't have anything to do with that, you know?
03:05:07.000 Unfortunately, the same thing isn't happening at the moment.
03:05:10.000 There has to be a move from...
03:05:15.000 Yeah, I think that's worth it about your set.
03:05:17.000 I think that's what it is, man.
03:05:18.000 I think it's just the fear of being attacked is so strong for people that they just never man up.
03:05:24.000 Yeah.
03:05:25.000 for lack of a better term.
03:05:26.000 But there is also a huge reluctance to...
03:05:50.000 right?
03:05:51.000 So I signed this letter along with a bunch of other people.
03:05:53.000 They said no.
03:05:55.000 Within the day, they said no and cast us all as bigots again, you know?
03:06:01.000 So it's like there's a problem with legacy gay organizations.
03:06:05.000 They have to be rid of all these people who can't answer biological questions.
03:06:10.000 They have to be because they're in danger.
03:06:12.000 They're endangering the whole cause of gay rights.
03:06:15.000 Like however many years, what is it now?
03:06:18.000 '68, so 55, over maybe 60 years of gay rights, okay?
03:06:24.000 And they're in danger of throwing it all away because of their sudden obsession with a bunch of straight people, you know, because most trans-identified men are straight, you know?
03:06:36.000 All these trans men, these young girls going on to gay apps, they're straight.
03:06:40.000 They're straight women.
03:06:41.000 It is bananas.
03:06:43.000 Well, listen, Graham, I'm sorry all this happened to you, but I'm glad that we could have a place where you could tell your story because your story is very eye-opening.
03:06:54.000 And this is not what we'd want from a polite, respectable, and even progressive society, especially from a guy like you.
03:07:03.000 Well, thank you.
03:07:04.000 I really appreciate it.
03:07:05.000 I'm glad we got to do this.
03:07:05.000 My pleasure.
03:07:06.000 Let's do it again in the future when you've won your court case.
03:07:09.000 I forgot to bring my book.
03:07:11.000 I got your book at home.
03:07:11.000 It's right by my bed.
03:07:12.000 I just started it.
03:07:13.000 Okay.
03:07:14.000 I appreciate you very much, man.
03:07:16.000 It's called Tough Crowd.
03:07:17.000 Thank you.
03:07:18.000 You can get it on Amazon.
03:07:19.000 You can get the Kindle version of it.
03:07:22.000 You can get the audio version of it.
03:07:24.000 It's on Audible.
03:07:25.000 Yeah, I'm.
03:07:26.000 Oh, thank you.
03:07:27.000 I'm just reading.
03:07:28.000 It's a good book.
03:07:30.000 I'm also on Twitter at Glinner.
03:07:31.000 I'd like to get back some of my 400,000 followers that I left.
03:07:35.000 What is it?
03:07:36.000 Oh, Glinner.
03:07:37.000 G-L-I-N-N-E-R.
03:07:38.000 That's your Twitter?
03:07:39.000 That's my Twitter name as well.
03:07:40.000 Yeah.
03:07:41.000 All right.
03:07:42.000 Thanks, brother.
03:07:42.000 I appreciate it.
03:07:43.000 Thank you.
03:07:43.000 All right.
03:07:43.000 Bye, everybody.