TRIGGERnometry - January 21, 2024


Journalist CANCELLED From Women’s Group For… - Katherine Brodsky


Episode Stats

Length: 53 minutes
Words per Minute: 167.0
Word Count: 8,981
Sentence Count: 579
Misogynist Sentences: 7
Hate Speech Sentences: 7


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Katherine Brodsky joins Konstantin and Francis to talk about her experience with online hate and how she dealt with it. She also talks about how she was doxxed after starting a jobs group for women journalists, a spinoff of "Binders Full of Women."

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:01.000 Somebody basically posted a job opening at Fox,
00:00:05.000 which turned out to be absolutely sacrilegious.
00:00:09.000 So I put out what I thought was sort of a kumbaya post,
00:00:12.000 saying, hey, let's stay away from politics.
00:00:15.000 I'm sure it worked out very well, Katherine.
00:00:17.000 Everybody sort of went, ah, yes, kumbaya.
00:00:19.000 So people called me a white supremacist, Nazi-adjacent,
00:00:24.000 and that I'd collaborate with the Nazis and the Ku Klux Klan.
00:00:29.000 And it just got so extreme.
00:00:31.000 How do we fix this, Katherine?
00:00:34.000 Katherine Brodsky, tell us your cancellation story.
00:00:37.000 It's a question we ask all our guests at this point.
00:00:40.000 Because every guest has been canceled.
00:00:42.000 That's the qualification.
00:00:44.000 OK, well, my cancel story, which is now about two years ago.
00:00:49.000 And, you know, are you checking your watch to make sure it's been two years?
00:00:52.000 No, no, no, no.
00:00:53.000 OK.
00:00:54.000 That...
00:00:55.000 Sorry.
00:00:56.000 You know, it's something that...
00:00:59.000 It's funny kind of to talk about it now in some ways,
00:01:01.000 because at the time it was so intense what was going on for me,
00:01:05.000 because I've never encountered that sort of thing before in a personal way.
00:01:08.000 So what happened was I used to...
00:01:13.000 Well, I still kind of do.
00:01:14.000 I had a group for women journalists and it was a job board.
00:01:19.000 And it was a job group on Facebook.
00:01:23.000 And what we did was basically, you know, it was a spinoff of another group that was called kind of binders for women.
00:01:31.000 It was a joke sort of off of that.
00:01:35.000 Reference to Mitt Romney's...
00:01:36.000 Exactly.
00:01:37.000 Binders full of women.
00:01:38.000 Exactly.
00:01:39.000 There's some debate as to whether it was or wasn't now.
00:01:42.000 But anyways, I decided, well, what does everyone want?
00:01:46.000 Jobs.
00:01:47.000 I started this group.
00:01:48.000 It grew to being about 30,000 members.
00:01:51.000 And it was just people posting jobs.
00:01:55.000 We didn't have any, like, personal discussions, nothing like that.
00:01:58.000 Just jobs.
00:01:59.000 And one day, somebody basically posted a job opening at Fox, which turned out to be, you know, absolutely sacrilegious.
00:02:12.000 And even the way she posted it was sort of apologetic.
00:02:15.000 We're trying to diversify the newsroom.
00:02:17.000 We're changing things.
00:02:18.000 But people just went and attacked this poor woman.
00:02:21.000 And I thought, you know, that was not right.
00:02:26.000 So I put out what I thought was sort of a kumbaya post saying, hey, let's stay away from politics.
00:02:34.000 We've had so much coming apart.
00:02:36.000 Let's come together.
00:02:38.000 I did not expect what sort of...
00:02:40.000 I'm sure it worked out very well, Catherine.
00:02:42.000 Everybody sort of went, ah, yes, kumbaya.
00:02:44.000 When I tell this story, everybody knows what's going to happen next.
00:02:47.000 Yeah.
00:02:48.000 But I really, truly did not see it coming.
00:02:50.000 I really didn't.
00:02:51.000 So people called me a white supremacist, Nazi-adjacent, you know, and that I'd collaborate with the Nazis and the Ku Klux Klan.
00:03:00.000 It just got so extreme.
00:03:02.000 And after that, I was also somebody who's always had believed in freedom of speech.
00:03:07.000 I mean, I still do.
00:03:08.000 But I did learn a lot from that because there is a difference between engaging with all speech and a difference between engaging in bad faith, good faith speech.
00:03:18.000 So at a certain point I realized, okay, this is not, this is not very helpful.
00:03:25.000 And there were like 660, no, 666 comments at the time where I sort of decided, okay, that's enough.
00:03:35.000 And everybody was attacking me and they said that inherently you can't take politics out of a group that's for women because by nature that makes it political.
00:03:48.000 And I said, okay, I did think about this for a little while and I said, well, I'll make it open to everyone.
00:03:54.000 I'll give a month's notice.
00:03:57.000 You can remove your content if you want.
00:04:00.000 If you want to start your own group, you can do that.
00:04:03.000 So, but that actually went even worse.
00:04:07.000 And that's when I started getting doxxed.
00:04:10.000 I had people emailing editors saying, you know, so that they would never hire me again since I'm a journalist.
00:04:19.000 And also people sending me like, I would get images of people, mobs with tiki torches, just vile kind of messages all around.
00:04:28.000 People would attack me, you know, and if I'd be on a panel or something, it would suddenly come up and people would sort of harass me.
00:04:37.000 And now I kind of feel about it, you know, sort of okay about it.
00:04:40.000 Not okay, but I think I've gone through that process and I've come on the other side.
00:04:47.000 But at the time, as somebody who is actually pretty much kind of a people pleaser in many ways, I'm not.
00:04:54.000 And like, I think we talked about this at some point, you're much more.
00:04:58.000 I'm in the top one percentile for disagreeable.
00:05:00.000 Yeah, you're disagreeable.
00:05:02.000 I can be disagreeable on things that I have very strong feelings about that are strong convictions, but I'm not generally trying to be disagreeable with people.
00:05:13.000 So it was it was just a very sort of at the time now I can use the word traumatic because I remember for a while I talked to people and I felt like I had to give them.
00:05:22.000 I wonder what they knew about me, what they Googled, what they thought.
00:05:26.000 And it's it's been kind of a long journey.
00:05:31.000 And as a result of all of this happening, though, I because I was already sort of struggling with my own sort of free speech and being able to articulate what I was thinking on in my head.
00:05:43.000 And I little by little started, you know, doing that more.
00:05:47.000 And I think that opened the watershed for it.
00:05:49.000 And I got to know a lot of interesting people.
00:05:51.000 And that's where I'm now.
00:05:54.000 Katherine, what do you think it says about our society that a post as innocuous as yours that was designed to be, in your words, kumbaya, can open the door to all this vitriol?
00:06:08.000 Well, I think there is.
00:06:10.000 I think there's a few things going on, because I think now that I look at it, a lot of it was really mobs and bullying.
00:06:17.000 That's at the essence.
00:06:19.000 That's what it is.
00:06:20.000 It's it's a mob and it's big enough that they're all sort of bullying people.
00:06:25.000 And when and they all sort of align with each other and therefore they confirm and reaffirm every thought that they have.
00:06:35.000 So they're not really able to think outside of the mob.
00:06:38.000 And it's that very human instinct that people have had throughout history.
00:06:46.000 But that's what I was saying, because people talk about, you know, cancel culture and they say, well, it's accountability culture.
00:06:53.000 And you know what? I agree. Sometimes it is.
00:06:56.000 Sometimes there there do need to be repercussions for certain things.
00:06:59.000 The problem that I have with it is that a post that is I do think is quite innocent, was intended to sort of do the right thing, gets this disproportionate reaction where there is no conversation.
00:07:14.000 There's no way out.
00:07:15.000 And and the punishment that's being doled out is kind of the same, no matter what the crime is, if there even was a crime.
00:07:22.000 And there obviously was no crime.
00:07:24.000 You were allowed to watch Fox News.
00:07:27.000 You're allowed to work for Fox News, but not in the eyes of a certain segment of society who you seem to royally piss off.
00:07:35.000 Right. And the thing is, it's not like I'm a big Fox News supporter myself.
00:07:40.000 I don't watch the Fox News. I don't like a lot of it.
00:07:44.000 But but if somebody wants to work for an organization, like why am I in the shoes of deciding who should apply for a job where?
00:07:54.000 Like if you go to LinkedIn, the LinkedIn, you know, you can post a job at any organization and nobody really has an issue with it as long as they're within the bounds of the law.
00:08:05.000 And for me, you know, when I you know, when I was being attacked, it's hard not to think, OK, maybe people have a point because so many people are going against you so aggressively.
00:08:15.000 And in my case, again, because I am somebody who's kind of a people pleaser in some ways, you know, I didn't want people to be mad at me.
00:08:23.000 And so I did think a lot about, you know, where they might have a point where they might not.
00:08:28.000 And where I realized I drew the line is I asked myself the question.
00:08:33.000 If I were in their shoes and I disagreed with myself very strongly, how would I behave?
00:08:40.000 And one thing I would never do is try to ruin somebody's career, write them atrocious messages, you know, send tiki torches, call them names.
00:08:51.000 All those kinds of things were not ways I'd engage.
00:08:53.000 So even if they had some sort of a point, you don't I realized for myself, you don't engage with those kinds of people.
00:09:00.000 If somebody sent me, you know, thoughtful messages that were not trying to, you know, destroy my life and were in good faith, I'm fine.
00:09:10.000 I will I will absolutely listen to that. I think it's important to.
00:09:13.000 But in that case, that's not what was happening. And that completely changed my thinking on it.
00:09:18.000 And actually, what I ended up doing is I also wrote my first op ed at the time. It was for Newsweek.
00:09:23.000 And it was about this particular thing that happened to me.
00:09:27.000 And at the time, people thought that if I published that, I would be completely destroyed.
00:09:33.000 And I thought that, too. I didn't know for sure. Right.
00:09:36.000 And but I thought if I couldn't speak out on this issue that I'm saying that wasn't affecting just me, because in all of this, people were sending me messages of saying, you know,
00:09:47.000 I see what's happening to you. It's not right. And I feel so ashamed because I'm too afraid to speak out.
00:09:54.000 I was getting a lot of those messages. And then on top of it, I was getting messages saying, you know, this has happened to me and this is my story.
00:10:03.000 And, you know, some people lost their livelihood completely. Some people just can't do anything.
00:10:10.000 So, you know, it wasn't just my story uniquely. This was happening all around.
00:10:17.000 I just happened to walk into it myself as well. But because I did, I was suddenly really exposed to all the people have gone through this.
00:10:25.000 And when I wrote the story, you know, which I think it was called the rise of the righteous bullies or something like that.
00:10:32.000 And I that is something that I completely while I was scared, I felt like I had to do it.
00:10:40.000 And when I came out, I actually had a lot more support. And the people who were bullying me stopped.
00:10:47.000 And I think that was sort of because if they continued to do that directly to me, well, that would just reinforce what I just said about them.
00:10:57.000 Well, bullies always pick on people who don't fight back. That's how they operate.
00:11:01.000 I'm curious to tease out the point you made about accountability culture, because you said that there are situations where you kind of agree with it.
00:11:09.000 What kind of situations are you talking about?
00:11:11.000 Well, if somebody has committed a certain crime that hurt people, if someone I think you really have to judge it on such a situational basis.
00:11:22.000 And I think that's what's not happening.
00:11:24.000 We are taking like anybody does anything a little bit wrong and it's treated equally wrong.
00:11:31.000 So there are certainly people who might have, you know, let's say they abuse their employees, I think, or they said some, you know, really atrocious things that I don't want to say it.
00:11:46.000 But there's certainly some things that I think would qualify to having some sort of repercussion, even if that repercussion is like, oh, I won't work with you. Right.
00:11:56.000 I think you're kind of an abhorrent human being. Maybe you're praising terrorists or something like that.
00:12:01.000 But and that is up to each individual to decide because we do have freedom of association as well.
00:12:08.000 And I think that's important to remember. Nobody should be forced to spend time with people they don't like.
00:12:15.000 But I do like to encourage people to spend time with people they might have some disagreements with.
00:12:22.000 Well, there's a difference between those two things.
00:12:24.000 And I think the point you're talking about, I've made this point a lot, which is like what happened to Harvey Weinstein?
00:12:33.000 Well, I fully support that. Me too.
00:12:36.000 What happened to Aziz Ansari was an abomination.
00:12:38.000 And that's the issue, is that we tar everybody with the same brush.
00:12:44.000 And there's no difference in people's heads between someone who's actually like a rapist or a terrorist or whatever.
00:12:51.000 Yeah.
00:12:52.000 And somebody who just happens to be on a similar, you know, the same concept is involved, but like sex is involved, let's say.
00:13:00.000 But nothing else is similar. But yet people get treated very, very similarly.
00:13:04.000 Well, I know a writer I've been talking to recently.
00:13:07.000 He got in trouble and lost his job because, you know, of a comparison that he made between Hitler and Stalin in terms of impact on lives.
00:13:15.000 And, you know, I actually didn't necessarily agree with the way he framed it, but he doesn't deserve to be canceled for it for for something he's just thinking through and maybe stated it not in the most proper way, I suppose.
00:13:33.000 So you hear stories like that all the time.
00:13:36.000 And obviously I've been exposed to hearing a lot of people's stories.
00:13:39.000 And that's the problem that I have.
00:13:41.000 It's not that there is some sort of an accountability culture is that there is this disproportionate way of punishing people and weaponizing it.
00:13:50.000 And that it is applied sometimes to people who don't deserve it at all, like quite often, actually, but also to people who, you know, might have committed a small little sin, you know, and don't need to be punished in a way that takes away their and destroys their whole life.
00:14:09.000 And I think a lot of the times it comes out of this sort of psychological, gleeful, bully mentality and people gang up.
00:14:17.000 Some of the people that I've talked to when one of the people that I talked to when researching for a book that I wrote, he was saying how when when he got canceled, some of the people who were fiercely attacking him didn't even remember it later on.
00:14:34.000 So can you imagine, like, what is the psychology of that, where someone was participating in a mob, essentially, it's an online mob, but it's a mob and doesn't remember.
00:14:46.000 So that happens a lot, too.
00:14:50.000 And I think a lot of times people are not thinking through what that's such a good point.
00:14:55.000 And I guess what you're what you've got your finger on there is online communication makes bullying much easier.
00:15:04.000 Mm hmm. So if you were to go and physically bully people, you you'd struggle to not remember that.
00:15:10.000 But online, it's like, oh, yeah, done.
00:15:12.000 Yeah. And then you move on to, you know, what's in the news, what's for breakfast, etc.
00:15:16.000 It makes it easy. And therefore, you can see why these dynamics that have always existed in human society, as you say, become more prominent, because it's like, well, it's right there, you know, take your phone out, bully somebody, go to work, you know.
00:15:30.000 Well, it's easy. And it's also you don't have to face somebody. Right.
00:15:34.000 And I think that's a colossal difference.
00:15:36.000 I, you know, even I speak about pretty contentious things with people in real life all the time, people with all sorts of perspectives on these things, people who strongly disagree sometimes.
00:15:48.000 And I've never had any incidents that were like this or even really having bad experience exchange, you know, because you see each other's reactions.
00:16:01.000 You're face to face. You understand that this person is maybe thinking something through exploring or, you know, comes in good faith and has has their heart in the right place.
00:16:11.000 But you don't see that online because it takes out your whole humanity and makes you a cartoon character to people.
00:16:19.000 So even when they attack you, they don't really see you as a human being.
00:16:23.000 They see you as some sort of a descriptor of what they've decided to kind of imprint on you.
00:16:30.000 And it's also that the difference in perspective, which Konstantin touched on.
00:16:36.000 But I think sometimes people aren't even aware.
00:16:41.000 Do you see what I mean? You're not even aware of what you're doing because it's you're so disengaged from the process.
00:16:48.000 Like you could even be just making a joke about it.
00:16:51.000 But it appears to the person who's at the receiving end, if you are on the receiving end of literally six million jokes or whatever it is, you're like, this is awful.
00:17:00.000 That is a good point. I've been told not to read comments on appearances I do and I am bad at listening.
00:17:10.000 So I've read a lot and sometimes they're so, so terrible.
00:17:16.000 And but in particular, if it was just one, you shrug it off.
00:17:21.000 But then when there are so many, you use...
00:17:23.000 Quantity has a quality all of its own, as Joseph Stalin has said.
00:17:27.000 It does. It's true. It kind of does. Yeah.
00:17:32.000 And what I'm interested in particular is how this culture has affected the media, because you've been a journalist for a long, long time and a writer.
00:17:41.000 So how have you seen this affect journalism in the media from when you started to now?
00:17:46.000 Yeah. I mean, the biggest changes, I would say, there is this a lot of young people go into journalism with the state of mind that they want to be activists.
00:17:55.000 The people that I worked with who are, you know, old school, they don't have, I mean, the same mindset.
00:18:01.000 I mean, all of us want to make a difference in the world.
00:18:04.000 We choose which stories we report on because we might be more interested in them or maybe want to bring attention to certain issues.
00:18:10.000 But there was a structure that promoted a different quality of journalism.
00:18:18.000 And now it's like it is opinion journalism in many, many publications.
00:18:24.000 But the other thing I see, I see a lot of friction between the old generation, new generation.
00:18:29.000 And you can I know specific cases and some of those big kind of mainstream newsrooms, too, where they are fighting it out.
00:18:38.000 And, you know, the fight, the people who fight to be activists are more vicious.
00:18:44.000 So so so people lose their jobs or become very it becomes very difficult for them to speak.
00:18:53.000 And, you know, the there's a lot of opinions in the world, you know, and and people on the left have a lot of opinions.
00:19:01.000 Like there's a tremendous diversity of opinions on different issues that we're grappling with.
00:19:06.000 And I worked in journalism and I worked in film and I know that there's people who have who are very sort of high level, very successful, who have very strong opinions about things, including stronger opinions than I do.
00:19:23.000 And they're so afraid to say anything.
00:19:27.000 That is the key. And when you're afraid to say anything, then we don't have a world that makes sense because we don't we don't have the back and forth that we really, really need to make sure that we understand the different perspectives, that we understand what's actually going on when people are afraid to speak.
00:19:46.580 That's the big issue for me. And people get more radicalized, whether it's the left or the right, because they're now only talking to people in their little orbit.
00:19:55.340 And so and those people are just confirming their views.
00:19:58.760 So it causes the centers to sort of collapse, because I do believe a lot of people have pretty reasonable points of views, but they're the people who are quiet and the people who are loudest are the people with radical thoughts.
00:20:15.500 And so that's all we end up hearing. And then the world becomes shaped by these radical thinkers.
00:20:22.780 And I would say as well, there is another radicalization vector that's going on as well.
00:20:27.980 And we've seen it. We're recording this a couple of weeks after the Hamas attack on Israel.
00:20:32.820 And you see in the West, for example, the conversation about illegal immigration and mass immigration in general has been very it's been very curtailed by the fact that certain opinions,
00:20:47.520 which are quite widely shared in society about restricting immigration, particularly from certain parts of the world where, you know,
00:20:56.360 you are bringing people in where there's a much higher risk of them being a danger to the society into which they're coming.
00:21:03.080 All of that was kept under a lid. And then suddenly people see in the streets of Western capitals massive.
00:21:11.860 This is even before Israel responded massive, effectively pro-terrorism rallies.
00:21:17.180 Yes. Right. And people are like, whoa, how did this happen?
00:21:20.340 And now they're being radicalized by that experience, as I think everybody is actually in some extent, to some extent, certainly in our space,
00:21:27.960 because the consequences of not being able to talk about those things for two decades are now showing up in your streets.
00:21:36.720 And now people probably are going to overreact.
00:21:40.800 I certainly think that is one of the possible outcomes of this.
00:21:44.260 No, absolutely. I think so, too. And I would say that, I mean, that happens a lot, the overreaction, right?
00:21:50.920 A lot of the sort of counter movements that are not so great do happen.
00:21:55.640 But I don't blame normal people for overreacting in the situation because they're finally aware of the scale of the problem.
00:22:01.840 Yeah.
00:22:02.100 You see what I'm saying?
00:22:02.740 Yeah, I do. But I think that when it comes to immigration in particular, and both you and I are immigrants,
00:22:08.540 to probably have some understanding and some sympathy, I think the problem is because there's a difference talking about something and being politically correct
00:22:19.280 versus, you know, being a complete jerk about it and having, you know, maybe bigoted racist views.
00:22:26.600 Those are not the same things.
00:22:28.180 You want to be able to be honest about the facts of anything.
00:22:32.940 And you have to also understand the different perspectives.
00:22:36.280 Like, you know, you take a country like Japan, for example, which people rarely criticize for their immigration policies.
00:22:43.600 But people love Japan. Why do they love Japan?
00:22:46.540 They love it because the culture is...
00:22:48.680 Because it's Japanese.
00:22:49.280 Yeah, because it's Japanese. Exactly.
00:22:51.600 And Japan has a deeply restrictive immigration policy.
00:22:55.720 And, but it's sort of necessary for the culture not to be destroyed.
00:23:02.060 And, you know, if we're talking about countries like America or Canada, where they're all immigrants, essentially, it's a little bit different.
00:23:11.660 And I think you have to look at different countries in different contexts and the context of how most people want to live there as well.
00:23:19.780 But that's not happening.
00:23:21.500 It's like the same lens is applied everywhere.
00:23:24.180 And you can't be honest about the things that aren't working.
00:23:28.160 Because if we can talk about the things that aren't working, maybe we can...
00:23:31.640 And we can all talk about it together, by the way.
00:23:34.280 That's how we can come up with solutions to these problems.
00:23:37.900 And maybe those solutions are not going to be so extreme.
00:23:40.980 Or maybe, you know, not discriminatory because you'll have a balance of points of views.
00:23:45.440 But people are afraid to talk to each other who disagree.
00:23:47.960 We're not afraid, but they don't want to.
00:23:49.740 Sorry, Francis, I know you want to jump in and you should.
00:23:52.220 But I just want to finish on this point.
00:23:54.240 Is that what's happening?
00:23:55.720 Is it that people are afraid to talk to each other?
00:23:57.960 Because I would argue...
00:24:00.100 I mean, yes, I personally would argue.
00:24:03.080 I'm not even playing devil's advocate.
00:24:05.000 That it's not that people are afraid to talk to each other.
00:24:08.960 It's that the expression of concerns from people of a particular view on one side of the political spectrum has been consistently demonized.
00:24:18.320 So those phrases you used about bigoted, racist, etc., they have been badly misapplied in order to maintain the lies that people on the left want to maintain in the public space.
00:24:29.860 Yeah, I understand what you're saying.
00:24:34.320 I think there is a reaction that is coming from the right that is a direct response to the left, you know, calling them racist and not listening to them and demonizing them.
00:24:45.820 At the same time, I think it's about who has the power at any given moment.
00:24:50.920 So if the...
00:24:52.240 Right now, the left clearly has more at least institutional power.
00:24:56.980 When the right has more institutional power, they'll do the same thing.
00:25:01.140 The reactions, I think, are...
00:25:04.560 I understand them, but they're not really healthy and helpful.
00:25:10.580 So you have people who really vilify each other from both sides.
00:25:15.840 And one is...
00:25:17.520 I think the right is feeling sort of victimized.
00:25:20.980 And victims lash out.
00:25:23.460 And we saw that, again, the left did the same.
00:25:25.880 When they didn't have power, they also lashed out.
00:25:28.700 And interestingly enough, they were much more pro-free speech.
00:25:32.300 You know, I grew up with that being a very inherent quality, right?
00:25:35.840 Like a very important value.
00:25:38.800 And now, because the left has that power, suddenly, well, we want to control it.
00:25:45.240 This speech is okay, but not this speech.
00:25:47.120 And I constantly meet people and they say, yeah, no, I totally support free speech.
00:25:52.480 But, you know, there should be limits.
00:25:54.280 Well, outside of that limit of, you know, let's not incite violence,
00:25:59.560 then you don't actually believe in free speech.
00:26:02.680 You believe in limited speech.
00:26:05.240 But I think what I'm seeing on the right, unfortunately,
00:26:08.740 because when I first, you know, started this journey,
00:26:12.240 my allies were often people on the right
00:26:15.220 because they, you know, were proponents of free speech.
00:26:19.080 They didn't like what was going on with the culture.
00:26:21.280 But as time kind of went on, I feel like people have gotten much more radicalized,
00:26:27.840 including the people who are sort of fighting this anti-woke,
00:26:31.960 or rather the wokeness, right?
00:26:34.420 So the anti-woke warriors, as I would call them,
00:26:37.980 because they start employing the same techniques as the people they've criticized.
00:26:43.180 And that's something that I'm seeing in the culture that's been disturbing me.
00:26:46.580 The other thing that's been disturbing me, actually,
00:26:49.420 is seeing the media reflect these practices.
00:26:52.260 So you read a media article, a newspaper article from a publication
00:26:57.120 you thought was once reputable, and you're going,
00:26:59.920 am I just reading someone's Facebook post?
00:27:02.920 Is that what this is?
00:27:04.340 You probably are.
00:27:05.480 You know, there's only so much time in the day you have to make do,
00:27:08.600 so you have to post it everywhere.
00:27:09.840 Yeah, I mean, there's so many articles where I would catch them,
00:27:14.980 and they would say, you know, it would say news.
00:27:17.620 But it's clearly, in every objective metric, it's an opinion piece.
00:27:23.020 And this is the problem.
00:27:24.180 Like, as a writer, you know that certain publications are captured,
00:27:27.600 and it will never, if, in fact, if I pitch certain stories to them,
00:27:31.280 they're going to, you know, put me on a blacklist.
00:27:34.860 But in, you know, in their heads, I don't think there is.
00:27:37.500 Not yet, so there is not a blacklist.
00:27:40.400 But at the same time, because I've been so much more vocal
00:27:44.060 with my criticism and with my voice,
00:27:46.580 a lot of people reach out to me, often very discreetly and privately,
00:27:51.420 and they all, they see the issues.
00:27:54.960 And some of them are in very top positions, too,
00:27:57.700 at some of these publications, which I find sort of baffling,
00:28:01.820 because while I believe, like, everybody should be able
00:28:06.040 to make their own decisions how they want to, you know,
00:28:08.740 tackle this or not, it's the silence.
00:28:13.980 It's the deafening silence that is actually allowing
00:28:16.480 for a lot of this to thrive, because if these people, like,
00:28:19.340 push back right away, and they have a lot of power.
00:28:22.200 Like, they're not all, you know, people at the bottom
00:28:25.280 who don't have any power or control,
00:28:28.160 but they're all, they really are scared.
00:28:30.040 And that's depressing, because if you look at the media
00:28:35.860 from 20, 30 years ago, or 40 years ago,
00:28:39.880 there was, you can name so many journalists who were firebrands,
00:28:43.780 who were out there, who said outrageous things,
00:28:46.380 but were interesting and had a unique perspective.
00:28:49.980 Is that all gone?
00:28:51.640 Well, and I think it's really important to highlight
00:28:53.840 the importance of being wrong and allowing people to be wrong,
00:28:58.660 because if you're never wrong, you never get corrected.
00:29:03.360 And you continue thinking that wrong thing in your head, too.
00:29:07.760 So I think there is this kind of Puritan culture, almost,
00:29:13.140 where you have to adhere to these very specific things.
00:29:16.340 You can never say anything a little bit off,
00:29:18.260 a little bit wrong.
00:29:19.460 But, you know, if you're hanging out with your friends,
00:29:22.140 and, you know, a friend made a joke,
00:29:24.260 and it's like, dude, that's a little bit much.
00:29:27.000 It's a little over the line, right?
00:29:29.580 Okay, you say that, you move on.
00:29:32.320 And A, it gives them feedback.
00:29:34.900 And B, you understand that your friend is a good person
00:29:39.820 that may have accidentally uttered something
00:29:42.780 that maybe wasn't quite appropriate.
00:29:45.500 We don't destroy the friendship over this.
00:29:48.360 We don't go and destroy that person over this.
00:29:50.940 But somehow it's become okay to destroy people
00:29:53.780 just because they misspoke, people who are strangers,
00:29:59.000 because you don't understand what the whole context of it is,
00:30:02.240 who they are, what they really mean, what they're thinking.
00:30:05.200 I'm sure we all say inappropriate things at times.
00:30:08.040 Sometimes it's purely an accident.
00:30:10.720 It's not malicious a lot of the time,
00:30:13.000 but you're being held to this standard.
00:30:15.920 I had somebody go after me for mispronouncing her name.
00:30:21.800 Really?
00:30:22.060 Yeah, because that was a microaggression.
00:30:26.500 And that ended up being a whole campaign against me
00:30:32.160 because suddenly I'm a racist person
00:30:34.700 because I mispronounced her name.
00:30:36.840 And, you know, the other day I mispronounced a person
00:30:40.860 from my own background's name nine times repeatedly
00:30:44.220 because I'm not very good at pronouncing certain words.
00:30:48.860 And the thing that I want to touch on
00:30:51.460 is that you talked about destroying friendships.
00:30:54.300 We are now destroying friendships.
00:30:56.260 We're destroying relationships.
00:30:58.200 We're destroying working relationships.
00:31:01.300 And the thing that's really sad is we have a loneliness crisis.
00:31:05.000 People are talking about being more lonely than ever.
00:31:08.380 Men in particular are lonelier than ever.
00:31:11.380 There's a horrifying stat.
00:31:12.820 I can't remember exactly what the percentage is
00:31:14.960 of men who have got no friends, zero friends.
00:31:19.560 These are the things that keep you going when times are tough.
00:31:22.860 And we're just throwing it on the metaphorical fire
00:31:25.540 because people say things that we disagree with.
00:31:28.780 And to me, that seems tragic.
00:31:31.560 Yeah, and I think a lot of it is also social media
00:31:33.760 because I did a survey, for example, amongst my followers,
00:31:40.920 but also Elon Musk amplified it.
00:31:43.100 So I got quite a few responses and I thought it was interesting
00:31:46.020 because I had a hunch already about this.
00:31:48.260 But I asked how many people are currently in a relationship
00:31:53.140 or have been dating, you know, in the last year.
00:31:56.280 And then the second category was, you know, people who've been dating between,
00:32:01.040 you know, somebody, have dated somebody within the last year or two.
00:32:04.880 And then the last category was, you know, two plus years.
00:32:08.400 It was about, I think it was like 45%, something like that,
00:32:12.940 haven't gone on a single date in over two years.
00:32:16.440 So you've got a massive incel audience.
00:32:18.080 I don't even think it's my audience, but I, it's something, no, I understand.
00:32:25.560 But it's something that in a lot of, because a lot of discussions
00:32:29.300 that I've sort of said, and it was quite a surprising thing
00:32:32.600 because you're saying this pandemic of loneliness is very true.
00:32:36.080 A lot of times people write to me about their loneliness
00:32:39.400 and they want someone to talk to.
00:32:42.800 And, and it's very, you know, I think we don't really have communities
00:32:46.640 in the same way anymore.
00:32:48.280 And I think that's a major contributing factor to this.
00:32:53.180 But also, you know, COVID didn't help matters, right?
00:32:57.160 Dating apps, people aren't used to meeting each other
00:33:00.420 and just in real life and, hey, let's, let's go out.
00:33:04.980 I don't know how you do it.
00:33:05.740 I've never picked anybody up.
00:33:07.180 So, but, but it's like, that isn't happening.
00:33:11.460 And then on top of it, you know, you'd have these communities,
00:33:14.820 you know, and they'd get, and you think you belong,
00:33:19.700 whether it's women writers or there's some,
00:33:24.460 a lot of controversy for some reason in the knitting circles,
00:33:27.740 which you would think, but, but you, you become part of a group
00:33:33.060 and that becomes your identity.
00:33:34.960 And then your identity is completely shattered when the group turns on you
00:33:40.060 because you somehow strayed from the orthodoxy.
00:33:44.180 And that's the thing that people are really scared of,
00:33:48.400 really scared of.
00:33:50.260 And understandably so, because we're programmed to want to be in the group.
00:33:54.640 So, I guess, can you really blame people for not wanting to speak up?
00:34:00.020 Yes.
00:34:00.680 Yes.
00:34:03.100 Man up, you cowards.
00:34:04.560 Yeah.
00:34:04.760 That's right.
00:34:05.340 Well, look, I had, I had a bit of a different perspective starting out with this
00:34:09.260 and it's shifted.
00:34:10.580 I was a lot more forgiving of people not speaking up
00:34:14.020 because, you know, I understood their fears.
00:34:17.580 I felt it myself.
00:34:18.460 So, I get where they're coming from.
00:34:21.460 But I think there's a point at which you are contributing to this.
00:34:27.400 And also, if you cannot stand up for a friend, for example,
00:34:31.420 or if you can't stand up for something that is absolutely the right thing to do,
00:34:37.560 that isn't, that's how, it's very difficult to live with yourself.
00:34:42.660 That's why people expressed feelings of shame.
00:34:46.060 And I did, I reassured them at the time.
00:34:48.320 I said, please don't feel this way.
00:34:51.220 It's, you know, your support, even like this, means a lot.
00:34:54.500 And I did.
00:34:55.000 It was sincere.
00:34:56.560 But it is contributing.
00:34:58.580 That is how we got here.
00:34:59.960 I think people really do need to find their voices and speak up.
00:35:04.540 Especially, you don't have to do it for everything.
00:35:07.280 You know, there's things where it doesn't matter.
00:35:10.040 Or, you know, it doesn't matter enough to you.
00:35:12.980 But the things that really matter, that you believe really matter,
00:35:17.780 you need to find the courage to do that.
00:35:20.320 And a lot of the people that I've talked to who have sort of found the courage,
00:35:24.760 it's not even courage.
00:35:25.660 It's just the sense of, I can't do it any other way.
00:35:29.600 I can't live with myself if I don't stand up for a friend,
00:35:33.100 or I don't stand up against this thing that is wrong,
00:35:36.280 and how somebody is being treated, if somebody is being persecuted.
00:35:39.400 And you see it in so many areas, too.
00:35:43.320 So, you know, people are so afraid to dissent.
00:35:46.740 Well, whether it's art or it's science,
00:35:50.560 in a lot of scientific communities, that has become very controversial.
00:35:55.760 But science is based, to some extent, on dissenting.
00:36:00.880 And if there is, there's a false sense of consensus,
00:36:05.080 when sometimes there isn't consensus on particular things.
00:36:08.820 You know, the whole science is settled aspect.
00:36:11.500 It's not always settled.
00:36:12.560 But people are under the impression that it is
00:36:16.540 because other people are afraid of what is going to happen to them.
00:36:21.360 At the same time, you do have to recognize, like,
00:36:25.200 you hear all these stories of people who stood up,
00:36:28.180 and they are so successful now, and they're famous.
00:36:31.660 There's so, so, so many stories that are not told of people
00:36:38.220 who've lost everything, have lost their jobs, are not successful.
00:36:42.400 That happens a lot more often.
00:36:45.320 And, you know, when you make a choice,
00:36:47.400 that's why I can't just say, hey, you have to do this,
00:36:50.000 because that person does have to make that choice for themselves,
00:36:52.840 because they could lose everything.
00:36:54.400 Well, right.
00:36:54.900 And I was joking when I was, like,
00:36:56.440 with my man up your cowards to some extent,
00:36:58.360 because you're right, it's not something that's easy for everybody.
00:37:02.800 There is an argument to be had about whether, you know,
00:37:07.300 speaking up for something that really matters,
00:37:09.000 like a friend being destroyed and paying the price for it
00:37:12.460 is a bad thing to do.
00:37:13.980 I mean, sometimes it's okay to pay a price for doing the right thing.
00:37:17.960 If that weren't the case, it would be easier
00:37:20.280 and everybody would do it, right?
00:37:22.300 But I'm curious, because it is a lot to ask of people,
00:37:25.580 whether we can spend a few minutes talking about
00:37:28.420 how you think some of this gets addressed,
00:37:30.240 because in your upcoming book, you talk about this.
00:37:33.260 How do we fix this, Catherine?
00:37:35.420 Yeah, I mean, that's a really great question.
00:37:37.520 And I find that it doesn't get asked very much.
00:37:40.300 I think we have a lot of grievances, but we don't know.
00:37:42.580 Yeah, both sides, and everybody who's not on a side, like us,
00:37:46.280 is very tempted to talk about victimhood,
00:37:49.960 and we don't spend enough time talking about
00:37:52.200 how to do things constructively.
00:37:53.680 Yeah, I agree.
00:37:54.280 Yeah.
00:37:54.580 So let's do that.
00:37:55.660 Let's do that.
00:37:56.560 So, I mean, in my book, I really dive into some of this
00:38:01.340 through the stories themselves
00:38:02.840 of the different people who have experienced cancellations.
00:38:07.620 Not everyone in the book has,
00:38:09.680 but, you know, there's a few things that people can do.
00:38:13.460 What's the book called?
00:38:14.380 It's called No Apologies.
00:38:16.680 Ah, very good.
00:38:17.840 And, you know, it's about this sort of phenomenon
00:38:23.080 where people are silenced, essentially.
00:38:27.160 I didn't even want to call it cancel culture
00:38:29.080 because I think a lot of it is actually just even self-silencing.
00:38:33.200 And when we self-silence, we don't know what is actually going on in the world
00:38:38.760 or what anybody's thinking.
00:38:40.340 Sometimes we're thinking the same things
00:38:41.680 and we think we're not on the same team.
00:38:43.140 So one way, I know for me, the way that I first started using my voice
00:38:48.040 is it's just in small ways.
00:38:50.660 You know, I started having open conversations with people in person.
00:38:54.660 And because it's in person, it's a different experience.
00:38:58.460 That gave, I noticed, other people permissions
00:39:01.620 to discuss these same kinds of things
00:39:04.040 and sometimes even more, sometimes controversial things.
00:39:08.640 My views are, by the way, not the most controversial, I would say.
00:39:11.440 But there's a lot of topics that are hot-button topics even today.
00:39:16.840 And so I noticed that once I started doing it,
00:39:20.500 it had sort of an effect on other people.
00:39:23.520 I mean, the whole reason I wrote this book
00:39:25.480 was because I wanted to encourage the people that I call,
00:39:30.220 you know, the silent majority.
00:39:33.540 I wanted to empower them to use their own voice
00:39:36.440 because the only way to kind of deal with these things
00:39:39.340 is to push back in some way.
00:39:41.440 And I don't like the idea of pushing back in, you know, aggressive ways,
00:39:46.160 vilifying people, just mocking everybody.
00:39:48.740 You can mock ideas.
00:39:49.840 I don't think, I don't believe in mocking people.
00:39:53.700 There's some religions that would disagree with you on that one, Catherine.
00:39:56.820 Oh, yes, I'm sure they will.
00:39:57.900 But, you know, I'll take them on.
00:39:59.940 I'll be disagreeable on that one.
00:40:02.080 But I think you should, humour is the best way to mock bad ideas.
00:40:06.240 If you, the best, most effective thing is if you can make somebody laugh
00:40:10.700 at their own bad idea, that's a superpower.
00:40:13.920 And that would be a fantastic tool in your toolkit.
00:40:17.400 I think supporting people when they're being attacked
00:40:21.680 or standing out for something,
00:40:23.540 that's a really important thing to do as well.
00:40:26.620 And people don't do it enough.
00:40:28.200 Even if it is just sending them a message of support.
00:40:30.900 Because often they think they're alone.
00:40:33.180 So they don't even know people are on their side.
00:40:36.260 So even if you're not willing to, you know, go out publicly,
00:40:40.500 send them a message of support.
00:40:42.220 So I think supporting, rallying behind people, is big.
00:40:46.760 I think that building communities, alternative communities,
00:40:49.960 is a really important thing.
00:40:51.440 And that's something that comes up in the book.
00:40:54.280 There's a lot of people who, after having had these experiences,
00:40:58.440 sort of grew their own communities of people who are good-faith individuals,
00:41:03.720 people who don't just judge people for no reason.
00:41:07.480 And so they were able to build these alternative communities.
00:41:10.940 And even within businesses, you know,
00:41:13.580 that they're able to sort of appeal to an audience
00:41:17.260 that's more willing to engage with them
00:41:21.120 as opposed to appealing to everyone.
00:41:23.660 I think those are kind of, you know, basic things,
00:41:28.560 but at the same time important things to do in order to sort of solve.
00:41:32.760 I mean, the other issues that are going to be much more difficult
00:41:35.680 to overcome are institutions.
00:41:38.420 And when it comes to institutional sort of issues,
00:41:41.520 I think it's important to, again, voice things,
00:41:46.480 whether it's writing letters to politicians,
00:41:49.140 because I know that politicians, when they get a letter,
00:41:51.400 it counts as if it's 100 people,
00:41:53.040 because they know people don't send letters.
00:41:56.580 Letting people know that they're not alone when things go, you know, happen.
00:42:02.460 Really, the solution is to speak.
00:42:05.520 Not only, I would also add to that,
00:42:07.380 I think it's really important that if you see somebody
00:42:10.400 who's writing about these issues
00:42:12.860 in a way that you don't feel able or empowered to do
00:42:16.060 or host a video channel like ours
00:42:19.120 or whatever it might be,
00:42:20.280 people who are resisting this way of doing business,
00:42:24.760 I think if you can't speak up yourself,
00:42:27.520 send that person some money
00:42:28.840 or give them a monthly subscription or whatever.
00:42:30.900 Like if you can't,
00:42:32.080 if you are so afraid
00:42:33.140 because you're going to lose your job for speaking
00:42:35.340 and you can't speak,
00:42:36.720 well, support people who do speak.
00:42:38.480 You know, I know it's a self-serving point from my position,
00:42:41.160 but I actually...
00:42:42.460 Send all your money to Constantine.
00:42:45.220 Absolutely.
00:42:46.300 We'll put the link in the description.
00:42:48.320 But, you know, when I was in comedy
00:42:51.380 and I wasn't yet ready to actually say what I think
00:42:54.680 and host a show like this or anything,
00:42:56.560 I used to do this
00:42:57.400 and I didn't have a lot of money.
00:42:58.440 I used to support YouTube channels
00:43:00.440 that talked about it
00:43:01.540 and send money to people
00:43:02.740 because I thought it was like,
00:43:04.300 well, I can't speak,
00:43:05.120 but this person is.
00:43:06.600 Let's give them support
00:43:08.040 and encourage them to do it.
00:43:09.420 Because from our perspective,
00:43:10.840 I can just tell you,
00:43:11.760 like the more people support us,
00:43:13.660 the more we know we're doing the right thing.
00:43:15.780 Yeah.
00:43:16.220 And the more empowered we are
00:43:17.320 and also the more resources we have.
00:43:18.740 Or the more radical followers you have.
00:43:22.040 Well, you know, it's interesting actually
00:43:23.680 because with our followers,
00:43:25.020 we have monthly calls with like our top supporters.
00:43:27.640 Because other than one exception,
00:43:31.800 they're really not very radical people at all.
00:43:34.220 Actually, they're very grateful
00:43:35.600 that moderate people are doing this.
00:43:39.720 There are, of course,
00:43:41.040 other people who have a more radical position,
00:43:43.360 who are more extreme.
00:43:44.220 I'm sure they get plenty of support as well.
00:43:45.920 But it's sort of,
00:43:47.060 I feel what you're putting out there
00:43:49.580 is what comes back.
00:43:51.040 So if you're radical,
00:43:52.080 you're going to attract radical people.
00:43:53.560 If you're moderate,
00:43:54.460 like the three of us,
00:43:55.400 you're going to attract moderate people
00:43:56.840 who nonetheless feel
00:43:57.660 that these conversations are important.
00:43:59.760 I sometimes will get a few radical followers,
00:44:02.560 which comes out quite quickly.
00:44:04.800 But they'll leave very quickly
00:44:06.200 when they realize you are not radical.
00:44:07.560 But sometimes they self-reflect.
00:44:10.400 So it's kind of an interesting process.
00:44:12.360 It's hard to do it one-on-one in this way,
00:44:15.380 but sometimes they do self-reflect
00:44:17.180 and sort of change a bit
00:44:19.580 because they're now entering a more moderate arena
00:44:22.840 where people's, you know,
00:44:24.720 if you read people's comments,
00:44:25.800 they're quite thoughtful and nuanced.
00:44:28.440 And so it does sort of force them sometimes
00:44:30.640 to go into that as well.
00:44:33.760 You're not going to catch a lot of that,
00:44:35.520 a lot of change,
00:44:36.820 but sometimes it does.
00:44:38.240 I mean, I don't know if you've had
00:44:39.840 Daryl Davis on your show.
00:44:42.460 The guy who converts KKK into,
00:44:45.520 yeah, no, we haven't yet,
00:44:46.980 but I'm sure we will at some point.
00:44:48.360 He's just to me,
00:44:49.580 and he's in the book,
00:44:50.360 but he to me is such a,
00:44:53.200 he's, since I've heard his background,
00:44:55.860 he's been sort of an,
00:44:57.200 I don't want to use the word idol,
00:44:59.000 I don't worship anyone,
00:45:00.500 but definitely somebody
00:45:02.860 who I hold in high regard
00:45:04.780 and inspired by his work
00:45:06.880 because it aligns with my own philosophy.
00:45:09.840 It was like,
00:45:10.760 look, some people are going to,
00:45:12.140 and he says this in the book,
00:45:13.180 he's like,
00:45:13.420 some people are going to go to the grave,
00:45:15.300 you know,
00:45:15.760 believing the racist things that they do,
00:45:18.100 but some people won't.
00:45:19.740 Some people will change,
00:45:20.840 and I remember watching a news segment,
00:45:25.040 and there was a former member
00:45:26.560 of the KKK on there,
00:45:28.580 and, you know,
00:45:29.280 you think, like,
00:45:30.020 that's as vile as it gets,
00:45:31.880 or one of,
00:45:33.420 and he actually was now getting people out,
00:45:38.240 so somebody had made that transformation
00:45:40.680 from being an active member of the KKK,
00:45:43.420 doing illegal things,
00:45:45.360 to being someone who now,
00:45:48.100 battles them.
00:45:50.060 People change,
00:45:51.400 not all people,
00:45:52.680 but sometimes they do,
00:45:54.080 and you can't have that
00:45:56.020 if you don't have some sort of conversation
00:45:58.140 to begin with,
00:45:59.220 give them some entry point,
00:46:01.300 and understand why they are the way they are.
00:46:03.240 Absolutely,
00:46:03.860 and it's not just that as well,
00:46:05.600 it's also redemption,
00:46:07.540 because that's the thing
00:46:09.740 that this stuff lacks,
00:46:11.220 this whole movement,
00:46:12.080 whatever you want to call it,
00:46:12.980 lacks,
00:46:13.720 is a possibility for people
00:46:15.100 to redeem themselves.
00:46:16.060 Well, you know,
00:46:18.080 it's interesting,
00:46:18.960 actually,
00:46:19.680 I kind of tackle this in the book,
00:46:21.580 because I,
00:46:22.720 there is a chapter about
00:46:24.420 false accusations,
00:46:26.640 true accusations,
00:46:27.720 but, you know,
00:46:28.520 this whole idea
00:46:29.440 that everything
00:46:30.220 is arbitrated
00:46:32.480 in the court of public opinion
00:46:34.260 versus
00:46:35.240 actual court,
00:46:37.740 and how do we balance
00:46:39.700 these things.
00:46:40.960 So,
00:46:41.500 the person that I feature
00:46:43.360 in the book
00:46:43.720 is Stephen Elliott,
00:46:45.140 and he is somebody
00:46:46.800 who was accused,
00:46:48.660 he was put on this list of,
00:46:50.960 am I allowed to say
00:46:52.120 shitty?
00:46:52.360 Shitty media men.
00:46:53.380 Yes.
00:46:54.300 Shitty media men.
00:46:56.100 And,
00:46:56.680 you know,
00:46:57.280 there was an anonymous
00:46:58.200 accusation of rape,
00:46:59.820 and in his case,
00:47:00.760 for various reasons,
00:47:01.820 it's not even,
00:47:02.820 you know,
00:47:03.140 it's quite obvious
00:47:04.840 that this was not true,
00:47:06.220 but regardless,
00:47:07.600 right,
00:47:08.520 it's an anonymous accusation.
00:47:11.060 That's a really big factor.
00:47:13.260 I remember when that list came out,
00:47:15.080 I myself sort of grappled
00:47:16.380 with this a little bit,
00:47:17.420 because I sort of understand
00:47:18.860 the idea of wanting
00:47:19.960 to warn other women
00:47:21.860 of,
00:47:23.300 you know,
00:47:23.980 somebody who might be
00:47:25.060 a predator.
00:47:26.060 I get it.
00:47:27.060 Well,
00:47:27.480 in fact,
00:47:28.160 sorry to just interject this,
00:47:29.940 in almost every industry,
00:47:31.760 not that I'm a woman,
00:47:32.600 but I know,
00:47:33.500 in almost every industry,
00:47:35.240 there are,
00:47:35.820 like,
00:47:36.200 WhatsApp groups
00:47:37.040 or other things where,
00:47:38.180 like in comedy,
00:47:39.220 there are women's
00:47:40.200 WhatsApp comedy groups
00:47:41.440 where,
00:47:41.900 that are dedicated
00:47:42.480 to the specific task
00:47:44.760 of warning fellow
00:47:45.980 female comedians
00:47:47.000 about creepy people
00:47:48.320 in the comedy industry.
00:47:49.700 So,
00:47:50.260 like,
00:47:50.900 women helping
00:47:52.420 other women
00:47:53.020 in this way,
00:47:54.160 without necessarily
00:47:55.080 publicly damaging
00:47:56.180 the reputation
00:47:56.980 of people
00:47:57.520 who may or may not
00:47:58.560 have committed
00:47:59.080 various degrees
00:48:00.360 of inappropriate,
00:48:02.160 you know,
00:48:02.460 whatever.
00:48:03.660 They've always existed,
00:48:04.860 I imagine,
00:48:06.140 through word of mouth
00:48:07.440 and things like that.
00:48:08.500 There's a difference,
00:48:09.360 though,
00:48:09.500 when you're doing this
00:48:10.320 in public
00:48:10.760 and people are paying
00:48:11.940 the price
00:48:12.500 for anonymous allegations
00:48:13.700 and so on.
00:48:14.460 Yeah,
00:48:14.660 and that's the thing.
00:48:15.420 I mean,
00:48:15.740 people are going to whisper
00:48:16.580 to each other
00:48:17.220 and say,
00:48:17.760 hey,
00:48:18.160 so-and-so,
00:48:18.780 watch out.
00:48:19.960 But,
00:48:20.600 yeah,
00:48:20.940 this was a very public list,
00:48:23.740 very publicized list as well.
00:48:27.040 And,
00:48:27.380 you know,
00:48:27.680 I'm sure some of the people
00:48:28.580 on that list were guilty,
00:48:29.920 some were innocent.
00:48:30.560 But that sort of poses
00:48:32.980 the question,
00:48:34.080 do we,
00:48:35.140 you know,
00:48:35.580 go after the guilty
00:48:36.720 and the innocent
00:48:38.560 pay the price?
00:48:40.320 And so that was a question
00:48:41.300 I had to ask myself.
00:48:42.560 And actually,
00:48:43.600 in having the conversation
00:48:44.820 with Stephen,
00:48:45.400 I think that really
00:48:46.480 helped shape
00:48:47.780 my own perspective
00:48:48.660 on it
00:48:49.200 because I think
00:48:50.340 I did a little bit
00:48:51.120 of a change
00:48:51.760 in my perspective
00:48:53.320 from how I
00:48:54.640 originally perceived
00:48:56.080 the list
00:48:56.580 and seeing
00:48:57.400 what happened
00:48:58.040 to him.
00:48:59.160 But there were so many,
00:49:00.520 I mean,
00:49:00.800 kind of go into
00:49:01.440 the whole Me Too thing
00:49:02.540 where I think
00:49:04.380 it started
00:49:05.060 with some
00:49:05.840 good intentions.
00:49:07.100 I think it did
00:49:07.680 some good things,
00:49:08.780 you know,
00:49:09.180 in terms of bringing
00:49:09.960 that conversation
00:49:10.860 to the forefront.
00:49:12.080 But at the same time,
00:49:14.180 then you have
00:49:14.900 a lot of people
00:49:15.700 who are either
00:49:16.960 innocent
00:49:17.420 or the crime,
00:49:18.600 again,
00:49:19.020 just like in any
00:49:19.740 of these
00:49:20.240 cancel culture things,
00:49:21.540 like the crime
00:49:22.240 is not so big,
00:49:23.920 but the punishment
00:49:24.700 is the same
00:49:25.700 to everyone
00:49:26.820 no matter
00:49:27.460 what they've done.
00:49:28.520 You know,
00:49:28.760 somebody saying
00:49:29.740 something slightly
00:49:30.500 inappropriate
00:49:31.000 is not the same
00:49:32.060 as somebody
00:49:32.560 raping a woman
00:49:33.620 or assaulting
00:49:34.460 women or men.
00:49:36.360 There's definitely
00:49:37.420 been some men
00:49:38.480 who've also
00:49:39.360 had predators
00:49:41.020 unleashed on them,
00:49:42.640 but it's something
00:49:43.980 that, you know,
00:49:45.080 you have the same
00:49:46.060 punishment
00:49:46.500 and also it's
00:49:47.440 being decided
00:49:48.060 in the court
00:49:48.800 of public opinion
00:49:49.920 and not law,
00:49:51.800 and that is
00:49:53.180 a very dangerous
00:49:53.960 thing.
00:49:54.940 Agreed.
00:49:55.580 And the problem
00:49:56.660 is with these
00:49:57.300 movements
00:49:57.700 is they start
00:49:58.560 off and,
00:49:59.340 you know,
00:49:59.940 let's talk
00:50:00.520 about me too,
00:50:01.380 Harvey Weinstein.
00:50:02.280 Great.
00:50:02.940 Everybody agrees
00:50:03.720 with it.
00:50:04.080 Well, probably
00:50:04.680 apart from Harvey,
00:50:05.500 but, and then,
00:50:06.560 and then we,
00:50:07.860 Even Harvey's
00:50:08.300 probably like,
00:50:08.820 you know what,
00:50:09.340 fair enough.
00:50:11.300 He'll come around.
00:50:12.640 Yeah.
00:50:13.100 But then it goes
00:50:15.000 on and on and on
00:50:15.960 and it goes off.
00:50:16.600 There's some people
00:50:17.120 you go,
00:50:17.440 okay, I can see it.
00:50:19.040 But in order
00:50:19.520 for the movement
00:50:20.140 to keep going,
00:50:21.180 it needs to find
00:50:22.120 new injustices.
00:50:23.340 And eventually,
00:50:24.700 you run out
00:50:25.680 of injustices.
00:50:26.700 So what happens
00:50:27.500 is you get
00:50:28.120 Aziz Ansari
00:50:29.060 who you go,
00:50:30.740 well,
00:50:31.320 he's behaved
00:50:32.380 a bit of a dick
00:50:33.260 if you take
00:50:33.880 what the woman
00:50:34.680 says verbatim.
00:50:37.020 But you can't
00:50:37.960 destroy someone
00:50:38.920 for being a bit
00:50:39.620 of a dick.
00:50:40.200 No, you can't.
00:50:41.460 People are entitled
00:50:42.140 to be dicks
00:50:42.760 if they want to be.
00:50:43.880 You don't have
00:50:44.740 to hang out
00:50:45.280 with him
00:50:45.680 or invite them
00:50:46.660 to your party.
00:50:47.880 Or go to their house.
00:50:49.160 Or go to their house.
00:50:50.580 Correct.
00:50:51.340 But also,
00:50:51.860 what you touched on
00:50:52.680 is that
00:50:53.420 they are always
00:50:55.420 seeking new injustices.
00:50:56.720 And we're seeing
00:50:57.240 this in a lot
00:50:58.000 of organizations
00:51:00.700 in particular.
00:51:02.080 For example,
00:51:03.080 organizations
00:51:03.760 that had
00:51:04.820 to fight
00:51:05.580 for equal rights
00:51:07.180 for gay people
00:51:07.920 or marriage,
00:51:08.880 now they don't
00:51:09.720 really have anything
00:51:10.780 because they've
00:51:11.320 maybe not fully solved it
00:51:13.340 but some pretty much
00:51:14.680 solved it
00:51:15.240 in North America.
00:51:16.980 But instead of
00:51:17.440 sort of moving on
00:51:18.140 to maybe countries
00:51:19.060 around the world
00:51:20.280 that have
00:51:21.540 similar issues
00:51:22.800 that we had,
00:51:24.820 they are
00:51:26.240 finding other ways
00:51:28.200 because they have
00:51:28.620 all this money
00:51:29.340 and they have
00:51:30.080 an infrastructure
00:51:30.800 and jobs
00:51:31.600 and the monster
00:51:32.460 has to sort of
00:51:33.160 keep feeding itself.
00:51:34.920 And I think
00:51:35.540 a lot of these
00:51:36.300 cultural issues
00:51:37.720 arise from that
00:51:39.020 as well
00:51:39.440 because you have
00:51:40.380 all these organizations
00:51:41.200 with nothing to do
00:51:42.320 so they find new things.
00:51:44.000 Every great cause
00:51:44.720 begins as a movement,
00:51:45.840 becomes a business
00:51:46.540 and eventually
00:51:47.220 degenerates into a racket.
00:51:48.760 Eric Hoffer.
00:51:50.000 Catherine,
00:51:50.300 we're going to go
00:51:50.840 to locals
00:51:51.460 in a second
00:51:52.220 for all the people
00:51:53.660 who do support us
00:51:55.000 where we ask
00:51:56.280 their questions to you.
00:51:57.720 Before we do though,
00:51:58.620 as you know,
00:51:59.100 we always end
00:51:59.600 with the same question
00:52:00.400 which is
00:52:01.060 what's the one thing
00:52:02.200 that we're not talking
00:52:03.100 about that we really
00:52:03.740 should be?
00:52:05.040 That is a great question.
00:52:06.700 I know.
00:52:07.720 And I know
00:52:08.360 you always ask it.
00:52:09.980 And how long
00:52:10.820 does it take
00:52:11.240 for your guests
00:52:11.820 to solve that one?
00:52:14.580 They usually
00:52:15.220 all answer straight away.
00:52:16.300 Yeah.
00:52:16.580 So no pressure.
00:52:17.360 Oh god darn it.
00:52:18.060 Gosh darn it.
00:52:19.380 Okay.
00:52:20.300 I think that it is,
00:52:22.220 I think people
00:52:22.900 aren't talking,
00:52:23.980 I think there's a tendency
00:52:24.900 to go into ideology.
00:52:27.280 This ideology is bad,
00:52:28.720 this ideology is good
00:52:29.980 and the way
00:52:30.880 that I wish people
00:52:31.760 would talk about it more
00:52:32.860 is about how
00:52:34.360 it's about behavior.
00:52:37.400 It's about human behavior
00:52:38.860 and we need to
00:52:41.580 stop looking at things
00:52:42.740 from, you know,
00:52:43.900 these polarized sides
00:52:45.420 and if there's
00:52:46.780 some good ideas,
00:52:47.780 it's a good idea
00:52:48.720 because it helps people.
00:52:51.000 It doesn't hurt
00:52:51.920 too many people,
00:52:53.060 you know,
00:52:53.880 and then
00:52:55.180 we can't use
00:52:56.800 the tactics
00:52:57.440 that we criticize
00:52:58.260 thinking that
00:52:59.240 we're going to win the war
00:53:01.400 and then
00:53:02.300 we'll make everything right
00:53:04.020 and then we'll adhere
00:53:04.960 to principles.
00:53:06.940 And the other thing
00:53:08.500 that I think
00:53:09.020 people aren't talking
00:53:09.820 enough about
00:53:10.500 is AI
00:53:11.220 and how it's
00:53:13.600 going to completely
00:53:14.740 reshape
00:53:15.300 our sense of the world
00:53:16.660 because of all
00:53:17.400 the disinformation
00:53:18.280 that's going to spread.
00:53:19.940 So we're not,
00:53:21.040 our sense of reality
00:53:22.180 is already fragmented
00:53:23.840 and now it's going
00:53:24.740 to get even more
00:53:25.700 because we won't know
00:53:26.740 what to believe
00:53:27.360 and what not.
00:53:28.460 Happy news.
00:53:29.380 Yeah.
00:53:29.960 All right.
00:53:30.640 Catherine Brodsky,
00:53:31.280 thank you so much
00:53:31.880 for coming.
00:53:32.440 Head on over to Locals
00:53:33.560 where we continue
00:53:34.320 the conversation
00:53:35.020 with your questions.
00:53:36.600 Why is the silent majority
00:53:38.600 always associated
00:53:40.000 with right-wing views
00:53:41.140 and degraded
00:53:41.980 as being somehow
00:53:43.000 non-educated
00:53:44.760 and populist?