TRIGGERnometry - May 19, 2019


Jess Butcher on Women in Tech, Social Media and a Positive Vision of Men and Women


Episode Stats

Length

1 hour and 2 minutes

Words per Minute

179.2

Word Count

11,191

Sentence Count

336

Misogynist Sentences

44

Hate Speech Sentences

23
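The stats above can be derived directly from the transcript text. A minimal sketch (the whitespace tokenization and punctuation-based sentence splitting here are assumptions — the site's actual counting rules aren't documented, and classifier-flagged counts come from the models listed below):

```python
import re

def episode_stats(transcript: str, minutes: float) -> dict:
    """Compute word count, sentence count, and words per minute
    from plain transcript text (timestamps already stripped)."""
    words = transcript.split()  # naive whitespace tokenization
    # naive sentence split on terminal punctuation, dropping empty pieces
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    return {
        "word_count": len(words),
        "sentence_count": len(sentences),
        "words_per_minute": len(words) / minutes,
    }
```

With the episode's full text and its running time in minutes, this reproduces figures of the same shape as the table above.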


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
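A sketch of how a summary might be produced with the model named above. The chunking helper is plain Python; the guarded block uses the standard Hugging Face `transformers` pipeline API. The chunk size, generation lengths, and the `transcript.txt` filename are assumptions, not the site's actual settings:

```python
from typing import List

def chunk_words(text: str, max_words: int = 700) -> List[str]:
    """Split a long transcript into word-bounded chunks small enough
    for a BART-sized context window (~1024 tokens)."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

if __name__ == "__main__":
    from transformers import pipeline  # downloads model weights; run deliberately
    summarizer = pipeline(
        "summarization",
        model="gmurro/bart-large-finetuned-filtered-spotify-podcast-summ",
    )
    transcript = open("transcript.txt").read()  # hypothetical input file
    parts = [summarizer(c, max_length=128, min_length=32)[0]["summary_text"]
             for c in chunk_words(transcript)]
    print(" ".join(parts))
```

Summarizing chunk-by-chunk and concatenating is one common way to get around the model's input limit; the exact strategy used here isn't documented.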

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
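A sketch of how the per-sentence counts above might be produced with the two models just named. The counting helper accepts any classifier callable; the guarded block shows standard `transformers` text-classification wiring. The positive label strings and the `sentences.txt` filename are assumptions about those models, not documented facts:

```python
from typing import Callable, Dict, List

def count_flagged(sentences: List[str],
                  classify: Callable[[str], Dict[str, object]],
                  positive_label: str) -> int:
    """Count sentences whose top predicted label equals positive_label."""
    return sum(1 for s in sentences if classify(s)["label"] == positive_label)

if __name__ == "__main__":
    from transformers import pipeline  # downloads model weights
    misogyny = pipeline("text-classification",
                        model="MilaNLProc/bert-base-uncased-ear-misogyny")
    hate = pipeline("text-classification",
                    model="facebook/roberta-hate-speech-dynabench-r4-target")
    sentences = open("sentences.txt").read().splitlines()  # hypothetical input
    print(count_flagged(sentences, lambda s: misogyny(s)[0], "misogynist"))
    print(count_flagged(sentences, lambda s: hate(s)[0], "hate"))
```

Keeping the counting logic separate from the model calls makes it easy to swap in either classifier, or a stub for testing.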
00:00:00.000 Hello and welcome to TRIGGERnometry. I'm Francis Foster. And I'm Konstantin Kisin. And this is
00:00:10.020 a show for you if you're bored with people arguing on the internet over subjects they
00:00:14.340 know nothing about. At TRIGGERnometry, we don't pretend to be the experts, we ask the experts.
00:00:19.860 Our fantastic guest this week is a serial entrepreneur and the co-founder of Tick,
00:00:25.260 Jess Butcher. Welcome to TRIGGERnometry. Thank you for having me. Before we get into the interview,
00:00:28.560 I just noticed Francis has replaced the word of with the word with,
00:00:31.840 which you've been getting a lot of shit on YouTube for.
00:00:33.820 I know, I have.
00:00:34.440 And social peer pressure is finally...
00:00:37.480 You've crumbled me.
00:00:38.620 Yeah, I know, I have crumbled.
00:00:40.140 You've sold out.
00:00:40.880 I know it's not grammatically correct,
00:00:42.340 but it's a colloquialism of where I come from.
00:00:44.440 But due to you vicious bullying bastards,
00:00:47.240 you've impinged on me, and it's made me sad.
00:00:50.780 You've made him improve his grammar.
00:00:52.300 Sorry, Jess, that was just totally random.
00:00:54.740 Yeah.
00:00:55.340 Very warm welcome to TRIGGERnometry.
00:00:57.000 We obviously know exactly who you are but, for our viewers, tell us a little bit about yourself.
00:01:01.400 What's your journey through life? How are you where you are?
00:01:03.760 How have you ended up sitting in the chair in which you currently sit?
00:01:06.780 Oh, the short version. I'll give you the short version.
00:01:10.220 I'm from the Midlands. I've worked very hard at losing the accent.
00:01:15.060 I am the product of a very happy middle class upbringing, two siblings, very supportive parents.
00:01:24.840 And I've been in London since graduation.
00:01:28.340 I had the scrappiest, messiest CV ever where I kind of jumped jobs every two years because I'd get bored or I couldn't be managed.
00:01:38.960 But there was a theme to all of it, which was one of just really enjoying working on the front end,
00:01:46.740 as in the sales and the marketing of disruptive new businesses, businesses that were trying to change behaviors.
00:01:52.560 And I guess what I always was and which I now am, to cut to the end of the story, is I always wanted to be an entrepreneur.
00:01:58.660 I always wanted to build businesses. I always wanted to be my own boss, I guess.
00:02:02.440 But it took me a long time, I guess, to get the confidence and, of course, the golden bullet of an idea to do that.
00:02:08.000 So I kept riding the coattails of all these other entrepreneurs in my 20s in a very scrappy way and learning from some of the best and also some of the worst about how to build businesses.
00:02:20.000 And then I met my business partners at my first company, which was Blippar, which I started in 2011.
00:02:29.260 And that was, for some years, one of the big tech success stories that came out of the UK.
00:02:34.660 We were the global pioneer in the augmented reality space.
00:02:39.500 My job, again, sales, marketing, communications, building brands and media interests, helping with investment.
00:02:46.480 And it was a thrilling ride.
00:02:48.340 You know, we went global. We were talking on stages all over the world and really, you know, disrupting something.
00:02:56.900 And I loved the story of that. And it was certainly the most on and alive I've ever felt in my life, building that business.
00:03:04.580 And that's just given me the hook. And I'm now, I was for a couple of years, I moved into a sort of advisory consulting role
00:03:14.560 where I was working with a lot of other startups,
00:03:16.400 helping to scale them, give them advice,
00:03:18.460 did some investing consultancy.
00:03:21.120 And then a year ago, I got my head turned
00:03:22.840 and now I am back at the bottom of the pile
00:03:25.320 building another tech business for my sins.
00:03:28.040 A really, really exciting, very, very ambitious
00:03:30.820 micro-video platform business
00:03:33.200 for user-generated how-to content.
00:03:37.080 And yeah, I'm back on the rollercoaster again.
00:03:40.100 It's very exciting.
00:03:40.880 So you're a woman in tech.
00:03:42.220 I am.
00:03:42.560 So you're definitely oppressed, as we know.
00:03:45.280 Profusely oppressed.
00:03:46.260 I'm a woman in tech and I'm a female entrepreneur.
00:03:48.740 So I am double intersectionally oppressed with those two labels.
00:03:56.920 And yeah, and I guess because I, the journey between Blippar and Tick involved a couple of years where I stepped out of the front seat of that business because I had three children in four years.
00:04:06.880 Wow.
00:04:07.300 Yeah, a bit foolish with hindsight.
Obviously hugely rewarding, but it just meant that I was kind of less inclined to do the 100-hour
00:04:15.400 work weeks, my life on a plane, and I needed to reclaim some balance and, frankly, sanity to
00:04:20.000 my day-to-day. So I had sort of a year and a half, two years at home, and it was during that time that
00:04:27.500 I really started to think about in more depth and explore this apparent oppression, or not oppression,
00:04:35.320 this sort of discrimination, within these two fields that I was so closely connected with
00:04:39.500 and which I was being invited to talk about on platforms all the time.
00:04:44.040 And there were a number of sort of catalysts to my thinking at that time.
00:04:48.340 The first was my mother, actually.
00:04:51.640 I took her to a speaking event that I was at where I was being interviewed
00:04:55.060 about how hard it was to be a woman in tech.
00:04:57.680 And I was very good at this line.
00:04:59.840 I had all the stories and things.
00:05:02.420 And it was only afterwards that she sat me down over a glass of wine.
00:05:05.100 And she said, tell me, has it been hard?
00:05:08.420 How has it been hard?
00:05:09.620 I went, you know, I really genuinely just want to understand how it's been hard for you.
00:05:14.100 And I kept telling these stories.
00:05:16.300 And she said, that's not your story.
00:05:17.260 That's someone else's story.
00:05:18.660 That's not your story.
00:05:19.800 That's someone else's story.
00:05:21.320 Have you found it hard?
00:05:22.000 Because what I've seen is that you're getting more opportunities to speak on platforms than any of your male business partners.
00:05:27.900 You keep getting these award nominations that none of your male business partners get.
00:05:31.660 You get a lot of PR and column inches.
00:05:33.660 um, have you found it hard? And the way she grilled me on this, I kind of thought,
00:05:40.260 I am using other people's stories and I am, like, saying this sort of stock line. And in actual fact,
00:05:48.360 it has... it hasn't been hard. I've actually loved being a minority within my two spheres, because
00:05:55.820 it does enable me to stand out, it enables me to get recognition. And then I also started to think
00:06:01.380 about how I got all those opportunities, and I realized that so much of it had come from the
00:06:07.840 love and support of the men in my life. You know, those men who really believed in me. It started
00:06:13.760 with my incredible father, who just brought me up thinking I could be and do anything the boys could
00:06:19.440 be. There was no concept of "you're going to find it harder than your brother", none of that. You know,
00:06:25.960 whatever you want to be, you can be. You know, how can I help you? You know, what should we talk
00:06:30.700 about today? What's interesting you at the moment? And then business bosses that I had, mentors that
00:06:36.900 I had, people that I looked up to, they always supported me and pushed me forward and gave me
00:06:42.340 opportunities. And I'm now fortunate enough to be married to a wonderful man who does that, who
00:06:46.700 really enjoys my success, is a bore at dinner parties and always wants to tell people what
00:06:51.880 I've been doing. You know, "my wife's got an MBA", you know. Stop, how embarrassing is that?
00:06:56.660 And then I started to think, well, actually, some of the people that I've had
00:07:01.720 the biggest challenges with in my professional life have been women, and I'm not allowed to say that,
00:07:06.820 you know. But ultimately I don't subscribe to this identity-box definition.
00:07:15.500 I'm not a "female entrepreneur", and I started to resent that. You know, I'm just an entrepreneur,
00:07:21.680 and I want recognition for building great businesses. I don't want people to buy my product
00:07:23.480 because I'm a woman. I don't want any special favors because I'm a woman. But I always felt this slight sense of
00:07:30.000 conflict because obviously I do have a product to sell so if there's positive
00:07:34.060 discrimination to be taken advantage of then absolutely I will do that and I
00:07:40.620 also feel a conflict because it's not like I'm saying the whole narrative is
00:07:44.460 is wrong I do believe that women respond very well to role models and I
00:07:50.980 I understand a lot of the initiatives behind these awards and the publicity and the media
00:07:57.720 narrative around getting more women into the public eye and getting them to share their
00:08:01.500 story because younger women, including myself when I was younger, are hugely affected by
00:08:08.200 seeing people like them succeeding and that is very empowering as far as being able to
00:08:14.080 identify with that and see yourself in that position.
00:08:17.580 And of course, we come in all different shapes and sizes. It's not the sort of Deborah Meaden-esque dragon in business. It's, you know, sort of scrappy skateboarding girls and pink-haired girls and every color, every age, introverts, extroverts.
00:08:34.700 you know, it's really important. I think women do get influenced by that. So a lot of what's
00:08:39.700 happening within, I guess, the diversity industry in my two fields, I support and I'm actively
00:08:47.040 involved in. But I stop or draw the line when it comes to the overemphasis around discrimination
00:08:54.000 and disadvantage. Because I think when you start tipping into negativity and definition of
00:09:01.800 ourselves around our gender, I have less time for that, because I ultimately don't think it does
00:09:08.020 girls and women any favors. And do you think there is a problem, uh, with the tech, uh, industry and
00:09:15.520 women in particular, and attracting women? Do you think it's male-dominated, and, you know, and
00:09:21.860 therefore it's harder for a woman to make her way? I know you've got your story, but somebody else's
00:09:26.340 story can be different, for example. Yeah. Um, oh, there's no question it's male-dominated, there's
00:09:31.320 no question about that. To the second part, is it harder for a woman to make their way? No, I don't
00:09:36.440 believe it is. I mean, if they're going to feel uncomfortable in a majority male environment,
00:09:41.920 then they might feel less inclined to it, but I don't believe it's harder. I think that there's
00:09:46.940 been an overcorrection there, and there's a lot of positive discrimination for women coming into the
00:09:53.400 technology space, which I don't agree with either, because I'm, perhaps naively, a meritocrat and I
00:09:58.420 believe it should always be best person for the job. But it's a vibrant, buzzy, exciting sector, and
00:10:05.680 very few women that I know that come into it, you know, have day-to-day anxiety within that sector.
00:10:12.300 And also, tech isn't really an industry. I also get a bit cross with that as a definition. Everything's
00:10:18.760 tech. There's no business in the world right now that isn't tech in some way. And there's
00:10:24.780 so many creative media communications,
00:10:29.320 product opportunities that aren't pure coding.
00:10:32.460 But we seem to obsess on that under-representation
00:10:35.900 within the technology job specifically,
00:10:38.840 and the fact that that's apparently having an impact
00:10:41.460 on how products are designed, and they don't have women
00:10:43.620 in mind.
00:10:44.280 And that certainly has been the case,
00:10:47.380 and some high-profile examples of that.
00:10:49.780 So I'm glad to see some of that being corrected.
00:10:52.120 But I think, I feel, it's a fantastic industry for women,
00:10:55.580 and most of the women, all the women I see,
00:10:58.000 are doing tremendously well within it.
00:10:59.460 And I would love to see more women in it,
00:11:00.880 but I'm afraid I believe in the differences,
00:11:04.580 the average differences between the predilection of men
00:11:08.780 and women.
00:11:10.000 And it's borne out time and time again in the data
00:11:12.480 as far as their aptitudes and interests
00:11:15.860 in certain types of fields of science, for example,
00:11:19.360 more people-oriented, softer social interests than things. You know, there is evidence for that and I
00:11:28.960 find it compelling. Before we get into that, I just wanted to go back for a moment. Can I just stop? I
00:11:34.080 just want to make it clear to everyone here that I don't see gender. Carry on. You don't see gender?
00:11:37.840 No, I don't. Call me Bob. Yeah. Yeah, it's interesting that your girlfriend's female. How did that happen?
00:11:44.240 Well, I was attracted to her aura.
00:11:47.300 Anyway, carry on.
00:11:51.140 Okay.
00:11:52.860 Well, there you are.
00:11:54.120 I was going to ask you if we go back a little bit
00:11:56.240 before we get into the details of the kind of men and women
00:11:59.300 and all the rest of it.
00:12:00.460 I was interested that you've had a journey of kind of
00:12:03.300 almost a transition from a particular way of talking
00:12:06.200 about this issue to a totally different one.
00:12:08.980 What do you think it was that made you talk about
00:12:12.300 uh you know discrimination and all that stuff from that perspective in the first place
00:12:18.100 Um, that is really interesting, and it was a journey. Um, it was actually quite accidental.
00:12:24.900 It was something I was feeling, um, sort of disillusioned with for a long time, but also
00:12:30.640 very, very nervous about putting my head above the parapet on, because I was conscious that I'd be
00:12:35.620 swimming against the stream that, frankly, I work within. You know, all these people, um, are great
00:12:42.220 friends of mine. Sorry to interrupt. I actually meant when you were talking about how there is
00:12:46.260 discrimination, what was it that pushed you to do that in the first place before you changed?
00:12:51.720 Because that was what everyone was saying. It's really hard to be within these sectors. I mean,
00:12:56.480 that's what the, that would almost be the title of the session that I'd be in. So it was inviting
00:13:02.640 us to all relay those moments when we'd been the only woman in a room and, you know, the client
00:13:08.860 has been talking over us, that we've been talked over, or, you know, someone had directed comments to junior
00:13:14.000 male colleagues, um, you know, next to us. So I guess, yeah, I guess I heard so many of those stories
00:13:21.800 that I um I felt well I must have some to contribute and you know and there is a problem
00:13:28.500 here and there has been a problem there I should say you know I don't believe that that problem
00:13:32.580 totally no longer exists I'm very far from that and you know I think it's important that they're
00:13:38.040 But it's when we get to a stage that now we're looking for it everywhere, and that's kind of where
00:13:45.920 we are now and that troubles me. I'll give a really cool example actually because this week
00:13:50.480 on Twitter I saw about 12 of my friends resharing this tweet where a woman had
00:13:57.740 applied for a credit card and she put her title as doctor and then her gender is female and this
00:14:07.460 alert had come up to say, you know, we don't recognize that title or that gender. And of
00:14:12.500 course, she was horrified. And within 24 hours, 14,500 likes and shares. And I just
00:14:22.400 thought, seriously, I work in tech. I know that those two fields are not in any way connected
00:14:27.100 on the back end and that no computer is going to see those. And sure enough, you know, I
00:14:31.920 followed this thread down and some guy had replicated exactly the same user experience
00:14:36.700 bug that had happened on the tech platform. And yet the HuffPo had picked it up. They
00:14:43.380 all wanted to talk about this sort of misogyny in this product. And this credit card company
00:14:50.240 is basically getting demonized for what is a stupid UX technical bug. But it was the
00:14:55.920 mob mentality. There was always this glee that, oh, look, we found another example of
00:15:00.220 it. And I kind of just thought, how is this doing women any favors? And it's not fair
00:15:06.040 to the company. I feel sorry for them. They need to fire a UX guy, or look into the back end a bit.
00:15:12.100 But that's not what this was. It had nothing to do with that. And that, to me, is so symptomatic of
00:15:18.740 where we've got to with the narrative that looks for it everywhere. You know, Peppa Pig's dad fixes
00:15:24.200 the computer and not the mum: this is a gender stereotype. Leave aside the fact that Peppa Pig's
00:15:29.300 dad's a fucking idiot, and the men's rights activists hate Peppa Pig.
00:15:34.420 Do they?
00:15:34.940 Oh, yeah, because they think that he's just portrayed
00:15:39.940 as such a fool in everything that he does,
00:15:42.800 which he is.
00:15:43.300 My husband hates it.
00:15:44.520 This is what our children are growing up to.
00:15:46.760 But the fact that he fixed the computer and his wife didn't
00:15:50.300 is evidence of this sort of subconscious indoctrination
00:15:54.100 that we're doing of children about gender roles.
00:15:56.100 So as I said, be a mum of three children under five and watch enough Peppa Pig
00:16:00.320 and you'll know that you've actually got that entirely the wrong way around.
00:16:03.120 I would never have predicted that we would have gone to Peppa Pig.
00:16:06.440 We were talking about it in the context of everything.
00:16:09.320 There's a scandal now about how there's some kind of children's book
00:16:13.520 that talks about firemen as opposed to fire people.
00:16:17.220 Fireman Sam is the other big cause du jour on how we're indoctrinating children.
00:16:22.000 And I'm like, seriously, give children some credit, you know. It's...
00:16:25.880 They just... I just don't believe they see it. Um, you know, is it insidious and is it pervasive?
00:16:32.440 I don't think it is anymore. I'm glad that somebody at one stage said, actually, you know, there's a
00:16:36.980 little too much of this going on but I don't find girl power a particularly compelling message from
00:16:44.220 my daughter either. I, you know, I find that all a bit... I went into Paperchase and there's this whole
00:16:51.400 row of sort of glittery, sparkly, girl-power, you know, we-can-do-it type of...
00:16:56.360 Defeat the patriarchy.
00:16:57.760 Yeah.
00:16:58.320 Smash.
00:16:58.920 Smash.
00:16:59.240 And I say, no, work with them.
00:17:01.340 You're not a girl first.
00:17:02.640 You're you, you're Kitty.
00:17:05.080 You're your own unique individual.
00:17:08.100 And you have more in common with people that share your values, your outlook, your upbringing,
00:17:13.340 arguably, than you do with someone just because they happen to share the same chromosomes.
00:17:18.300 And that, to me, is what's worrying.
00:17:20.680 we distill people down and then we exacerbate the difference and that's of course all identity
00:17:24.940 politics not just the gender debate. Do you think it's a problem as well with these types of
00:17:29.660 messages where somebody would go oh I'm not going to go into tech because somebody like me isn't
00:17:34.200 welcome, or there is a problem with it? That's where my angle is entirely. So I did a TEDx on
00:17:41.100 this, and this is what I was alluding to earlier, the fear I felt for putting myself out there on
00:17:46.280 this line. My argument is that, firstly, I'm an optimist. I do believe that women have many, many
00:17:53.280 more opportunities than the narrative would have you believe. And secondly, I feel that this
00:17:59.200 victimhood narrative is really damaging for how women perceive themselves. So I think it's only
00:18:06.380 going to exacerbate the confidence gap that is already existent in more women than men as a trait.
00:18:13.000 You know, as an average trait, women are much more subject to imposter syndrome, the confidence gap,
00:18:18.440 than men are. And I was a woman with all the opportunities in the world, a fantastic education,
00:18:23.400 a great upbringing, you know. I still had that imposter syndrome. I shouldn't have had that. So
00:18:28.120 if I'm having it, and I see it amongst so many of the women that I mentor, that I work with,
00:18:32.300 you know, there is this sort of... we say what we're good at, we don't oversell ourselves. You know, we
00:18:39.120 ask for what we want, we don't ask for more than that. You know, we apply for jobs where we can
00:18:43.400 do 80 percent of the job spec, whereas a guy is more likely to do it at 30 to 40 percent. There is
00:18:49.060 this just sort of more innate risk aversion in women, and I feel that the narrative
00:18:55.020 encourages that further. It does two things. One, it will make them more anxious about certain fields
00:19:00.800 where they're told they're going to be discriminated against. You know, they're not going to be walking
00:19:04.560 into those quite as bullish and as confidently as I think they would otherwise. And what was the
00:19:11.680 second thing? Oh, the second thing is that this narrative lets them off the hook. And that's not
00:19:17.880 good for them. You know, in my field, when I meet with young female entrepreneurs, the number of
00:19:23.900 times I've heard people say, oh, I didn't get that funding because, you know, they don't invest in
00:19:29.960 women, or, you know, that it was a gender reason why I wasn't taken as seriously in that
00:19:35.680 scenario. I've now got, I guess, the experience, the age, the gravitas, mentoring them, to actually
00:19:41.040 say to them, no, actually, I was in your pitch, it wasn't good. You know, don't let yourself off the
00:19:47.980 hook. You know, you need to take feedback as an individual and you need to constantly self-improve
00:19:55.060 and look at what you can do. And even, arguably, even if it was due to
00:19:59.980 discrimination, the most productive reaction to that is not wounded
00:20:05.500 insecurity, you know, go cry to someone about how you might have been, you know,
00:20:09.280 gender discriminated against, but it's to actually go, well, come on then, I'll show
00:20:14.740 you, and take the onus to circumvent the situation in some way. You know, resilience.
00:20:21.460 It should be about resilience and I feel that the narrative of discrimination and victimhood undermines both that confidence and also that resilience and also the individual onus to take ownership of how you put yourself forward and to mould yourself, change yourself to the circumstances as required, which everybody needs to as an individual, man or woman, that's not gendered.
00:20:46.600 The point you make is really important, I feel,
00:20:49.480 because a lot of the time, you know,
00:20:52.480 there seems to be this narrative that women are more oppressed than ever.
00:20:55.780 But I used to be a teacher. I was a teacher for 10 years.
00:20:58.580 Girls are outperforming boys on every level in academia.
00:21:03.120 Every level.
00:21:04.200 And then when they go to university, they're outstripping boys.
00:21:07.860 I mean, boys seem to be the ones struggling,
00:21:09.600 especially when it comes to the school level.
00:21:11.760 Yeah.
00:21:11.940 And that's another area of the debate that's incredibly fraught with nastiness and unpleasantness, because obviously in reaction to a lot of the feminist line, you've now got men's rights trying to draw attention to where those disparities exist.
00:21:30.460 But the data does speak for itself, and I find it pretty compelling, whether it's outperformance at university, whether it's the fact that women in their 20s and 30s are earning just as much, sometimes more, than men.
00:21:43.080 And it's really striking when, in my field, again, when you look at STEM data and, you know, the ratios, they're apparently so out of whack.
00:21:52.280 But they are only out of whack if you take a very narrow definition of STEM, which is engineering and computer science and, you know, that type of science.
00:22:02.760 If you take in social science, behavioral science, medical science, anything of a biological nature, women are 75% within those fields of science.
00:22:13.420 And yet nobody's saying we need to level up both sides.
00:22:17.940 You know, we're only saying there aren't enough women in computers.
00:22:20.120 I agree. I would love to see more women in computing and engineering.
00:22:24.160 I think they have so much to contribute and different balance of the types of products that we're building.
00:22:31.700 but in order to achieve that we need to talk them out of other fields. They're not going to just create these
00:22:36.420 women, you know, out of nowhere. We need to talk them out of the other choices that they're making.
00:22:40.880 Um, and, you know, I would certainly personally err on the side of choice. You know, I don't believe
00:22:46.500 women should be talked out of things that they want to go into. And certainly the sort
00:22:50.800 of women that are picking those behavioral or biological sciences, as opposed to the technical
00:22:56.140 hard sciences, are pretty bright women, you know, if they're picking those sciences. So I trust them
00:23:01.840 to make those judgments. Um, and I don't either want to be talking women out of careers in arts and, um,
00:23:08.100 you know, social work and media and, you know, the creative fields. And, hell no, not out of
00:23:14.060 education. But nobody's saying fewer women in teaching, please, let's have more men in teaching.
00:23:18.800 That's because men are toxic. But all the conversations are being had in silos in the
00:23:24.800 underrepresented fields, you know, whether that's in finance or it's in tech. And that, to me,
00:23:30.720 is... you can't have them in isolation. And the boys' arguments and the men's underrepresentation
00:23:37.640 is never discussed. And indeed, you know, the different lifestyle disadvantages that men
00:23:43.120 as a group experience, you can't even bring up. Like what? There's so many, there's so many.
00:23:50.840 And there's suicide rates, there's homelessness rates, rates of depression, rates of school
00:23:57.400 dropouts, workplace deaths, deaths in service, of course, on the front line.
00:24:03.720 You've got sentencing court disparity, men receiving 60% longer sentences for the same
00:24:09.800 crimes as women, child custody, paternity cases, domestic abuse, which actually a heavy
00:24:17.420 percentage of men, you know, suffer from, and we never hear about that. But I don't know
00:24:23.980 that it's helpful to fight fire with fire. Yeah. I think sometimes the context is required to help
00:24:30.240 broaden and open up the Overton window a little bit and I think that's where some of the more
00:24:35.760 moderate voices like myself are trying to just say well hang on let's look at the full picture
00:24:41.520 before we dive deep into one particular area but then you get into a stage where there's this sort
00:24:47.400 of competitive victimhood, which I think is where the men's rights movement falls down.
00:24:53.480 There's a lot of very smart people trying to make these arguments, and I guess just
00:24:56.680 as the feminists or the progressive feminists would say, we've got to overcorrect in order
00:25:01.540 to get back to a normal, I suppose the men's rights feel likewise, you know, overcorrect
00:25:05.440 and overstress these things in order to at least get some awareness for them, particularly
00:25:09.500 in this sort of clickbait, you know, headline-grabbing media world and online world that we're now
00:25:17.040 living in. But it's just, it's victimhood by another name. That's my biggest worry with it. That's my
00:25:22.920 concern. That's my biggest worry with it because it encourages people to go to the other extreme
00:25:27.840 as well. And the men's rights movement is never going to be a particular success, because people
00:25:31.880 don't tend to feel sorry for men in the same way that they feel sorry for women. It's just a fact
00:25:36.700 of life, you know. I feel sorry for men, that we don't feel sorry for them.
00:25:42.160 But that's, again, you talk about realities. It's a biological reality, in my opinion. Because you
00:25:46.940 look at, uh, you talk about male deaths in the workplace and all the rest of it. Men evolved,
00:25:52.140 we evolved to be disposable in a way that women are not. If you think about two tribes of 10 people
00:25:56.920 each, right, let's say five men, five women in each tribe. The tribe that sent its men off to war,
00:26:01.480 only one man comes back, you can replenish your tribe at the same rate. If you send your women
00:26:05.900 off to war, it's not the same. So men are disposable biologically in a way that women are not.
00:26:10.840 And I think that until we have no war, until no one needs to be a firefighter or whatever, that will not change because that's how men evolved to be and women evolved to be different.
00:26:23.200 You talk about risk-taking, et cetera.
00:26:24.880 We've had evolutionary psychologists on the show talk about where that comes from, the risk-taking, risk-aversion, differences between men and women.
00:26:33.840 It's evolution, right?
00:26:35.220 And this is why it's so important to take it in the round, as you say, because you can't just focus
00:26:39.900 on one thing and go, well, men are more likely to die at work. Yes, it's true, it is true, and we don't
00:26:45.420 feel as sorry for men as we do for women. But there's advantages to being a man, there are
00:26:49.740 disadvantages to being a man, there's advantages to being a woman, there are disadvantages to being
00:26:53.980 a woman. And what we have to do, I think, as a society, is get rid of discrimination, but remember
00:27:01.360 that we're not the same.
00:27:03.020 And one of the great things about women
00:27:05.020 is women are different to men.
00:27:06.780 So I reckon if you had more women in banking,
00:27:09.120 for example, in 2006, 7, 8,
00:27:11.840 with really powerful voices,
00:27:13.800 I don't know if we would have had a financial crash
00:27:15.800 in quite the same way
00:27:16.820 because it wouldn't just have been the super hyper risk
00:27:19.800 hungry people making those decisions.
00:27:22.700 If you had some women who are more risk averse
00:27:25.080 with a strong voice in that boardroom,
00:27:27.780 you may not have had the same problems,
00:27:29.560 which is why I really think it's important
00:27:31.060 that we do have people of both sexes represented.
00:27:35.040 Sure.
00:27:35.520 But that has to come through, as you say,
00:27:37.140 through opportunity, equality of opportunity,
00:27:39.760 not by forcing people into those positions, right?
00:27:42.880 That sounded like a party political broadcast
00:27:44.660 of the Women's Equality Party.
00:27:46.580 Yeah, but I think it's an important point to make.
00:27:49.060 It is an important point to make,
00:27:50.200 and that leads me very nicely on.
00:27:52.180 Jess, do you believe in quotas?
00:27:54.420 No.
00:27:55.420 Surprise.
00:27:56.160 Do not believe in quotas at all.
00:27:58.280 I don't think any self-respecting minority box tick wants to feel like they're there because they're a box tick.
00:28:09.320 I think some people will tell themselves that this is required in order to correct an imbalance.
00:28:19.100 But I think it does nobody any favours.
00:28:21.640 It might correct that imbalance in the short term.
00:28:23.440 I think it does nobody any favours in the medium to long term.
00:28:25.960 So no, I'm not a fan.
00:28:27.400 I mean there's a counter argument to that where there is unconscious bias in that you're not aware that you have a bias but you know you only hire people that you know look or sound like you or have the same background as you hence why everybody at the BBC comes from the same college at Oxford or Cambridge or whatever else.
00:28:43.660 Do you believe that it doesn't sort of counterbalance that and force people to look outside the box as it were?
00:28:50.420 Well, yeah, I mean, this concept of unconscious bias is now accepted.
00:28:57.680 And yet I understand it has been entirely debunked scientifically.
00:29:02.240 It's just nobody wants to report that.
00:29:05.160 I've read articles that have actually done the research and say it's been debunked.
00:29:08.760 However, I do believe there's a little bit of people like me hiring that goes on.
00:29:13.500 But I don't believe that that is around identity boxes.
00:29:18.300 I believe it's around upbringing, values, you know, social behaviours.
00:29:25.920 You know, if they went to schools that you know, you've automatically got that point of reference with that individual.
00:29:33.700 And you might know some of the same people.
00:29:35.220 You know, London's a small place in many respects.
00:29:38.140 So I do think it happens a bit, although, you know, I have read that that unconscious bias is massively overstated.
00:29:45.320 And I want to give people the benefit of the doubt that there is more meritocracy and judgment employed in those things.
00:29:53.180 But it's the lack of diversity on ideology, people type, and background and viewpoints, socioeconomic diversity that I think is the biggest risk.
00:30:06.160 And the diversity that I think we should be focusing on as opposed to the visible diversity boxes.
00:30:13.340 It should be much more about getting more psychological diversity.
00:30:18.360 Because at the moment, the identity box ticking only serves to help the people of colour or the women
00:30:31.460 who've gone to the same universities, who have had the fortunate, well-supported upbringings with parents that believed in them.
00:30:38.960 And I've heard the argument that, well, that's where you start and then you create the role models to bring more people through the system.
00:30:46.420 But at the moment, I see the recipients of a lot of the diversity industry are those people that were doing pretty well already as a result of how fortunate they were with education or with attitude.
00:31:01.020 Some people are just born more intelligent, born more driven.
00:31:05.460 and so how do we get access to those types of people so you know the people who grow up who
00:31:13.080 want to achieve but don't always come from the most affluent backgrounds don't always have the
00:31:16.940 wealth of opportunity because i one thing that really angers me is the whole internship culture
00:31:23.200 that i see happening now and that it didn't happen when i graduated university but after 2008
00:31:28.660 i think it gave companies a little bit of carte blanche to start going i'll come and work for
00:31:34.280 free and then we'll see if you get a job or not okay i think from what i've seen and i'm not an
00:31:40.400 expert at this and you've just asked possibly the biggest societal challenge question of all
00:31:45.580 how do you bridge socio-economic uh background i don't have the answer i don't have the answer to
00:31:50.940 that how do you get you know a better melting pot of people from all those different backgrounds
00:31:55.300 i do know that there's the tide has turned on that free internship thing now no serious business
00:32:00.720 offers a free internship anymore because they have recognized that that is not the way it's
00:32:05.780 not fair and it only rewards those that can afford to do it as opposed to those that can't and you
00:32:10.260 know I'm very encouraged by the people I know within the business community that absolutely
00:32:16.440 accept that and go out of their way to engage with schemes that are about access for those that
00:32:22.680 aren't fortunate enough to have lots of parents you know within the school community that can
00:32:27.480 offer those those free work experiences there's some brilliant initiatives whether it's the
00:32:30.940 founders for school initiatives or the work finder app or access aspiration i think it's called
00:32:35.280 that are deliberately around creating networks of mentors and schemes and opportunities for those
00:32:42.520 that that don't have them within their existing community and school networks i'm a big a big
00:32:48.380 big fan of those and do you think we should obligate companies to i mean maybe look at
00:32:54.000 if you're making a certain amount of money in profit
00:32:57.400 that you should be maybe obligated
00:32:59.880 to bring people in from different socioeconomic backgrounds?
00:33:04.140 Or do you think it should be more free market?
00:33:09.100 I err on free market on most issues in life,
00:33:14.980 but I don't see the harm in telling,
00:33:19.840 strongly saying you need to get involved in your communities
00:33:23.360 in better ways but it doesn't have to be companies coming into business it can be can be companies
00:33:26.560 going into schools yeah and that's the big wave that i'm witnessing and i'm a big part of now
00:33:31.940 actually um on the advisory board of this fantastic initiative called founders for schools
00:33:36.400 which is now too narrowly defined it's not just founders it's all business people now going in
00:33:40.920 and it can be the 21 year old graduate um you know uh scheme um guy or it can be a 60 year old ceo
00:33:50.480 that goes in because everybody's got some mentoring that they can offer to those further
00:33:54.800 down the ladder than them whether we need to mandate it I'd say probably not but I think if you're a
00:34:01.600 responsible business and you do it it makes you more appealing to people that then want to apply
00:34:06.320 to you you know it's actually self it's self-promotional in some ways to do more of this
00:34:11.380 it's getting your brand into these schools for the talent to be aware of as they as they get there
00:34:16.000 as they graduate and this is a question that I've always wanted to ask because I'm a former teacher
00:34:19.820 you're someone who works in business you're a leader what do you think about this generation
00:34:24.540 of young people because it seems to me that a lot of people in the media say oh you know these
00:34:29.760 the education system it doesn't produce you know good workers you know the next generation they're
00:34:34.420 awful they're dreadful where do you stand on this new generation do you think they're ready for work
00:34:38.540 when they come out or do you think which generation are we talking about millennials are obviously already
00:34:43.320 in the workplace? Are we talking about Generation Y, Generation Z, iGen? I'm an optimist. I think
00:34:52.100 there's going to be so much talent coming through. But my biggest concerns and another debate that
00:34:58.920 I've tapped into after I started going down the rabbit hole of feminism was, and also now as a
00:35:04.360 parent, is how we're bringing up children. And my concern, and obviously I'm very tuned in also to
00:35:12.360 the sort of snowflake culture conversation and this culture of safetyism that is in universities
00:35:20.580 now. So the generation at university now is the one that was kind of brought up with smart devices
00:35:25.540 and social media and below. And I've really started going into depth now on what's happening
00:35:32.600 to teenagers, how are devices and social media and screen time affecting their upbringing and
00:35:39.780 um you know their their sense of self and I feel sorry for them in many ways you know I think they
00:35:46.680 think that this is the most fun era as far as technology is concerned ever but it's also created
00:35:51.180 this culture of um sort of always on um mental health anxiety desire for dopamine and um you
00:36:01.680 know the kids now kind of have to have a personal brand from the age of sort of 11 12 13 um and
00:36:09.760 And there's this constant focus on self that really worries me.
00:36:16.860 And I think what we're doing, and there's some fantastic research done by a woman called
00:36:21.520 Jean Twenge in the States, which I stumbled across through a fantastic book, which has
00:36:26.220 been a big influence on me, Jonathan Haidt and Greg Lukianoff's The Coddling of the American
00:36:31.240 Mind, that talks about how an 18-year-old today is now more like a 15-year-old 10 years
00:36:41.580 ago because they've been kept so safe, not allowed to go out, not allowed to have Saturday
00:36:46.660 jobs, not allowed to walk to school before the age of 12, 13. There's this sort of terror
00:36:52.300 that there's so many evils in the world now that we really want to protect our children
00:36:56.920 and we're no longer preparing our children for the road but the road for the child
00:37:01.060 and that this coddling has dramatically impacted how resilient children are
00:37:07.320 and made them nervous of bogeymen around every corner
00:37:11.700 and really had a big impact on the maturity levels when they reach university
00:37:16.720 which is why a lot of them want protecting from bad words, nasty conversations, ideas
00:37:22.820 that they don't want to confront by the time they get there.
00:37:27.100 And I think that's a very, very real phenomenon.
00:37:29.560 Some people say it's overstated, but I think it is a real phenomenon.
00:37:33.180 And I think that technology has been a big factor in that.
00:37:36.600 Although I also think it's down to parenting styles
00:37:38.920 and the things within the educational system that no one has to win anymore.
00:37:44.120 We've all got to be, we've all just got to take part as teams.
00:37:47.580 And, you know, there's so much going on around how we're bringing up children and educating children.
00:37:54.760 And I think when you've thrown this massive missile of a 24-7 weapon of a smartphone for teenage girls to exact passive aggression on each other in the middle of the night by leaving them out of photographs or groups.
00:38:09.040 You know, there's a very, very toxic, massively, rapidly changing environment that kids are being brought up in now.
00:38:18.500 And how much responsibility do you think the tech companies have to these types of users?
00:38:23.080 Because there's a huge debate going on now.
00:38:25.440 Does Facebook have responsibility?
00:38:27.200 Does Instagram, well, Facebook and Instagram, do Instagram have responsibility?
00:38:30.920 Is it the parents' responsibility?
00:38:33.880 I do think big tech has responsibility.
00:38:36.480 This is my industry and I'm tuned into all these debates about what they could and should
00:38:43.000 be doing.
00:38:44.000 The first thing I'd say is I think some of these are inadvertent byproducts and a lot
00:38:49.520 of people will say that they're mirrors on behaviours, they're just made more accessible
00:38:53.420 for the behaviours that teenagers engage in anyway.
00:38:58.940 And I think they have to look at how they moderate content, what they allow in the behaviours
00:39:04.960 But the fundamental problem with all of big tech is their business models, particularly social media business models, which are determined by and reliant on dopamine: eyeballs and time spent online.
00:39:18.320 So ultimately, the longer you're online and it's dopamine that fuels the amount of time you spend online, the more money they make.
00:39:25.600 Now, that is a problem. And I don't see how they can correct that overnight because, you know, we need children to get offline.
00:39:34.220 it's been scientifically proven that poor mental health and online time are directly
00:39:41.720 correlated more time offline you know better mental health um and i think you know
00:39:47.820 the social media neuroses um the the neuroses that's caused by social media is a massive problem
00:39:54.620 so i think tech is the problem i think parents need to watch this i think it's caught them
00:39:59.860 unawares you know because a lot of it's been happening in the bedrooms and you know we're
00:40:03.080 all playing catch up on how quickly these behavioural trends have happened. But I also
00:40:07.920 think tech is the solution. So there's my entrepreneurial optimism for you. I think
00:40:15.800 that different business models, different communications around what's happening, and
00:40:21.600 I also think that there's a new narrative and appreciation of the dangers of tech that's
00:40:29.060 only just starting now to get serious the last two years we've had this sort of whimsical oh god
00:40:35.160 i'm addicted to my phone isn't it awful i sleep with it by my bed and you know there's too much
00:40:39.200 time and then looking at your screen time reports and thinking god what else could i have done with
00:40:43.620 that four and a half hours today that i is that just me um probably more for us yeah absolutely
00:40:49.840 but the tide has moved from that sort of whimsical sort of incredulity about our tech use to actually
00:40:56.060 now thinking, I resent this. There's something really not quite right about this. The fact
00:41:00.960 that people aren't looking at each other on the street. The fact that we're constantly
00:41:06.740 being distracted whilst we're in the middle of something and we're happy to be distracted
00:41:09.600 about it. The fact that you can walk around the Louvre and 80% of people have got their
00:41:14.300 back to the art, taking selfies of the art and not looking at it. There's something profoundly
00:41:19.220 depressing that I think people are now realising about the negative by-products of these sorts
00:41:24.840 of behaviors. And I think that this will result in a lot changing. And I believe consumers
00:41:32.200 vote with their eyeballs. We've already invited all our parents onto Facebook and have left
00:41:37.460 them there. It will change the behaviors and platforms will go as quickly as they've come.
00:41:46.900 And I believe that new tech will come up, like my business, which empowers altruism,
00:41:54.140 which empowers knowledge and passion sharing and inspiration
00:41:56.840 in order to drive people offline.
00:41:59.800 So that's my business, basically.
00:42:01.340 I'm trying to take that Instagram stories format
00:42:04.120 and repurpose it into people creating content
00:42:07.680 that isn't about them and their face
00:42:09.160 and whether they're looking gorgeous today,
00:42:11.360 but it's about what they love, what they're good at,
00:42:14.360 what they've learned to do,
00:42:15.520 and empowering other people to do it
00:42:17.000 in less than one minute so that they can get offline
00:42:18.840 and actually do it,
00:42:19.760 and putting that in an open web bank that can be searched,
00:42:25.280 that can be found almost like Wikipedia
00:42:26.720 and can be built upon by other people who've got other ideas
00:42:29.500 and things that they've learned.
00:42:31.740 It's a really powerful thing that you're doing
00:42:33.700 because I can't remember, there was a piece of research done
00:42:36.500 that actually you feel best about yourself,
00:42:38.800 not when you achieve something,
00:42:40.080 but actually when you help another person.
00:42:42.540 Absolutely.
00:42:43.180 The dopamine hit is much more profound than a...
00:42:47.080 Yeah.
00:42:47.260 You know, if somebody actually says to you, wow, you know, I love that cake that you taught me about.
00:42:55.240 That's not a great example, but, you know.
00:42:57.020 Cake is incredible.
00:42:57.880 Cake can be incredible.
00:42:59.840 But say it's like learning how to fix something or learning how to do something offline or a new yoga pose or crafting, you know.
00:43:08.120 And they say, oh, I did that.
00:43:09.340 And then I changed the color scheme and I added this to it.
00:43:11.580 And look what I built.
00:43:12.280 And if somebody sends you that image or the video, what they've created off the back of something that you taught them how to do, you're just going to have this sense of, oh, you know, that child got that toy that their parent made for them or they had that party and they had that really fun game because I told them what that game was and they found it out because of me.
00:43:30.000 So I'd love, I don't know how naive it is to think that we're going to shift the tide immediately because that people, somebody likes me, somebody loves me.
00:43:39.960 Dopamine hit is the most powerful motivator on earth.
00:43:44.220 You know, everybody, it's got status in it.
00:43:47.720 It's got validation.
00:43:50.580 It's got friendship.
00:43:51.860 It's, you know, and that's what all those business models are powering.
00:43:56.340 And I believe we can power it in a way that delivers something that is more fulfilling in the long term.
00:44:04.700 And Tick, my business, is trying to work out how we get those hooks firing.
00:44:12.560 Helping those less fortunate than yourself is very important.
00:44:14.900 That's why I like working with Francis.
00:44:17.620 I'm kidding, man.
00:44:19.260 But in terms of you, obviously someone who's very optimistic.
00:44:22.420 and just to take, we've got about 10, 15 minutes left,
00:44:27.460 to take us back to the whole conversation
00:44:29.260 about men, women, all of that.
00:44:31.620 How do we move to a more constructive way
00:44:34.240 of talking about these things?
00:44:35.440 Because, you know, we had that horrible shooting
00:44:37.880 in New Zealand recently.
00:44:39.460 It will be a couple of weeks by the time
00:44:41.280 this episode goes out probably.
00:44:43.460 And you have the leader of the Women's Equality Party
00:44:45.680 in this country, can't remember her name,
00:44:48.360 coming out and saying this was all,
00:44:49.840 this is just toxic masculinity.
00:44:51.640 That's the cause of this stuff.
00:44:53.460 And it seems like we're using everything now to make these very narrow political points about men and women and all this stuff.
00:45:00.760 So how do we move forward positively from here?
00:45:07.200 You've never been asked that question before.
00:45:09.840 I have.
00:45:11.000 There's a very good TED talk that I did where I talk about the positive solutions as I see it.
00:45:21.640 I think we need to broaden the context of the whole discussion.
00:45:28.680 And that's, I guess, another one of my biggest concerns.
00:45:32.120 There's the undermining of confidence in women and the victimhood mentality,
00:45:35.980 but there's also the fact that we're seemingly not allowed to have so many of these discussions.
00:45:39.120 And we need to stop shutting people down for broadening them or asking things in different directions.
00:45:45.940 We need to allow men to be in the debate more, not just on one side.
00:45:48.980 they're allowed in the debate as long as they're singing from this song sheet um that you know
00:45:53.740 all men bad all women good um and that's just so binary and it's it's unfair and it's not accurate
00:46:01.400 and there's nothing positive about that narrative and that's that's what concerns me you know we
00:46:08.360 shouldn't be teaching girls about powerful girls should be teaching boys about powerful girls you
00:46:13.380 know and and and powerful people generally and and appreciating that you know we're all individuals
00:46:19.840 first and foremost and beautiful crazy you know different unusual in a distinct in our own rights
00:46:27.880 and I guess that's what I want to see because what's happening is it takes oxygen away from
00:46:34.020 areas where I believe prejudice and discrimination does still exist you know whether that's
00:46:40.380 socioeconomic or whether if you if you just want to keep it on feminism you know everything that's
00:46:44.880 happening with women and hijabs in the middle east you know that's real desperate feminism in
00:46:52.580 action that isn't being supported and has got nothing to do with you know air conditioning
00:47:00.200 and offices or whatever yeah or manspreading and mansplaining and you know there's so much
00:47:10.040 oxygen being lost on the little things when there's so many big issues to come up with and I think
00:47:16.400 having a more healthy and open debate around it one that isn't gender pay gap equals discrimination
00:47:21.600 but actually oh interestingly gender pay gap also can reflect choice and positive decisions that
00:47:27.820 women make to reclaim more balance in their lives and you know we won't go off on that I know you've
00:47:33.480 done quite a bit on gender pay gap previously but well you you are a good example of that you talk
00:47:37.640 about yourself having three children or four years and how that changed your attitude to
00:47:41.740 to your working life and everything else right so tell us more about that well I would like to see
00:47:46.220 more women like me publicly saying that because the number of people that have come up to me
00:47:50.960 after I've put myself on the line a little bit about this and said oh thank you you know I
00:47:55.440 I totally agree with you you know I've chosen to spend time you know these years at home
00:48:02.060 you know I've chosen to I don't want to make partner anymore why would I want to do that
00:48:06.540 It's a masochistic, unpleasant life of long hours, no balance, politics and stress.
00:48:12.140 Life is too short.
00:48:14.040 And, you know, this is what I choose.
00:48:16.740 And the gender pay gap between me and my husband is now 90 percent, 100 percent.
00:48:21.460 And men, of course, often say, thank you for saying this because I'm not allowed to.
00:48:27.120 Why aren't you allowed to?
00:48:28.100 You've got a wife you love dearly.
00:48:29.300 You've got daughters that you care about.
00:48:31.120 You know, you do not not have a vested interest in this debate.
00:48:35.320 But that won't save you now.
00:48:36.320 that's the thing is like both francis and i partner like i talked to my wife about this and
00:48:40.080 she's she's probably more against this stuff than i am in some ways she's probably less
00:48:45.340 understanding of all this victimhood than i am because i'm like well but this happens but this
00:48:50.300 happens and she has a very similar attitude she's like well you just you know of course there's
00:48:54.160 discrimination but you power through it or you you deal with it uh and she can say that
00:48:59.840 but if a man was to say that and and we know this for example in comedy francis and i were
00:49:05.980 talking just before you got here uh you know it's perfectly normal now to go into a comedy club and
00:49:13.040 see a woman talking about how men are trash on stage that's comedy now i know you know and and
00:49:20.920 you just kind of go well of course you know not all men are perfect and there are problems with
00:49:26.340 masculinity but that level of vitriol that you have now is is gone to a level which i find very
00:49:33.780 difficult to understand that people sit there and kind of take it you know what i mean yeah
00:49:37.820 and what what what concerns me the most is the lack of kindness in the tone of the debate you
00:49:45.620 know there's a there's a huge amount of anger thrown around and sort of nasty piss-taking and
00:49:54.340 um shutting down and that to me doesn't further any debate or discussion and it means there isn't
00:50:03.060 room for all these different voices and if people are scared to raise their head on something as
00:50:08.040 fundamental as you know whether they have the right to feel you know optimistic about their
00:50:14.900 opportunities in life or not then there's something seriously wrong there and I don't
00:50:20.140 and I have a perfectly honest I have found it actually really hard I tried to do a micro version
00:50:25.620 on my TED Talk at a media advertising conference a few months back.
00:50:31.280 And I chose, perhaps unwisely, to then open that up to a workshop afterwards.
00:50:37.300 And there's a reason why I've done it as a TED Talk and not as a debate.
00:50:44.160 So I'm actually not that strong a debater.
00:50:45.940 I kind of can get a bit flustered and kind of forget the angles.
00:50:49.540 You know, I hate something like Newsnight or, you know, one of those sorts of shows.
00:50:54.300 but I chose to open it up
00:50:56.320 and people were so nasty
00:50:58.920 and they made it very, very ad hominem about me
00:51:01.660 it was, well, you're alright
00:51:03.780 you haven't experienced this
00:51:04.960 you haven't experienced that
00:51:06.220 and I took it all
00:51:07.760 and I ended up totally on the back foot
00:51:10.600 and said, oh, well, you know
00:51:11.720 maybe, maybe you're right
00:51:13.780 and then I went away and thought about it
00:51:15.460 and I said, no
00:51:16.040 I know enough
00:51:18.980 I've spoken to enough people
00:51:20.000 I've lived in this world
00:51:21.080 and even if I'm only allowed
00:51:23.320 if you only allow me to talk about the fields that i know there's something very wrong within
00:51:29.300 those fields um or within some of the narrative around it and i just don't understand why
00:51:35.100 i have to you know why it has to be so unpleasantly um shut down let's have a conversation
00:51:42.000 the vitriol is incredible and it happens particularly when people from supposedly
00:51:46.520 oppressed groups speak out that's when it seems to be extra vicious doesn't it like we've had
00:51:51.640 people on the show from people who are black or or female or whatever talking about you know there
00:51:57.640 isn't that much racism or isn't this particular problem and those people get some of the most
00:52:02.220 vicious abuse from the from the people who think they're on their side you know but that just means
00:52:07.860 that we're never going to have any um introverts go into public life yeah in any way and introverts
00:52:13.720 are often the most thoughtful people on this earth you know there's a reason why they've
00:52:18.820 weighed these things up, they'd thought about it. I mean, James Damore, sorry. That's the
00:52:24.080 one. I can literally, if I say James Damore, I can literally throw a bomb into a room full
00:52:27.680 of tech women. I thought that whole debate was a debate that needed to be had. No matter
00:52:34.840 where you fell on any side of what he was saying, this is a nerdy, introverted guy who
00:52:40.020 actually got away and done a lot of research and distilled it and was asking some interesting
00:52:43.220 questions. Now, you can disagree with the conclusions he came to, but you shouldn't.
00:52:48.140 I personally don't know. But you shouldn't disagree with his ability to be able to have
00:52:55.860 that conversation. And that, to me, was symptomatic of so much of the level of debate that
00:53:02.440 why are we not allowed to have that conversation? You're not feeding one side or the other.
00:53:06.180 It's this whole disinfectant of sunlight on an issue. And obviously, back to big tech
00:53:12.440 again there's this huge debate around how you know the sort of uh political um sort of social
00:53:18.840 ideologies of these um these tech companies are now arbitrating in a way that um is fundamentally
00:53:26.880 affecting that debate as well you know with certain viewpoints totally you know shut down
00:53:31.660 and and banned and i'm sure you know more about that than i do but it's happening on an everyday
00:53:35.940 level in everyday conversations and that's what concerns me that is not a popular view you know
00:53:42.320 the 14,500 retweets of this poor tech bloke demonstrate that there's a right-on way of being
00:53:50.940 able to talk about these issues and what amuses me is they all think that they're like some great
00:53:56.700 um you know activists around these issues i'm like this isn't activism this is the line
00:54:01.900 this is the line that is acceptable my line's not acceptable on some of these issues and i can get
00:54:08.480 crucified for and have done very very personally so i don't tend to do this in open forums anymore
00:54:15.600 well this is only going on the internet oh yeah you're gonna be fine but but there is there is a
00:54:21.100 giveaway in that attitude of having to shut people down which is you're by shutting someone down
00:54:26.940 instead of engaging with the argument and defeating their argument what you're really saying is i know
00:54:32.580 they're right you that's what you're really saying because if you if they were wrong you would just
00:54:39.140 prove them wrong but if you have to shut someone down that means their arguments actually are
00:54:45.280 accurate because otherwise why would you have to shut them down well i don't i'm not sure i agree
00:54:51.020 with that i don't know that the point is you're not why would you not want to disprove that
00:54:56.260 argument but that's my point wrong well because you're scared of it in some way perhaps i don't
00:55:01.960 know. I just don't know. I just think always more polite, interesting, intellectual, non-personal
00:55:09.440 way of discussing these issues is the way to go. And I'm not quite sure how we kind
00:55:16.540 of reverse a lot of these tone of the debate because it's exacerbated so much by the social
00:55:23.600 media tools and the way in which mainstream media has got to go to the clickbait extreme
00:55:29.760 of any argument and it means that there's nothing is allowed to be grey
00:55:33.420 anymore and that's that's a real worry because everything's grey yeah
00:55:37.320 everything nothing is black and white that's such a British point of view
00:55:40.380 everything is depressing and it's really pertinent because you hear you know and
00:55:49.020 you see these titles you know smashes so-and-so you know you know crushes
00:55:53.940 whatever and you think well all they're really talking about is ideas and isn't
00:55:58.000 discussion at its best, when you say something
00:56:00.400 to me and I go, actually I've never thought
00:56:02.340 about it like that, you know mate, she's got a point
00:56:04.320 I'm going to go away, I'm going to have a think, I'm going to change
00:56:06.360 what I think and I'm going to evolve
00:56:08.260 as a person and a human being, but we don't
00:56:10.280 do that anymore because it's about ego
00:56:12.160 now, it's I have to beat you
00:56:14.000 and ultimately nobody really learns
00:56:16.420 anything. And it's also, everything
00:56:18.420 is a soundbite, I've
00:56:20.380 probably said for example, three or
00:56:22.360 four things within this
00:56:24.240 chat that might not have been
00:56:26.440 that well considered and I'll go and I'll watch it again and I'll think those will be the clips
00:56:30.540 we use but those will also be the quotes that will get thrown at me you know the next time I'm
you know somebody would have done their research and when I'm fabulously Mark Zuckerberg-esque
with my new platform Tick check it out it's free in the App Store um you know that they'll
find something you know ill-advised that I will have said today in some context in all of
00:56:56.100 this sort of lots of controversy and you can hear it in the way i talk i'm i'm working all this stuff
00:57:00.720 out for myself and i think this is such a cathartic way for us to learn from each other and
00:57:05.580 throw these ideas around and yet there are so few forums like this and the one that you put together
00:57:12.160 with trigonometry that really enable us to do that and that's that's a that's just a shame
00:57:16.560 well that that is what we're trying to do you know the final thing either the cause of the alt-right
00:57:20.760 yeah apparently that's what thanks mate someone's gonna take that literally i know it sounds better
in my voice as well it sounds more authentic doesn't it you don't sound i always think alt-right as
americans are you don't sound alt-right you just sound racist yeah um the final thing i
00:57:36.500 i always feel like it's important when we talk about this men and women stuff that we also
00:57:40.140 address the fact that there are some men who've gone off the deep end as well as you say as a
00:57:44.980 result of this toxicity on both sides so you have all these incels and these people running around
00:57:49.840 going well women are always complaining all this stuff and it's like no that doesn't help either
00:57:54.860 you know men and women need to come together the two groups of people historically that have always
00:58:00.060 needed to cooperate more than any other groups is men and women and it you know you look at any
successful person they probably have very healthy relationships with people of the other sex because
00:58:10.620 that's how you live life by dealing with other people you know and if you if you if you have
00:58:15.080 those unhealthy attitudes you're going to struggle whatever it is you're doing you know
00:58:19.000 and that worries me as well I think that it's it's great that you hear a lot of people you know
00:58:24.600 people like we talked about Joe Rogan and people like others and us trying to remind men as well
00:58:29.240 that don't be a dick that's also part of part of the that also has to be part of the conversation
because a lot of men are resentful now they are resentful of what they see as a kind of war on men
yeah and you know there is a legitimate argument to be made that we live in a society
00:58:45.480 where men are kind of stigmatized for being men but the answer to that isn't to become a dick
00:58:50.720 the answer to that is to try and have more conversations to try and learn about stuff to
00:58:55.120 try and bring the conversation back to the middle and that's why we're always very grateful to have
00:58:59.180 guests like you to come on and talk to us about that so uh let's have a final question right okay
00:59:04.100 can i just say that you said men and women come together and i didn't make a joke about it and i'm
00:59:07.840 so proud of myself i'm evolving as a human being look at jess rolling her eyes which is the exact
00:59:14.160 same thing that every woman in my life does but jess uh just to finish up uh we always end on this
question what is the one thing that we should be talking about as a society
00:59:25.260 that we're not really yeah you've given me um an hour where you've made me talk throughout the
00:59:32.160 whole hour to think about my answers to this question um so it's a it's a sort of summary
00:59:38.920 in some way of a lot of the stuff we've been talking about it's this obsession on self
00:59:44.740 selfie me my identity um there's something that i don't think is being discussed enough
is this sort of the very narcissistic, unpleasant nature of me, me, me. Sorry.
01:00:03.900 You know, the onus on putting it all over you and your bad joke.
01:00:10.440 Gee, were you in the crowd last night?
01:00:15.300 Yeah, it worries me that we are, that nobody is actually saying,
01:00:20.760 well actually you know all this time in our own head thinking about ourself and our mindfulness
01:00:26.860 and our mental health and our gender and our identity and what we want and what we can
01:00:31.220 contribute to the world and our values and our what's the the trendy thing at the moment is
01:00:36.620 what's your purpose what's your purpose in life and I kind of think well you know that's not a
healthy culture if we're all obsessing about us and what we want what we're
01:00:49.780 doing and you know the onus is being lost and I think the selfie itself is the most symptomatic
01:00:58.760 visual representation of a culture that is dangerously obsessed with self so yeah big
big are we talking about that enough I don't think we are and in lots of sort of different issues but
01:01:14.620 that to me kind of encapsulates it and we kind of want to say really stop looking at yourself
it feels like that's directed at me and stop playing with yourself as well for us
01:01:26.560 see i've gone down to that level as well this is what happens it's yeah
men together eventually talk about willies right okay so and on that note yeah great note
01:01:36.520 to finish on um i'm sorry about him and about me just i'm sorry about everything now uh thank you
01:01:42.860 Thanks so much for coming on the show.
01:01:44.740 Your Twitter is?
01:01:45.740 At Jess Butcher.
01:01:46.740 At Jess Butcher.
01:01:47.740 And my business is...
01:01:48.740 Well, I was about to...
01:01:49.740 Thanks.
01:01:50.740 Yeah, I was about to plug it.
01:01:52.240 It's Tick, and it's available on the App Store.
01:01:54.240 Tick.
01:01:55.240 Yes, search for Tick Done on the Apple Store, Android, coming very soon.
01:01:58.460 Fantastic.
01:01:59.460 As always, follow us at TriggerPod on Twitter, Instagram, Facebook.
01:02:03.900 Click the bell button next to the subscribe button so you get notified of when the video
01:02:07.380 comes out.
01:02:08.380 Subscribe to the YouTube channel if you haven't already for more great videos.
Give us an iTunes review, blah, blah, blah. We'll see you in a week from now. Absolutely. Also
01:02:15.420 as well, just check to see if you haven't been unsubscribed. It always happens. Please
01:02:19.620 check it out. And thanks. And also just tell a friend about us. Spread the word, spread
01:02:24.640 the hate. Speak soon. Bye-bye.