TRIGGERnometry - January 20, 2019


The Dark Side of Technology


Episode Stats

Length

37 minutes

Words per Minute

186.6

Word Count

7,029

Sentence Count

218

Misogynist Sentences

8

Hate Speech Sentences

6
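
The stats above can be cross-checked with a few lines of arithmetic. A minimal sketch, assuming the words-per-minute figure is simply word count divided by the unrounded duration in minutes; the function name and the back-solved duration are illustrative, not part of the stats tool:

```python
def words_per_minute(word_count: int, duration_minutes: float) -> float:
    """Speaking rate: word count divided by duration in minutes."""
    return word_count / duration_minutes

# The listed 37-minute length is rounded: a rate of roughly 186.6 wpm
# with 7,029 words back-solves to a true duration of about 37.7 minutes.
implied_duration = 7029 / 186.6
```

Dividing by the rounded 37-minute length would give roughly 190 wpm instead, so the listed rate was evidently computed from the unrounded duration.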


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode, Dr Linda Papadopoulos talks about the dangers of trigger warnings, how they can prime people to be offended, and why resilience, rather than casting oneself in the role of victim, is what lets people learn to deal with hurt.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Welcome to Trigonometry. I'm Francis Foster.
00:00:07.380 I'm Konstantin Kisin.
00:00:08.520 And this is the show for you if you're bored of people arguing on the internet over subjects they know nothing about.
00:00:14.340 At Trigonometry, we don't pretend to be the experts, we ask the experts.
00:00:18.900 Our fantastic expert guest this week is a returning guest to Trigonometry, the first returning guest to Trigonometry.
00:00:24.540 She's a former advisor to a US president and the founder of H Robotics, Dr Pippa Malmgren. I got
00:00:31.240 it right? Yes. And our amazing expert guest this week is the columnist for The Spectator, vice
00:00:37.400 chairman of the Ogilvy advertising group and one of the most amazing creative, counterintuitive
00:00:43.460 thinkers in the world today, Rory Sutherland. Welcome to Trigonometry. It's a joy to be here.
00:00:48.220 Our fantastic expert guest this week is a psychologist and author, Dr Linda Papadopoulos.
00:00:53.640 Welcome to Trigonometry.
00:00:54.620 Thank you for having me.
00:01:00.780 Let's just get straight into it.
00:01:02.080 First of all, one of the things, you know, this is called Trigonometry.
00:01:05.280 And we briefly mentioned that I was telling you the study that I saw that giving people trigger warnings actually primes them to have a negative experience of whatever it is.
00:01:16.000 And you were like, yeah, please, please ask me about that on the show.
00:01:18.920 So what do you want to say to us as a psychologist about that?
00:01:21.720 if you think of um one of the things that we do in cognitive therapy right when someone comes in
00:01:27.140 and says look this is happening to me and it's really bothering me one of the things that we
00:01:30.700 do is we look at erroneous thinking so if I've come in and I've said you know oh you know I
00:01:36.220 don't like stripes it may be that you hear well Linda doesn't like me we'll talk about it later
00:01:41.280 and um and you know or she's being offensive and she's being directly offensive now if you were
00:01:47.460 having therapy they'd be like well maybe it's just that Linda had a bad experience with stripes
00:01:51.000 It's nothing personal. Maybe before you kind of personalize this and become triggered by it,
00:01:55.900 maybe you need to check out with Linda why she said that. So in a lot of ways, these trigger
00:02:01.120 warnings, in a lot of ways kind of priming people to be offended is the exact opposite of what we do
00:02:06.680 in CBT, which is take a step back and reevaluate that thought, right? If I'm going out of my way
00:02:13.320 to find something to be offended about, then I will be. And there's a value in being offended,
00:02:18.960 Because if I'm offended, I can put myself in the role of someone who's a victim, then they need to take care of me.
00:02:24.420 I can get social points because I've called you out for being whatever, being unthoughtful, uncaring.
00:02:32.840 So there's a reason why this happens.
00:02:34.940 And I think one of the things that I think we see a lot of these days with regards to people feeling that vulnerability
00:02:42.500 is precisely because I think a lot of young people grew up in a culture where their parents got rid of obstacles.
00:02:48.640 They were like snowplows.
00:02:49.940 So they had a problem with a teacher.
00:02:51.500 They went and they spoke to a teacher on their behalf.
00:02:53.900 They didn't like not being invited to a kid's party.
00:02:56.740 They'd complain and they'd speak to the mother and they'd be invited to the party.
00:02:59.600 A big part of dealing with life is having the resilience to come back and realize that sometimes shit happens not because someone's out to get me, but because things happen.
00:03:10.400 And I have to find the strength to overcome these things.
00:03:12.420 So I just kind of feel that a lot of the way that we're talking about trigger warnings and language being dangerous basically de-skills people from being able to cope with these things.
00:03:23.560 And I certainly wouldn't want that, you know, for my child or any of the young people that I work with.
00:03:29.080 And do you think this behaviour is addictive, that once you start and you cast yourself in the role of victim, you get this attention, you think to yourself, hang on, I quite enjoy this, whether consciously or subconsciously?
00:03:40.000 well there's I think there's a power in um in being being seen and being heard and look I think
00:03:48.140 there are some genuine people that are victimized out there I think it's important that we say that
00:03:52.080 and I think they should be able to speak up and talk about it but I think intent's important right
00:03:56.640 if I bump into you and I didn't mean to bump into you then surely that matters right now the amount
00:04:03.120 you hurt you know if you're really hurt that that yeah that matters too but the intent has to come
00:04:07.980 into it and I kind of fear that it's very rarely about intent it's very rarely that I didn't notice
00:04:13.660 you had the striped shirt on before I said I don't like striped shirts. Keep going, I'm sorry
00:04:18.340 I always enjoy when your fashion sense gets destroyed that's always the best I did show
00:04:25.420 this to my girlfriend yesterday and she just looked at it and then walked out the room you
00:04:30.100 know when you're trying to pet a cat and it's like no I'm out so there we go the moral of the story
00:04:34.820 always listen to your girlfriend. Yeah, yeah, and listen to her disapproval. Right, um, do you think
00:04:39.540 social media has made this worse? The fact that, you know, we're all constantly logged in, we're
00:04:43.600 all looking for approval, we're all looking for likes, et cetera, et cetera, et cetera. I think
00:04:48.000 it's, um, it's a huge part of it. I think, yes, they're looking for approval and likes, but I think
00:04:52.460 also kind of the distillation of one's ideology down to their, you know, most thoughtless tweet,
00:04:58.560 Right. So, you know, surely we're an amalgam of all of, you know, the ways that we interact, our interactions, our opinions, everything else.
00:05:08.020 But because I can look into your background and come up with the one time that you used, you know, whatever, you know, a word in the wrong way or you re-liked something you didn't think about and you retweeted it.
00:05:19.220 I think these things mean that we're walking on eggshells.
00:05:24.480 We're constantly so anxious of being misunderstood
00:05:27.820 because, like we said earlier,
00:05:30.700 there's something in calling out people for being wrong.
00:05:35.740 So we've got this very weird situation where we want to be seen,
00:05:39.100 so we're constantly kind of saying,
00:05:40.300 this is what I feel about this and I have an opinion about that.
00:05:42.560 But at the same time, the rest of us are kind of looking to call each other out
00:05:46.840 because there's a value in that as well.
00:05:48.360 So, like, the kind of the discourse isn't the healthiest.
00:05:50.780 We're not kind of talking to get to a better place of understanding.
00:05:53.400 We're talking to kind of call each other out and see, well, hold on.
00:05:57.160 Maybe you didn't see that in the right way.
00:05:58.660 And then that, you know, then those kind of Twitter wars start.
00:06:01.480 And do you think it's sort of, it increases people's narcissism using the social media?
00:06:08.620 Well, I think it's inevitable that something that, um, that, that, that makes you think
00:06:16.500 about how you're seen in such a nuanced way increases narcissism and and and I don't mean
00:06:22.340 that I mean it's actually really kind of heartbreaking so I've been a psychologist for
00:06:26.180 for many many years and sort of even things that I see a lot of the way they're depicted to me have
00:06:31.020 changed I do a lot with eating disorders and um you know for for many years the big thing was a
00:06:36.620 number right so I want to be this weight I want to be this size and then more recently people are
00:06:40.760 bringing in sort of pictures from Instagram with sort of hashtags like bikini bridge or thigh gap
00:06:45.600 or box gap, which means we're kind of standing outside ourselves and looking in. We're kind of,
00:06:50.860 you know, self-objectifying. Now, this self-objectification, while there's a narcissistic
00:06:55.360 element, there's also an element that kind of belies incredibly dangerous low self-esteem
00:07:01.460 because, you know, not only am I looking at myself through my eyes, but through the eyes of, you know,
00:07:06.060 everyone else that I feel is looking at me and through their values and through the way that
00:07:10.600 they're going to assess me. So yeah, I always say if I was going to create an exercise for poor
00:07:15.860 self-esteem, I'd tell people to take a bunch of pictures of themselves, look through the ones that
00:07:20.420 are like the least awful, then find that one and then spend, you know, 20 minutes filtering it,
00:07:25.660 thinking of a great hashtag, then put it up on social media, then sit back and wait for the likes
00:07:29.660 and if 50 don't come in the first 20 minutes, take it down and start over again. That is literally
00:07:34.260 an exercise and you know and poor self-esteem yet you know so many people do it day in day out and
00:07:41.300 and this idea of not being able to live up to your selfie you know this discrepancy between who i am
00:07:47.380 and who i'm projecting i think is having a significant effect on mental health because
00:07:52.980 there's there's no ability to rest from from what does everybody else think about me that's
00:07:58.660 constantly there there's no ability to say this is who i am and maybe this can be okay and i've
00:08:03.860 I've always thought as well that kind of part of your job as a human being as you mature is to
00:08:08.100 start to get to a point where your self-esteem is driven by what you think about yourself like
00:08:13.320 a kind of sense of who you are and it seems like social media is taking that away from us
00:08:18.580 and kind of almost exploring our vulnerabilities as human beings would you say that we're
00:08:22.880 particularly vulnerable to social media as humans? Well this is such an interesting point so again if
00:08:28.540 you look at sort of human evolution, we've not evolved to kind of absorb the opinions of thousands
00:08:35.000 and thousands of people on ourselves, right? We tend to live in sort of in groups. And there's
00:08:40.040 some research that attests to the fact that even the number of friends we have should be no more than
00:08:44.080 about 100. We just can't handle any more. That's how we've evolved and it works well. So social
00:08:49.280 media is all about the numbers, like you say. And so, you know, being able to assimilate this kind
00:08:55.380 of, I mean, there's like bathroom doors. Anyone can write whatever it is on the back, but because
00:08:59.480 they're not in a little cubicle with a smelly toilet, they're there for everyone to see. It
00:09:03.280 matters. Whatever's written matters. And if thousands and thousands of people's opinions
00:09:09.600 and the diversity of them matter, again, that's a recipe for disaster because not only do you not
00:09:16.360 rest from it, but you're never, like you say, able to say, do you know what? This is who I am. This
00:09:20.600 is the imperfect version of who I am and while I think we all want to you know self-actualize
00:09:25.520 and reach our potential you know part of that is saying I'll never play in the NBA if I'm a
00:09:30.860 certain height or I'll never have a singing career if I can't sing, whatever it is, and being okay
00:09:35.600 with that and I think that's that's harder when you know those goal posts are always you know
00:09:40.620 sort of changing about who you ought to be and everyone's telling you well you're not there yet
00:09:44.340 what do you think about what do you think makes social media so addictive
00:09:48.360 is it specifically that they've designed it that way? why do we buy into it so much? like even i do it
00:09:55.820 even i do the first thing i do when i wake up in the morning is check instagram yeah because okay
00:10:01.940 this is very interesting again so up until relatively recently psychology has been used
00:10:07.820 to help people right so you have a problem you go to a psychologist you help them um in the last
00:10:12.440 sort of 20, 30 years, psychology is slowly being used to persuade people. There's places called
00:10:19.560 sort of persuasive technology labs, usually where, you know, places where tech exists, right? So like
00:10:26.080 Silicon Valley, where our psychology is being used against us. So for example, I don't know,
00:10:32.560 you guys obviously Netflix, right? I love Netflix. I can't watch one show anymore. I rarely, if I'm
00:10:37.660 watching a series, I can rarely turn off and watch. Now, the reason for that is they know that if they
00:10:43.760 start the next one right away, like they do, I'm more likely to stay. They take away a barrier. If
00:10:47.940 I can skip the intro, I'm more likely to stay. They take away another barrier because the metric for
00:10:52.200 the success of social media sites is time spent online. So they ensure, and not because they're
00:10:58.800 evil. I don't think anyone with these social media companies is bad or evil. They're just trying to
00:11:02.140 to maximize their profit, maximize how successful they are.
00:11:06.080 So if time spent online is what they want,
00:11:08.560 when I'm devising whatever, Instagram, Snapchat, Facebook,
00:11:12.480 I'm going to ensure, well, what will make you more likely to stay?
00:11:15.800 Well, I know that notifications will.
00:11:18.040 Why?
00:11:18.700 Do you know, there's something called variable reward system.
00:11:21.780 You know, like slot machines?
00:11:23.280 Yeah.
00:11:23.980 You know why they're addictive?
00:11:24.880 If every time you pulled it, you know, something would come out,
00:11:27.940 there'd be not much to it.
00:11:28.940 You'd get it when you wanted it.
00:11:30.000 But the fact that sometimes it comes out every three times you pull it, sometimes it comes out every five times, means you're much more likely to go back.
00:11:36.100 We know that. That's very basic human behavior.
00:11:38.560 Now, that variable reward system exists on all social media sites.
00:11:42.500 It exists when you see that little notification at the bottom.
00:11:45.980 It exists when you're going to get some sort of validation, when you get the retweet.
00:11:49.980 It exists when someone likes something.
00:11:52.160 We know that.
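00:11:52.300 [The slot-machine mechanism described here is a variable-ratio reinforcement schedule, and it can be sketched in a few lines. This is a hedged simulation only: the three-to-five-pull payout gap comes from the example above, and the function and parameter names are made up for illustration, not taken from any real platform's code.

```python
import random

def simulate_variable_ratio(n_pulls, min_gap=3, max_gap=5, seed=0):
    """Simulate a variable-ratio reward schedule (slot-machine style):
    a payout arrives after an unpredictable number of pulls, drawn
    uniformly between min_gap and max_gap, so the player can never
    tell which pull will pay out. Returns the pull numbers that paid."""
    rng = random.Random(seed)
    payout_pulls = []
    pulls_until_payout = rng.randint(min_gap, max_gap)
    for pull in range(1, n_pulls + 1):
        pulls_until_payout -= 1
        if pulls_until_payout == 0:
            payout_pulls.append(pull)
            pulls_until_payout = rng.randint(min_gap, max_gap)
    return payout_pulls
```

With a fixed ratio instead (a payout on exactly every third pull) the gaps would be constant and the outcome predictable; it is the unpredictable gap that keeps the player pulling, which is the point being made here about notifications and likes.]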
00:11:53.180 Now, also, there's gender differences here, right?
00:11:55.360 So if we look at the way that boys and girls react differently, we know that boys tend to be much more apt to gaming addiction.
00:12:03.460 Why is that?
00:12:04.440 Because from an evolutionary perspective, men have evolved to seek out competency, especially during those teen years, young boys.
00:12:11.800 Now, what does that mean?
00:12:12.960 That means I want to feel like I'm getting good at something.
00:12:15.720 Now, what better thing than if I give you a game that I kind of ding, ding, ding, give you a reward every little while, every little increment so I feel better.
00:12:22.280 and I get that dopamine rush and I play more.
00:12:24.560 So I get more of a dopamine rush.
00:12:26.040 So that becomes quite addictive.
00:12:28.260 For girls, we're taught to seek out
00:12:30.540 kind of that social equity, right?
00:12:32.560 Do you like me?
00:12:33.640 Do I have that equity?
00:12:35.140 Do you enjoy being with me?
00:12:37.260 And as a consequence, again,
00:12:38.780 those hearts, those likes
00:12:40.180 are going to be very validating and rewarding.
00:12:42.900 So behind your phone,
00:12:45.420 this is not just a platform.
00:12:46.980 There's dozens of people with PhDs in psychology,
00:12:50.440 You know, way smarter than, you know, than the average person in terms of how they understand human behavior, how they understand themselves.
00:12:58.020 All, you know, ensuring that you spend as much time as possible on these platforms.
00:13:02.640 And do you think that there's anything that can be done or what's going to be the impact, first of all, of this if you project it into the future?
00:13:09.720 And is there anything we can do as a society, maybe legislatively, to start to curb some of these things?
00:13:16.500 Yeah, I mean, I think we're already beginning to see
00:13:20.400 so sort of gaming addictions already accepted
00:13:23.140 in places like Japan, not yet in the States,
00:13:27.320 but I think it's happening, we know that.
00:13:29.460 We know that, again, even from a psychological point of view,
00:13:35.780 you know, self-harm, which we tend to see quite a lot of it,
00:13:41.140 especially with sort of adolescents and adults,
00:13:43.160 we're beginning to see that in the digital form.
00:13:45.300 So digital self-harm, where young people are bullying themselves online anonymously.
00:13:50.460 Now, this is a new thing.
00:13:52.100 What?
00:13:52.680 Yeah.
00:13:53.840 You've got to tell us more about it. I've never heard of this.
00:13:55.800 So there's been cases where young people have been horrifically bullied online.
00:14:02.680 And as we know, there's been horrific cases where they've taken their own lives or attempted to.
00:14:06.820 And what we find now is in some cases, certainly, certainly not all, but in some cases,
00:14:11.280 these young people will actually be in such a low place that they create another profile.
00:14:15.080 and they bully themselves they call themselves stupid or ugly or horrible and the premise we
00:14:20.720 believe because this is very new this is not something we know uh you know we don't have
00:14:25.500 much data on but we believe it's the same thing it's about um it's about showing the pain showing
00:14:31.960 the the pain i feel inside and and externalizing it and being able to deal with it so um you know
00:14:38.640 all of these things are affecting us and this is just mental health cognitive health is being
00:14:43.820 affected as well. We know that millennials now, there's some studies to attest to the fact that
00:14:47.340 millennials have worse memories than seniors. And we know that because when you consolidate a
00:14:53.380 memory, if you keep being interrupted, you're never going to consolidate that memory. So try
00:14:57.660 and remember something. When you keep hearing buzzing or pinging, you're having to re-answer
00:15:01.980 that email. There's a reason for this. Sleep disorders, huge problem. They're on the rise.
00:15:07.260 Why? Because this blue light that we're constantly on affects us. Sleep's really important. We believe
00:15:12.340 that sleep is one of the factors that is implicated in depression. So there's so much going on. And
00:15:18.700 I think kind of our tech has outpaced our social evolution, our physiological evolution,
00:15:25.340 and we're trying to play catch up.
00:15:32.400 If you ask the question at the very beginning, why am I here? And I suppose one of the reasons
00:15:38.180 i'm here is i gave a ted talk back in about 2009 i think it was yeah something like that yeah so
00:15:44.660 that's nine years ago now and i simply made the Eurostar gag oh it's fantastic fantastic little
00:15:50.280 story okay now i'd only thought about it a couple of days beforehand the point was that they were
00:15:55.420 spending six billion pounds uh building high-speed rail tracks between st pancras and the kent coast
00:16:02.040 to reduce the overall journey time between London and Paris,
00:16:06.220 London and Brussels as well, by about 25 minutes, okay?
00:16:12.280 Now, my snarky little suggestion was, look,
00:16:16.180 what is a bit weird is we're spending £6 billion on these rail tracks
00:16:20.360 and the trains, this was true until two years ago, okay?
00:16:24.860 So the trains don't have Wi-Fi.
00:16:27.860 Now, I don't know, our Southeastern trains,
00:16:29.880 which you share with me from Tunbridge Wells,
00:16:31.460 They're now installing Wi-Fi on them.
00:16:33.140 They are.
00:16:33.900 I actually get half the time.
00:16:35.560 I'm actually annoyed my train arrives so early.
00:16:38.240 I should have gone off London Bridge to get here today.
00:16:41.040 I stayed on until Charing Cross because I was in the middle of something.
00:16:44.720 If you put Wi-Fi on something, essentially time loses its meaning.
00:16:49.100 But also, you're perfectly productive or entertained.
00:16:52.920 So actually, the time doesn't matter nearly as much.
00:16:55.800 So you can reduce what you might call perceptual journey time,
00:16:59.220 not at a cost of six billion which is what it costs to reduce actual journey time you can reduce it at
00:17:04.500 a cost of what's probably five million six million which is literally 0.1 percent of the cost i said
00:17:10.720 look if you want to be more perverse and i just got silly there i said you know if you really
00:17:14.760 wanted to improve the euro star and you had a budget of six billion you'd hire all of the
00:17:18.940 world's top male and female supermodels getting to walk up and down the train handing out free
00:17:23.860 Château Pétrus to all the passengers you'd have saved five billion pounds because it would cost
00:17:29.560 you about a billion pounds to do that and people would ask for the trains to be slowed down
00:17:33.060 now the point about that is it's a totally that is what you might call the advertising
00:17:38.700 cousin to a comedic technique which is the reframing joke yes which is you tell something
00:17:45.720 you know that you know the most the most boring thing is you know you know um that joke which
00:17:52.160 seems completely innocent until you flip the context
00:17:55.260 by saying that, you know, I was masturbating the other day,
00:17:59.940 blah, blah, blah, blah.
00:18:00.660 And then suddenly you introduce the fact that it was in Sainsbury's.
00:18:03.800 And the same thing takes on basically what is,
00:18:06.920 goes for a self, what you might call a little bit of reveal.
00:18:11.260 That wasn't me literally saying that, but it was an example.
00:18:14.260 Just to make that clear.
00:18:16.000 Right?
00:18:17.280 You would never shop in Sainsbury's.
00:18:18.420 That's true, man.
00:18:19.860 But the interesting thing there is that it's exactly the same thing,
00:18:25.740 which is that human decision-making is quite often,
00:18:28.980 and human perception is badly constrained by a conceptual frame
00:18:33.760 or what you might call a mind trap.
00:18:37.000 And comedy is one of the ways in which you can liberate yourself from this.
00:18:41.480 So I'll give you a little real-world story, which is...
00:18:44.860 You may remember, it's one of my favourite sketches,
00:18:47.400 even though, as I said, it makes no sense.
00:18:48.900 It's a Rowan Atkinson sketch from Not the Nine O'Clock News
00:18:51.280 where he goes into a bathroom showroom
00:18:53.300 and they have various little bottle showers, baths, etc.,
00:18:56.980 in which he can actually design his own bathroom.
00:18:59.720 And he just decides he's going to have a bathroom with seven toilets.
00:19:03.380 If I get rid of the bath,
00:19:05.160 I can have another couple of toilets against that wall.
00:19:07.700 Now, OK, totally bonkers, gorgeous, right?
00:19:11.520 Anybody who's seen it will still remember it 35 years later.
00:19:16.160 Someone comes to me with a problem,
00:19:17.440 which is the firemen in certain American cities,
00:19:21.660 when they're not fighting fires,
00:19:23.160 which actually is most of the time
00:19:24.480 because of smoke detection equipment and so on
00:19:27.820 and vastly less flammable fabrics
00:19:30.300 have reduced the incidence of domestic fires quite significantly.
00:19:35.540 So they go around typically to poorer areas in the US
00:19:38.120 to housing projects
00:19:39.240 and they try and install smoke detectors for free.
00:19:43.440 It's a perfectly noble and worthwhile thing to do.
00:19:45.500 now one of the things they can never get their head around and i can't understand it either
00:19:49.940 entirely is that they can get nearly everybody they ring on the door and say firemen here
00:19:54.080 we're just here to install a smoke detector you basically you can get them to install one
00:20:01.440 but they really balk at two or three and the typical apartment really needs three the strange
00:20:06.400 thing is that's like saying no i don't want one in my child's bedroom it's a really weird thing
00:20:10.660 to say but for some reason people just draw the line at one maybe two when three is necessary
00:20:17.920 okay now i use that same kind of Python-esque uh you know bizarre approach to suggest a solution
00:20:26.680 which is very simple thing in behavioral science it's just re-anchoring i said and it partly was
00:20:32.900 OK, I partly suggested absolutely outrageous things, which which would never be legal or acceptable, which is to actually outrage people by saying, you know, well, actually, if you're Korean, you'd be entitled to three.
00:20:49.700 But since you're African-American, you only get two.
00:20:51.720 OK, I can see why that might be an issue.
00:20:55.300 You do it the other way around.
00:20:56.560 You do it the other way around.
00:20:57.300 OK, so you absolutely cause some sort of outrage now.
00:21:01.940 And competitiveness between different groups.
00:21:04.060 And competitiveness in comparison.
00:21:05.140 That came from a friend of mine whose wife was Chinese.
00:21:08.800 She was invited to visit South Africa.
00:21:10.700 And they didn't go in the end.
00:21:12.240 But she wasn't allowed to share the same hotel with him.
00:21:15.120 And they rang up and said, is your wife Chinese or Japanese?
00:21:17.780 He said, she's Singaporean Chinese.
00:21:19.320 Ah, unfortunately, she'll need to stay in a different hotel.
00:21:21.780 She wasn't so bothered about staying in a different hotel from her husband.
00:21:24.120 The idea that had she been Japanese, it would have been OK.
00:21:27.220 Drove her practically insane.
00:21:29.880 We forget this fact, actually.
00:21:31.680 One of the problems of the British left, they assume complete harmony among all external ethnic groups.
00:21:38.480 Well, the Chinese-Japanese rivalry is, with some reason in some cases, still pretty intense.
00:21:45.680 But, okay, so one of the reasons, and this is why free speech and advertising, by the way,
00:21:50.440 and free speech and comedy are both really, really important things to fight for.
00:21:55.700 Now, no one is suggesting for a second I'm ever going to do that, okay?
00:21:59.260 but it just struck me as a really interesting idea
00:22:01.700 of how to get people to accept free
00:22:03.220 is to suggest that they weren't actually
00:22:05.080 one of the ideas about how you get younger children
00:22:07.580 to eat insects is to have a pile of locusts
00:22:09.940 and you say well unfortunately because you're under 16
00:22:12.220 you won't be allowed to eat these
00:22:13.680 basically the magazine
00:22:15.660 the magazine Just 17
00:22:17.380 had an average readership age of about 13
00:22:19.900 you know
00:22:20.440 so actually telling people that
00:22:23.560 you're not eligible for something
00:22:25.260 drives them practically insane
00:22:26.660 but then
00:22:28.880 the reason the free speech thing is important is because there is a whole area of ideas which
00:22:37.300 belong in an ad agency which i would describe as you'd get promoted for making the suggestion
00:22:42.400 and fired for enacting it and that one that i suggested is an example of that okay if you
00:22:48.360 actually had the idea i'd be impressed by your ingenuity if you actually executed it i'd be
00:22:53.440 horrified by your lack of judgment the ability to say that thing is really really important because
00:23:00.640 one speech is experimental action so applying the same criteria to speech as you do to physical
00:23:07.060 actions is ridiculous because one of the reasons we've evolved humor and speech is it's nature's
00:23:12.540 flight simulator it's where we try things out without actually doing them and probably the
00:23:17.460 road of stories a whole bunch of things exist as a kind of simulation but the second thing is
00:23:23.180 sometimes in advertising in in problem solving of any kind you kind of have to climb mount silly
00:23:29.420 to get to the bright sunlit uplands beyond and what i eventually suggested borrowing
00:23:35.140 from that rowan atkinson sketch for the the smoke detector thing is i said turn up with six
00:23:39.740 now six is a ridiculous number and if you say to the people well um you know we we normally bring
00:23:47.340 six smoke detectors per
00:23:49.280 home, but
00:23:50.080 I think here you can make do with
00:23:53.160 three or four. You'll probably get them
00:23:55.280 to install three or four.
00:23:56.960 Three, in fact. Because one now seems
00:23:59.040 weirdly on the low side. Once you've
00:24:01.080 anchored them at six, one seems
00:24:03.080 like, well, no one will have one. It's insanity,
00:24:05.660 right? And so
00:24:07.220 having the ability
00:24:09.520 to say and think,
00:24:11.340 and I would argue the principal value of
00:24:13.220 an ad agency isn't actually
00:24:15.380 the production of advertising or communication it's the existence of a culture with a client
00:24:23.380 company's interests at heart but with a culture in which it is possible to say ridiculous things
00:24:30.420 and still get promoted now that's one of the most important things that that agency can have
00:24:34.800 is the ability to say something totally ridiculous um which in say the civil service would be career
00:24:40.900 suicide or in politics would probably be career suicide unless you you know unless you're a
00:24:46.880 boris johnson and you manage to kind of donald trump yeah i mean there's a bit of there's a bit
00:24:53.140 of me which goes jesus you know i mean i don't like what he says but at least at least i believe
00:24:58.160 him when he says it yeah i think a lot of people feel that way actually it's one of the reasons
00:25:01.940 he probably got a lot of the mixed argument isn't he you know where you are with a good
00:25:05.020 playing crumb but i mean there is something about that which is that the elite priestly caste has
00:25:11.700 become basically weird it's become fixated with a very very narrow worldview um which is instilled
00:25:19.000 in all those kind of religious ceremonies that take place uh in davos or business schools or
00:25:23.980 whatever which are all inculcating exactly the same creed okay now by the way i'm a conservative
00:25:29.320 I'm not making this as a kind of weird left-wing point, but I don't think anybody can reasonably
00:25:35.300 say that there's a complete lack of imagination and weirdness in politics and business, which
00:25:43.220 is that it's become obsessed with an incredibly narrow way of defining worldly progress and
00:25:49.560 success, mostly around efficiency and cost reduction, which is not really very well aligned
00:25:55.320 with what people really care about.
00:25:59.320 You and I met two years ago at Kilkenomics in Ireland, doing comedy. Well, yeah, I was doing...
00:26:07.920 You were doing the same thing. I was doing comedy. And we were chatting in the bar after one of the
00:26:14.720 shows that we were on, and you were telling me about this new thing in China, where they're going
00:26:20.020 to have this system where they're going to track everybody's behavior and all this stuff. And I'll
00:26:24.540 be honest with you, Pippa, I really like you, but I was sitting there listening to you, going,
00:26:28.380 she's a little bit crazy. She could be a little bit crazy. Maybe she smoked a lot of weed when
00:26:35.180 she was... Really? Yes, so interesting. Yeah, I was sitting there, and then a year later, boom, it's
00:26:41.940 happened. It's real. And that's what you were talking to us about last time you were on the show.
00:26:47.240 What the hell is going on, and what's to come? Because now I believe you. Thank you. Okay, so let's
00:26:53.380 quickly go through what is already real, right? If you jaywalk in... not all of China, they've only
00:27:00.520 rolled it out in one city, but they're going to roll it out nationally. If you jaywalk,
00:27:03.640 it clocks that you've done it, because the cameras are ubiquitous now, so the facial recognition is
00:27:10.360 incredibly strong. They have the most valuable artificial intelligence startup in the world, in
00:27:14.520 fact the most valuable startup in the world, called SenseTime, and it can recognize an individual out
00:27:19.580 of a crowd of 10,000, and it can recognize your emotional state at any given time. So it clocks
00:27:25.560 it's you, and particularly you. Next thing you know, you pick your mobile phone up, and the fine
00:27:29.820 for having jaywalked is already in your text messages, if not already deducted from your bank
00:27:36.260 account, and your name and/or your government number is already broadcast on the OLED screen
00:27:42.680 that's above the nearest intersection. So you've now been broadcast to
00:27:47.720 everyone nearby: that you are bad and you just violated the law.
00:27:50.920 Now, this is important, because what it does is it affects, effectively, your personal Uber score.
00:27:56.100 The social credit system is based on the idea that you're given a score, which reflects your social compliance.
00:28:03.020 So if you Google stuff that they don't want you looking at, your score goes down.
00:28:07.740 If your brother or your sister does it, your score goes down.
00:28:13.180 Right, because Mao always said the best eyes and watchers aren't the government; it's to get
00:28:20.300 everybody to report on each other. And this is a kind of... I mean, the Stasi would love this system.
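The scoring mechanics Pippa describes, where a penalty hits you and also propagates to your relatives, can be caricatured in a few lines of Python. This is purely an illustrative toy, not a description of the real Chinese system; every name, event, and penalty value here is invented:

```python
# Toy model of a social-credit-style score: a penalty is deducted from
# the person involved AND, at a reduced weight, from their relatives.
# All events and weights are invented for illustration.

PENALTIES = {"jaywalking": 5, "banned_search": 10}
RELATIVE_WEIGHT = 0.5  # relatives absorb half the offender's penalty

def apply_event(scores, person, event, relatives=()):
    """Deduct the event's penalty from `person` and, scaled down, from relatives."""
    penalty = PENALTIES[event]
    scores[person] -= penalty
    for r in relatives:
        scores[r] -= penalty * RELATIVE_WEIGHT
    return scores

scores = {"alice": 100, "bob": 100}  # everyone starts at the same baseline
apply_event(scores, "alice", "jaywalking", relatives=["bob"])
print(scores)  # {'alice': 95, 'bob': 97.5}
```

The point of the toy is the propagation step: one person's infraction lowers other people's scores, which is exactly the reporting pressure described above.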
00:28:25.560 But it's bigger than that. Within the last few weeks, it's been announced that when you
00:28:32.340 tap, swipe, and pinch on your mobile device, it's a better indicator of who you are than your
00:28:39.000 thumbprint. So now, even if you're using someone else's phone, they know it's you.
00:28:44.340 Triangulate that with the way you walk. It turns out your walking gait is also a
00:28:49.320 better indicator of who you are than your thumbprint. So what they're doing is
00:28:54.900 taking all these things and triangulating. Think about it as
00:28:59.040 previously independent silos of data that they now triangulate, using artificial
00:29:03.780 intelligence, to pull it all together. But it goes even deeper than that.
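The "triangulation" being described, fusing gait, touch dynamics, and face data that used to sit in separate silos, is at its core just combining several weak identity signals into one much stronger one. A minimal sketch of that idea, with entirely made-up match probabilities, using a naive log-odds combination:

```python
import math

# Each silo independently reports how likely a sample is to be person X.
# Individually the signals are weak; combined (summed in log-odds space,
# assuming independence) they become far more discriminating.
# All probabilities below are invented for illustration.

def combine(signal_probs):
    """Fuse independent per-signal match probabilities into one probability."""
    log_odds = sum(math.log(p / (1 - p)) for p in signal_probs)
    return 1 / (1 + math.exp(-log_odds))  # convert log-odds back to probability

# Three mediocre signals: gait 70%, touch dynamics 75%, face 80%.
fused = combine([0.70, 0.75, 0.80])
print(round(fused, 3))  # 0.966 -- far stronger than any single signal
```

This is why previously harmless data silos become powerful once pulled together: each one alone is a weak identifier, but their product is not.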
00:29:08.660 Recently, they arrested someone at a rock concert of like 60,000 people, because,
00:29:13.400 again, the face went by one camera. Bang, it clocks: this person is wanted. And
00:29:18.600 they went to the person's seat. You know, the debit card, yep, the ticket. Up
00:29:22.680 there, arrested. I think they now have 11 million people who... you can buy a
00:29:28.920 train ticket or a plane ticket, but you can't board it. They won't permit you. So,
00:29:33.460 effectively, wherever you are... So they're putting you into digital prisons. I
00:29:40.040 think one of the spouses of one of the dissenters, she's blocked into a very small few-block
00:29:49.320 area around her house, and any time she tries to move beyond it, the officials are alerted and the
00:29:54.800 police will be right there. So she literally is stuck in a digital prison. This is basically what
00:30:00.540 the technology permits. And it rewards you if you do things that are supportive of what the
00:30:06.420 government's interested in, and it basically penalizes you if you don't. We're doing the
00:30:10.680 same thing in the West; it's just that it's not government that's doing it. We have private
00:30:14.640 entities that do it. We have Facebook that does it. We have Amazon that does it. We have Uber
00:30:18.720 that does it. And I don't know if you saw, but there's been an announcement that there's been
00:30:24.000 a deal between, it seems, Google and MasterCard. So now if I go to, you know, I don't know, a department
00:30:33.620 store and I buy this color lipstick, Google's going to know, and now they know what they're
00:30:39.320 going to advertise to me. So I'm concerned about this, and I write a lot about this in my book,
00:30:45.360 because imagine what we've got: it's a world where you are emitting data points about yourself all
00:30:51.700 the time without even knowing it, because you've turned all the fitness apps off on your phone,
00:30:55.980 but the fact is the way you walk is revealing a lot about you, including, by the way, your
00:31:01.440 cardiac condition; a lot about your health is revealed by how you walk. So there's this multiplicity
00:31:06.700 of data points that you're throwing off, but you don't know who gets to see it. And whoever does
00:31:12.200 get to see it, they know more about you than you know about yourself. And by the way, it's a two-way
00:31:17.840 thing. If I'm a chief executive and I go on, let's say, this program and I start talking about my
00:31:23.920 company and I'm lying, this same facial recognition technology can identify your micro facial
00:31:29.940 movements, and they know that you're lying, and they can set the algorithms to short the stock
00:31:34.740 of your company as you're talking on, say, CNBC's Squawk Box. So think of it as almost like a crystal
00:31:41.280 ball of data points: on the one side, it will let us conjure forth answers that will do things
00:31:47.480 like solve cancer. Literally, we will solve extraordinarily difficult problems by having
00:31:52.320 all this data. But it also changes the balance of power between companies and customers, because
00:31:58.600 companies will have so much information that, I've argued, you know, we used to have insider trading
00:32:04.660 laws; we may need "insighter" trading laws. Because if I am MasterCard now, or Google, my knowledge of you
00:32:11.920 is so great. Forget Cambridge Analytica: they only had 5,000 data points on 81 million people,
00:32:18.680 and that was enough to begin to influence your political position. I could sell you way more
00:32:23.380 than a political position. If I have more than that, I could sell you a refrigerator. I can
00:32:26.800 sell you anything I want. So I think human beings are very vulnerable to this kind of power being
00:32:32.120 exercised. And similarly, what kind of world do we have if people don't know how they appear?
00:32:37.440 And should we have a right to know how do we look with that data slice looking at us at any given time?
00:32:45.480 So in the book, I've tried to lay this out because we have a lot of leaders who are making decisions about the landscape of our future who don't understand what I'm saying.
00:32:55.860 And so they're literally missing a profound shift in humanity.
00:33:01.840 Well, this is the future.
00:33:02.900 I mean, this is the future of one of us, right?
00:33:04.520 This is now.
00:33:05.180 It's not even the future.
00:33:06.060 Nothing I'm telling you is for the future.
00:33:09.360 It is already existing right now.
00:33:12.700 Jesus Christ.
00:33:13.700 I'm never smoking weed again.
00:33:15.780 You won't need to.
00:33:18.560 So, I mean, joking aside, that literally means:
00:33:22.820 if you're being tracked every single second of every day,
00:33:26.640 if they're reading what you're doing, if they can manipulate you,
00:33:31.300 is that the end of free will, eventually?
00:33:33.020 This is... So, what's really interesting to me is, when I was in college, I studied political philosophy, and everyone went, oh my God, you will never get a job.
00:33:42.680 I mean, you are permanently unemployed if you study... Now we get to this, and I'm like, these are questions of political philosophy.
00:33:48.940 That's exactly what this is.
00:33:50.600 This is about the balance of power between states and citizens, companies and customers, between citizens.
00:33:58.780 It's about the invasion of a person's free will, absolutely. These are all core questions now;
00:34:09.300 they're not tangential anymore. And the more we develop artificial intelligence, the
00:34:15.260 greater it's going to be. For example, I know a guy who's building an extraordinary company.
00:34:20.260 It is effectively going to place nano chips which are so small that they can fit inside the
00:34:29.520 body. They are at, or below, the level of atoms, right? You're basically at
00:34:35.860 the level where you can see someone's atomic structure, their nuclei, their DNA, their
00:34:41.180 cellular structure and their organs, because they just float around in your body at the nano
00:34:46.220 level. So that means the good news is, when you get cancer, you get proteins in advance of the cancer,
00:34:53.660 which now we don't detect very well, but then you'll be able to say: you're building up proteins,
00:34:57.820 we see it coming, we can hit this thing, and you'll never get cancer. But the bad news is you're
00:35:02.800 emitting information about what is physically happening to you to whoever has access to that
00:35:09.700 data. Again, the good news is that will create a kind of... he calls it a cure chain, the guy who
00:35:15.320 founded it, Steve Papermaster. Like blockchain, it's a cure chain. We'll be able to cure many obscure
00:35:21.320 diseases that only a few people have, that right now aren't worth fixing, but with that you can fix
00:35:26.080 them easily. But talk about privacy: I mean, the healthcare companies will know more about your health
00:35:33.260 than you will ever know about your own health. So yes, these are really profound philosophical
00:35:39.240 questions. Well, here's the thing: it's not just health as well, because we had an evolutionary
00:35:42.920 psychologist on the show a few weeks back, I don't know if you've caught that episode,
00:35:46.000 Diana Fleischman, and one of the things she was talking about is genetics. The study of
00:35:51.340 genetics is now coming to a point where we can pretty confidently say that there's a very strong
00:35:55.700 genetic component to behavior. Yes. Right, so if you can analyze people at that level, their DNA and
00:36:02.680 their genetics, you're not just going to be able to tell what disease they might develop or whatever,
00:36:06.960 you're going to be able to tell what their personality type is, to some extent what their
00:36:10.860 choices are going to be, what their political leanings are going to be. It would make dating
00:36:14.820 really easy. That would actually make dating much easier, yeah. Imagine Tinder: no, I want a left-leaning...
00:36:22.680 No, I can see you're good for three dates and then...
00:36:24.720 I think you're missing a key part. Yeah, I think you guys are really missing...
00:36:33.740 But you know what? Remember Minority Report and there was that idea of pre-crime?
00:36:40.700 That is already real in China.
00:36:42.680 They are already using the artificial intelligence in conjunction with this extraordinary data sweep,
00:36:48.860 and they are determining which people look more inclined to cut corners than others
00:36:56.360 and then to start corralling them into a corner and putting pressure on them to behave better.
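One reason the pre-crime question raised here is so fraught is the base-rate problem: even a very accurate classifier, applied to a population that is overwhelmingly law-abiding, flags far more innocent people than actual offenders. A quick back-of-the-envelope check, with all figures invented for illustration:

```python
# Base-rate arithmetic for a hypothetical "pre-crime" classifier.
# All numbers are invented; the point is the proportion, not the scale.

population = 1_000_000
base_rate = 0.001           # 0.1% of people will actually offend
sensitivity = 0.99          # the classifier catches 99% of true offenders
false_positive_rate = 0.01  # ...and wrongly flags 1% of the innocent

offenders = population * base_rate          # 1,000 people
innocents = population - offenders          # 999,000 people

true_flags = offenders * sensitivity            # 990 correctly flagged
false_flags = innocents * false_positive_rate   # 9,990 wrongly flagged

precision = true_flags / (true_flags + false_flags)
print(f"{precision:.1%} of flagged people are actual offenders")  # 9.0%
```

So a system that is "99% accurate" in the lab still produces a flagged pool that is over 90% innocent people, which is exactly why "where do we draw these lines" is not a merely technical question.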
00:37:00.720 That's today; that's not maybe someday. We already see this in motion. So there's, again, this huge
00:37:08.760 philosophical question of prejudging, precognition, premeditation: where do
00:37:16.200 we draw these lines? Subscribe to us on YouTube, give us a rating on iTunes, follow us everywhere
00:37:23.760 that you can. We are on Twitter, on Instagram, on Facebook, all these terrible social media platforms,
00:37:28.060 so please keep using them to support our channel at TriggerPod.
00:37:32.260 Anything else, Francis?
00:37:33.180 Yeah, don't smoke weed whilst you're watching this episode.
00:37:36.180 And if you've enjoyed it, we'll see you in a week's time
00:37:38.340 with another brilliant episode.
00:37:39.620 Thank you very much. Goodbye.