In this episode, Dr Linda Papadopoulos talks about the dangers of trigger warnings, how they can prime people to be offended, and why resilience, rather than casting yourself as a victim, is what lets people deal with hurt.
00:01:02.080 First of all, one of the things, you know, this is called Trigonometry.
00:01:05.280 And we briefly mentioned it earlier: I was telling you about a study I saw showing that giving people trigger warnings actually primes them to have a negative experience of whatever it is.
00:01:16.000 And you were like, yeah, please, please ask me about that on the show.
00:01:18.920 So what do you want to say to us, as a psychologist, about that?
00:01:21.720 If you think of one of the things that we do in cognitive therapy, right, when someone comes in
00:01:27.140 and says, look, this is happening to me and it's really bothering me, one of the things that we
00:01:30.700 do is we look at erroneous thinking. So if I've come in and I've said, you know, oh, I
00:01:36.220 don't like stripes, it may be that you hear, well, Linda doesn't like me, we'll talk about it later,
00:01:41.280 or she's being offensive, and she's being directly offensive. Now, if you were
00:01:47.460 having therapy, they'd be like, well, maybe it's just that Linda had a bad experience with stripes.
00:01:51.000 It's nothing personal. Maybe before you personalize this and become triggered by it,
00:01:55.900 maybe you need to check with Linda why she said that. So in a lot of ways, these trigger
00:02:01.120 warnings, this kind of priming people to be offended, are the exact opposite of what we do
00:02:06.680 in CBT, which is take a step back and re-evaluate that thought, right? If I'm going out of my way
00:02:13.320 to find something to be offended about, then I will be. And there's a value in being offended,
00:02:18.960 because if I'm offended, I can put myself in the role of a victim, and then people need to take care of me.
00:02:24.420 I can get social points because I've called you out for being, whatever, unthoughtful, uncaring.
00:02:51.500 They went and they spoke to a teacher on their behalf.
00:02:53.900 They didn't like not being invited to a kid's party.
00:02:56.740 They'd complain, and they'd speak to the mother, and they'd get invited to the party.
00:02:59.600 A big part of dealing with life is having the resilience to come back and realize that sometimes shit happens, not because someone's out to get me, but because things happen.
00:03:10.400 And I have to find the strength to overcome these things.
00:03:12.420 So I just feel that a lot of the way that we're talking about trigger warnings and language being dangerous basically de-skills people from being able to cope with these things.
00:03:23.560 And I certainly wouldn't want that, you know, for my child or any of the young people that I work with.
00:03:29.080 And do you think this behaviour is addictive? That once you start and you cast yourself in the role of victim, you get this attention, and you think to yourself, hang on, I quite enjoy this, whether consciously or subconsciously?
00:03:40.000 Well, I think there's a power in being seen and being heard. And look, I think
00:03:48.140 there are some genuine people out there who are victimized. I think it's important that we say that,
00:03:52.080 and I think they should be able to speak up and talk about it. But I think intent's important, right?
00:03:56.640 If I bump into you and I didn't mean to bump into you, then surely that matters. Now, the amount
00:04:03.120 you hurt, you know, if you're really hurt, yeah, that matters too, but the intent has to come
00:04:07.980 into it. And I kind of fear that it's very rarely about intent. It's very rarely, "I didn't notice
00:04:13.660 you had the striped shirt on before I said I don't like striped shirts." I'll keep going, I'm sorry.
00:04:18.340 I always enjoy it when your fashion sense gets destroyed. That's always the best. I did show
00:04:25.420 this to my girlfriend yesterday, and she just looked at it and then walked out of the room. You
00:04:30.100 know when you're trying to pet a cat and it's like, no, I'm out? So there we go. The moral of the story:
00:04:34.820 always listen to your girlfriend. Yeah, yeah, and listen to her disapproval. Right, do you think
00:04:39.540 social media has made this worse? The fact that, you know, we're all constantly logged in, we're
00:04:43.600 all looking for approval, we're all looking for likes, et cetera. I think
00:04:48.000 it's a huge part of it. I think, yes, they're looking for approval and likes, but I think
00:04:52.460 it's also the distillation of one's ideology down to one's, you know, most thoughtless tweet,
00:04:58.560 right? Surely we're an amalgam of all the ways that we interact, our interactions, our opinions, everything else.
00:05:08.020 But because I can look into your background and dig up the one time that you used, you know, a word in the wrong way, or you liked something you didn't think about and retweeted it,
00:05:19.220 I think these things mean that we're walking on eggshells.
00:05:24.480 We're constantly so anxious about being misunderstood.
00:11:30.000 But the fact that sometimes it comes out every three times you pull it, sometimes every five times, means you're much more likely to go back.
00:11:36.100 We know that. That's very basic human behavior.
00:11:38.560 Now, that variable reward system exists on all social media sites.
00:11:42.500 It exists when you see that little notification at the bottom.
00:11:45.980 It exists when you're going to get some sort of validation, when you get the retweet.
00:11:49.980 It exists when someone likes something.
00:12:12.960 That means I want to feel like I'm getting good at something.
00:12:15.720 Now, what better thing than if I give you a game that goes ding, ding, ding, gives you a reward every little while, every little increment, so I feel better,
00:12:22.280 and I get that dopamine rush and I play more.
00:12:46.980 There are dozens of people with PhDs in psychology,
00:12:50.440 you know, way smarter than the average person in terms of how they understand human behavior and how they understand themselves,
00:12:58.020 all ensuring that you spend as much time as possible on these platforms.
00:13:02.640 And do you think there's anything that can be done? First of all, what's going to be the impact of this, if you project it into the future?
00:13:09.720 And is there anything we can do as a society, maybe legislatively, to start to curb some of these things?
00:13:16.500 Yeah, I mean, I think we're already beginning to see it.
00:13:20.400 So, gaming addiction is already accepted
00:13:23.140 in places like Japan, not yet in the States,
00:13:27.320 but I think it's happening. We know that.
00:13:29.460 We know that, again, even from a psychological point of view,
00:13:35.780 you know, self-harm, which we tend to see quite a lot of,
00:13:41.140 especially with adolescents and adults,
00:13:43.160 we're beginning to see in digital form.
00:13:45.300 So, digital self-harm, where young people are bullying themselves online anonymously.
00:19:39.240 and they try to install smoke detectors for free.
00:19:43.440 It's a perfectly noble and worthwhile thing to do.
00:19:45.500 Now, one of the things they can never get their head around, and I can't entirely understand it either,
00:19:49.940 is that with nearly everybody, they ring on the door and say, "Firemen here,
00:19:54.080 we're just here to install a smoke detector," and basically you can get them to install one,
00:20:01.440 but they really balk at two or three, and the typical apartment really needs three. The strange
00:20:06.400 thing is, that's like saying, no, I don't want one in my child's bedroom. It's a really weird thing
00:20:10.660 to say, but for some reason people just draw the line at one, maybe two, when three is necessary.
00:20:17.920 OK, now, I used that same kind of Pythonesque, you know, bizarre approach to suggest a solution,
00:20:26.680 which is a very simple thing in behavioral science: it's just re-anchoring. And partly,
00:20:32.900 OK, I suggested absolutely outrageous things, which would never be legal or acceptable, which is to actually outrage people by saying, you know, well, actually, if you're Korean, you'd be entitled to three,
00:20:49.700 but since you're African-American, you only get two.
00:20:51.720 OK, I can see why that might be an issue.
00:25:59.320 You and I met two years ago at Kilkenomics in Ireland, doing comedy. Well, yeah, I was doing,
00:26:07.920 you were doing the same thing, I was doing comedy, and we were chatting in the bar after one of the
00:26:14.720 shows that we were on, and you were telling me about this new thing in China, where they're going
00:26:20.020 to have this system where they're going to track everybody's behavior and all this stuff. And I'll
00:26:24.540 be honest with you, Pippa, I really like you, but I was sitting there listening to you, going,
00:26:28.380 she's a little bit crazy. She could be a little bit crazy. Maybe she smoked a lot of weed when
00:26:35.180 she was... Really? Yes. So interesting. Yeah. I was sitting there, and then a year later, boom, it's
00:26:41.940 happened. It's real. And that's what you were talking to us about last time you were on the show.
00:26:47.240 What the hell is going on, and what's to come? Because now I believe you. Thank you. OK, so let's
00:26:53.380 quickly go through what is already real, right? If you jaywalk in... not all of China, they've only
00:27:00.520 rolled it out in one city, but they're going to roll it out nationally. If you jaywalk,
00:27:03.640 it clocks that you've done it, because the cameras are ubiquitous now, and the facial recognition is
00:27:10.360 incredibly strong. They have the most valuable artificial intelligence startup in the world, in
00:27:14.520 fact the most valuable startup in the world, called SenseTime, and it can recognize an individual out
00:27:19.580 of a crowd of 10,000, and it can recognize your emotional state at any given time. So it clocks
00:27:25.560 that it's you, and particularly you. Next thing you know, you pick your mobile phone up, and the fine
00:27:29.820 for having jaywalked is already in your text messages, if not already deducted from your bank
00:27:36.260 account, and your name and/or your government number is already broadcast on the OLED screen
00:27:42.680 above the nearest intersection. So you've now been broadcast to
00:27:47.720 everyone nearby: that you are bad and you just violated the law.
00:27:50.920 Now, this is important, because what it does is it affects, effectively, your personal Uber score.
00:27:56.100 The social credit system is based on the idea that you're given a score, which reflects your social compliance.
00:28:03.020 So if you Google stuff that they don't want you looking at, your score goes down.
00:28:07.740 If your brother or your sister does it, your score goes down.
00:28:13.180 Right, because Mao always said the best eyes and watchers aren't the government; it's to get
00:28:20.300 everybody to report on each other. And this is, I mean, the Stasi would love this system,
00:28:25.560 but it's bigger than that. Within the last few weeks, it's been announced that the way you
00:28:32.340 tap, swipe and pinch on your mobile device is a better indicator of who you are than your
00:28:39.000 thumbprint. So now, even if you're using someone else's phone, they know it's you.
00:28:44.340 Triangulate that with the way you walk: it turns out your walking gait is also a
00:28:49.320 better indicator of who you are than your thumbprint. So what they're doing is
00:28:54.900 taking all these things and triangulating. Think about it as
00:28:59.040 previously independent silos of data; now they triangulate, using artificial
00:29:03.780 intelligence to pull it all together. But it goes even deeper than that.
00:29:08.660 Recently they arrested someone at a rock concert of like 60,000 people, because,
00:29:13.400 again, the face went by one camera, bang, it clocks, this person is wanted, and
00:29:18.600 they went to the person's seat. You know, the debit card, yep, the ticket, up
00:29:22.680 there, arrested. I think they now have 11 million people who can buy a
00:29:28.920 train ticket or a plane ticket but can't board it. They won't permit you. So,
00:29:33.460 effectively, wherever you are, they're putting you into digital prisons. So I
00:29:40.040 think one of the spouses of one of the dissenters, she's, like, blocked into a very small, few-block
00:29:49.320 area around her house, and any time she tries to move beyond it, the officials are alerted and the
00:29:54.800 police will be right there. So she literally is stuck in a digital prison. This is basically what
00:30:00.540 the technology permits. And it rewards you if you do things that are supportive of what the
00:30:06.420 government's interested in, and it basically penalizes you if you don't. We're doing the
00:30:10.680 same thing in the West. It's just that it's not government that's doing it. We have private
00:30:14.640 entities that do it. We have Facebook that does it. We have Amazon that does it. We have Uber
00:30:18.720 that does it. And I don't know if you saw, but there's been an announcement that there's been
00:30:24.000 a deal, it seems, between Google and Mastercard. So now if I go to, you know, I don't know, a department
00:30:33.620 store and I buy this color lipstick, Google's going to know, and now they know what they're
00:30:39.320 going to advertise to me. So I'm concerned about this, and I write a lot about this in my book,
00:30:45.360 because imagine what we've got: it's a world where you are emitting data points about yourself all
00:30:51.700 the time without even knowing it, because you've turned all the fitness apps off on your phone,
00:30:55.980 but the fact is, the way you walk is revealing a lot about you, including, by the way, your
00:31:01.440 cardiac condition; a lot about your health is revealed by how you walk. So there's this multiplicity
00:31:06.700 of data points that you're throwing off, but you don't know who gets to see it, and whoever does
00:31:12.200 get to see it, they know more about you than you know about yourself. And by the way, it's a two-way
00:31:17.840 thing. If I'm a chief executive and I go on, let's say, this program, and I start talking about my
00:31:23.920 company and I'm lying, this same facial recognition technology can identify your micro facial
00:31:29.940 movements, and they know that you're lying, and they can set the algorithms to short the stock
00:31:34.740 of your company as you're talking on, say, CNBC's Squawk Box. So think of it as almost like a crystal
00:31:41.280 ball of data points, and on the one side it will let us conjure forth answers that will do things
00:31:47.480 like solve cancer. Literally, we will solve extraordinarily difficult problems by having
00:31:52.320 all this data. But it also changes the balance of power between companies and customers, because
00:31:58.600 companies will have so much information that, I've argued, you know, we used to have insider trading
00:32:04.660 laws; we may need "inciter trading" laws, because if I am Mastercard now, or Google, my knowledge of you
00:32:11.920 is so great. Forget Cambridge Analytica. They only had 5,000 data points on 81 million people,
00:32:18.680 and that was enough to begin to influence your political position. I could sell you way more
00:32:23.380 than a political position. If I have more than that, I could sell you a refrigerator. I can
00:32:26.800 say anything I want. So I think human beings are very vulnerable to this kind of power being
00:32:32.120 exercised. And similarly, what kind of world do we have if people don't know how they appear?
00:32:37.440 And should we have a right to know how we look, with that data slice looking at us, at any given time?
00:32:45.480 So in the book, I've tried to lay this out, because we have a lot of leaders who are making decisions about the landscape of our future who don't understand what I'm saying.
00:32:55.860 And so they're literally missing a profound shift in humanity.
00:33:18.560 So, I mean, joking aside, that literally means,
00:33:22.820 if you're being tracked every single second of every day,
00:33:26.640 if they're reading what you're doing, if they can manipulate you,
00:33:31.300 is that the end of free will, eventually?
00:33:33.020 This is the... so, what's really interesting to me is, when I was in college, I studied political philosophy, and everyone went, oh my God, you will never get a job.
00:33:42.680 I mean, you are permanently unemployed if you study... Now we get to this, and I'm like, these are questions of political philosophy.