00:02:52.740Actually, the most shocking and most concerning example I decided not to publish, because I think it's so dangerous that anyone could misuse it.
00:03:00.420And, you know, I wouldn't want to facilitate that.
00:03:03.060But to give some examples of just corporate surveillance: your phone at night is sending information that it has collected throughout the day.
00:03:10.200And it sends it at night so that you don't notice your battery draining, because people typically plug in and charge their phones at night.
00:03:18.220And companies are tracking very sensitive things, like whether you sleep okay at night or not,
00:03:23.620at what time you wake up, whom you sleep with, which can tell people who
00:03:29.260your partner is, but also things like whether you're having an affair. And if you have a smart
00:03:33.680car, it's tracking not only where you go and how fast you drive and how well you drive, but things
00:03:38.640like the music you listen to and what that tells about your mood. But also, the seats in your
00:03:44.100car are measuring your weight, and potentially selling that information to insurance
00:03:48.660companies, who might want to know if you're getting a bit too slim or a bit too fat.
00:03:52.460And they know things about your health records, your educational records, your purchasing power,
00:04:01.440your browsing history. And people search for the things that they care most about, the things that worry
00:04:07.520them: their diseases, whether they can pay their loan. So it's very, very
00:04:13.020sensitive. It's just shock after shock after shock. And Carissa, why is this a problem? Well, you know,
00:04:19.160because people go: well, so what? All they're doing is collecting data. What does it matter?
00:04:23.260It matters because privacy is power, because whoever has the data in the digital age will have
00:04:30.820the power. Data gives them not only the possibility of selling that data, which gives
00:04:35.360them money, which already makes them very powerful, but also the possibility of trying to predict
00:04:40.640what you're going to do next, and of trying to influence and change that. And that is incredibly
00:04:45.500attractive for companies, but also for governments. And it's something that we haven't been focusing
00:04:51.340on enough. I was reading this book by Bertrand Russell, the philosopher, called Power, and he
00:04:57.480argues that we should think about power as something like energy, in that it can transform
00:05:02.140itself from one thing into another. So if you have enough economic power, then you can buy votes or
00:05:07.100buy politicians. If you have enough political power, you can buy or otherwise acquire military power
00:05:12.220and so on. And there's this really important kind of power in the digital age having to do
00:05:17.120with forecasting and prediction that has always been there in a way. We've always known that the
00:05:23.000more knowledge you have on somebody, the more power you have over them. But we have never had
00:05:27.500this capability of amassing so much data and of analyzing it. So it matters because the more
00:05:32.780others know about you, the more you're vulnerable to them in all sorts of ways that are invisible
00:05:36.860to you, but you might get discriminated against for a job or a loan application or an apartment.
00:05:42.580You might get extorted. Your identity can be stolen and your democracy can be stolen as well
00:05:49.100because it influences how we relate to one another as citizens. And in your book, you use many,
00:05:55.940many examples. The one that I found particularly powerful was the example of the man in Virginia.
00:06:01.040Oh, yeah. So I have a friend who was training at the time to be a data analyst, and he was
00:06:06.840telling me what it's like to go through that training. And one of the exercises that they
00:06:11.960had to do is just pick a random person anywhere in the world and just research anything and
00:06:17.760everything you can about them. So this trainee data analyst picked a random guy in Virginia
00:06:23.780and he learned everything about him. So if I remember correctly, this was a man who had
00:06:28.100diabetes, who was having an affair. He knew what kind of car he drove, what job he had,
00:06:34.160who his friends were, who his family was.
00:07:19.920because most data is actually collected by companies,
00:07:22.820and only then do governments make a copy of the data.
00:07:25.280But it's really the companies that are best at,
00:07:27.300and have more resources for, collecting data.
00:07:30.100And they have many reasons to use that data
00:07:32.940against you. Like I mentioned, you can be discriminated against in all kinds of settings.
00:07:37.840So essentially, they're undermining equality and equality of opportunity. You are not being treated
00:07:43.080as an equal citizen. You're being treated on the basis of your data. But at the same time,
00:07:48.300we shouldn't forget governments because it almost makes no sense to separate the corporate
00:07:52.980surveillance from the government surveillance right now because they share data all the time.
00:07:58.620So every time a company collects data, that data can potentially go to the government and the government very often just makes a copy of the data immediately.
00:08:08.380But also every time the government collects data, that data also ends up in the hands of corporations.
00:08:12.500So we've seen this in the coronavirus pandemic in the UK.
00:08:16.480The NHS has given data to Palantir, this very shady data company that was partly funded by the CIA.
00:08:24.320And they gave them not only data about people's health, which might be more understandable for fighting the coronavirus pandemic, but Palantir also got data, for example, about people's criminal records.
00:08:37.060And there was no explanation as to why exactly this company needed that data and what's going to happen to that data.
00:08:43.500So the flow of information goes both ways to such a large extent that it almost makes no sense to differentiate between them.
00:08:51.020And the other thing that people often say when I've talked to people about this is it's always
00:08:56.000the same thing, which is, well, I'm not doing anything wrong. I don't care. What should I
00:09:01.140worry about? What do you say to people who think about these things that way?
00:09:05.360Well, there are at least two responses. One is: you actually do care, because nobody wants to have
00:09:10.500their identity stolen. That's a hassle. It can actually land you in jail
00:09:14.980without your having done anything wrong whatsoever, because somebody else uses your name to commit
00:09:19.060crimes. But also, you are vulnerable in all sorts of ways even if you do nothing wrong.
00:09:26.080So, for instance, maybe you have a disease, or maybe you have a disease that you don't know about
00:09:30.380and that a company wants to pick up on before even you do, and then discriminate against you
00:09:35.860next time you ask for a loan or a job or something like that. And so there are all kinds of reasons
00:09:41.020why our privacy is important even if we do nothing wrong. But furthermore, even if you didn't care about
00:09:45.660yourself, even if you said, I'm a masochist, I want my identity stolen, I want to be exposed and
00:09:50.340extorted, I'm fine with that, it's just an interesting experience, you should still protect your
00:09:55.000privacy, because privacy is actually a collective thing. This narrative that the tech companies have
00:10:00.600sold us, that privacy is just a personal preference, something individual, and that if you're not shy
00:10:04.560and you're not a criminal then you have no reason to protect your privacy, is totally misguided,
00:10:09.500because when you expose yourself, you expose others. So every time you share your location, you're
00:10:16.120sharing data about your neighbors and your co-workers. Every time you share data about your
00:10:20.980genetics, you're sharing data not only about your siblings and parents and kids and cousins, but about very
00:10:26.580distant kin who can suffer really bad consequences, like being deported or being denied health
00:10:31.700insurance or life insurance, even if they didn't take the test themselves, and you don't
00:10:38.760know that you're actually kin and you've never met that person. And in the same way, society
00:10:46.180benefits from people protecting their data. So one example is Cambridge Analytica. Only 270,000
00:10:52.860people gave their data to the firm. And with that data, the political firm managed to get its
00:10:57.880hands on the data of 87 million people, who were the friends of these original 270,000 people.
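To make the scale of that amplification concrete, here is a minimal Python sketch. Every number in it is an illustrative assumption (a toy network size, an assumed average friend count, uniform-random friendships, whereas real social graphs are clustered); it is not a model of Facebook's API or of Cambridge Analytica's actual method:

```python
import random

# Toy illustration: a small fraction of users consents to an app, but the
# app can also read each installer's friend list, so the data of many
# non-consenting users is swept up as well. All numbers are assumptions.

NUM_USERS = 1_000_000     # toy network size
AVG_FRIENDS = 200         # assumed average friends per user
NUM_INSTALLERS = 3_000    # users who actually consent (0.3%)

random.seed(42)
users = range(NUM_USERS)
exposed = set()
for person in random.sample(users, NUM_INSTALLERS):
    exposed.add(person)                                # the installer's own data
    exposed.update(random.sample(users, AVG_FRIENDS))  # plus their friends' data

print(f"Consenting users: {NUM_INSTALLERS:,}")
print(f"Users exposed:    {len(exposed):,}")
print(f"Amplification:    {len(exposed) / NUM_INSTALLERS:.0f}x")
```

The real-world ratio in the Cambridge Analytica case, 270,000 consenting users to 87 million exposed, works out to roughly 320 to 1.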
00:13:44.120It's a good question and it's complicated.
00:13:46.160The first reason was that governments after 9/11 were very scared by what had happened.
00:13:52.860They wanted to prevent it at all costs.
00:13:54.980And it was intuitive to think that the more data they had on people, the more they could do something about it and keep people safe.
00:14:00.820Now, it just turns out that big data is not the kind of analysis that is good for preventing terrorism.
00:14:06.300Big data is fantastic at knowing what you're going to buy tomorrow, because we have data from billions of people who buy things every single day.
00:14:13.040But terrorism will always be a very unusual event, and that makes it very hard for big data to understand.
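The rare-event point is essentially the base-rate problem. A back-of-the-envelope calculation, with purely illustrative numbers that are not from the interview, shows why even a very accurate detector is swamped by false positives when the thing it looks for is rare:

```python
# Illustrative base-rate arithmetic; every figure below is an assumption.
population = 60_000_000        # roughly a UK-sized population
actual_plotters = 100          # terrorism is vanishingly rare
detection_rate = 0.99          # detector catches 99% of real plotters
false_positive_rate = 0.01     # and wrongly flags 1% of innocent people

true_flags = actual_plotters * detection_rate                        # ~99
false_flags = (population - actual_plotters) * false_positive_rate  # ~600,000

precision = true_flags / (true_flags + false_flags)
print(f"People flagged:           {true_flags + false_flags:,.0f}")
print(f"Chance a flag is correct: {precision:.3%}")   # about 0.02%
```

Predicting purchases works because the base rate is enormous; predicting terrorism fails because the base rate is almost zero, so nearly everyone flagged is innocent.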
00:14:18.860So the first reason was because governments thought they had an interest in keeping the data.
00:14:23.440Now, you know, 20 years have passed since then, and I think governments are slowly learning that having all that data stashed away is a national security danger.
00:14:35.160So I think that's going to motivate them to change the law.
00:14:38.920But another reason why laws haven't been put in place is because data is very hard to police.
00:15:47.220Because, yeah, I knew what the answer was. Can you explain why, though, Carissa?
00:15:53.520Yeah. So one reason is that we've just been forced to use digital stuff more and more and more.
00:15:59.820And, you know, one advantage is that the narrative that we can always opt out, that it's up to us, has totally disintegrated, right?
00:16:08.640It's obvious to everyone that you can't opt out if you want to be a participant in society.
00:16:12.660There's no way to opt out of interacting with the digital world.
00:16:17.140But we have been forced to use it a lot more and therefore a lot more data is being collected.
00:16:22.640Also, times of crisis are notoriously dangerous for civil liberties.
00:16:27.300These are times in which governments very often pass laws and measures that wouldn't be accepted in other circumstances.
00:16:35.960And so in this case, I think there was a well-intentioned idea that the more data we have, the more we can have tools to stop COVID.
00:16:44.340It turns out, once again, that that actually hasn't been the case, that AI hasn't been helpful in stopping COVID.
00:16:50.860And, you know, the tracking applications haven't been the most important aspect of fighting COVID.
00:16:55.620But the measures are there, and they don't have a sunset clause, so we don't know when they end
00:17:01.180and what happens to all that data that has been collected. And one thing the government has done,
00:17:06.620and nobody's talking about this, but I think that this is absolutely major: they've got rid of cash.
00:17:12.760Can you explain why that is such a disastrous thing for our society?
00:17:17.620Especially for me, because I don't want to pay tax, but that's beside the point.
00:23:02.420But we don't need to buy and sell personal data for that.
00:23:05.720We don't need to allow data brokers to know everything about us.
00:23:08.600We don't need to allow the government to know everything about us.
00:23:11.400So there's a huge difference between using personal data for things that are important and then having a market of personal data in which anyone can buy it. The highest bidder gets it. That's very different. And that's what we should avoid.
00:23:24.840I mean, people might give another example of, let's say, the lending, which you've referred
00:23:30.300to a number of times already. You could say, well, they're able to make better decisions. I mean,
00:23:35.2802008 would suggest otherwise. But in the past, if you wanted to get a mortgage, you'd have to go to
00:23:41.000the bank and the guy in the bank would have to make a personal decision based on how you were
00:23:45.880dressed and whatever else. Now, we have all this data, which is very good at predicting whether
00:23:52.220you're going to be able to repay your mortgage. Surely that's a good thing, because, and
00:23:56.180this is the argument, we're not giving people debt they can't handle, etc., etc. Well, it
00:24:01.240depends. It has to be shown; it's not enough to just say this works. I want to see how it works,
00:24:05.940and it hasn't been shown that way. So, for instance, one argument is your banker might have
00:24:10.900all kinds of prejudices. They might be a racist, they might be a sexist. That's actually quite
00:24:14.240plausible and even likely. But it turns out that algorithms are... You've clearly met a lot of
00:24:19.400bankers. No, I actually haven't. But it turns out that algorithms are just as biased as, or worse than,
00:24:31.300people. So one thing, and again, it depends on the kinds of proxies that
00:24:37.780these algorithms are using, is that in many cases it's unfair that we are being treated as a category and
00:24:44.120not as an individual. So say you live in a certain kind of postcode in which people tend not to pay
00:24:49.300their loans. But you might be different, right? You might be super serious and super
00:24:54.000good at paying back your loan. And you're not going to get that loan because
00:24:58.140your neighbors don't pay their loans or your friends on Facebook don't pay their loans. And
00:25:01.980that seems quite unfair. So one of the things that I have proposed recently, in an article for
00:25:06.420the Harvard Business Review, is that we should only allow algorithms that make really important
00:25:12.460decisions, like giving somebody a loan, to go out into the world if they have passed
00:25:18.140a randomized controlled trial. Just like we do with medicines: we don't allow any medicine to
00:25:23.360just go on the market without having been tested, not even in a crisis like the coronavirus pandemic.
00:25:27.460We had to make sure that the vaccines were safe, and they went through randomized controlled trials. Well,
00:25:34.260in the same way, an algorithm could go through a randomized controlled trial. We could have an
00:25:38.860agency, like the FDA in the United States, the Food and Drug Administration, to make sure that
00:25:44.340an algorithm is safe and to prove that it's actually making better decisions than the humans it would replace.
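As a rough sketch of what such a trial might look like in code (my illustration of the general idea, not a protocol from the interview or the Harvard Business Review article): randomly assign each incoming application to the algorithm or to the incumbent human process, then compare outcomes such as default rates between the two arms.

```python
import random

# Hypothetical RCT harness for a loan-decision algorithm. All names and
# numbers below are illustrative stand-ins, not a real protocol.

def run_trial(applications, decide_algo, decide_human, observe_default):
    """Return the default rate among approved loans in each arm."""
    results = {"algo": [], "human": []}
    for app in applications:
        arm = random.choice(["algo", "human"])            # randomization step
        decide = decide_algo if arm == "algo" else decide_human
        if decide(app):                                    # loan granted
            results[arm].append(observe_default(app))      # True = defaulted
    return {arm: (sum(out) / len(out) if out else None)
            for arm, out in results.items()}

# Toy stand-ins, purely for demonstration:
random.seed(0)
apps = [{"score": random.random()} for _ in range(10_000)]
decide_algo = lambda a: a["score"] > 0.4                  # hypothetical algorithm
decide_human = lambda a: a["score"] > 0.5                 # hypothetical human process
observe_default = lambda a: random.random() > a["score"]  # riskier = defaults more

print(run_trial(apps, decide_algo, decide_human, observe_default))
```

An FDA-like agency, as proposed, would presumably also require the trial to report error rates broken down by protected groups, not just overall returns, before approval.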
00:25:50.900Carissa, isn't part of the problem here that we have this new technology, it's brand new,
00:25:56.700we don't actually understand the full ramifications of this technology?
00:26:02.000Yeah, so it's really reckless to just let it loose into the world. Essentially, we are
00:26:06.720treating people as guinea pigs, and that's totally wrong. And we used to do it with medicine.
00:26:11.040So when you went to the doctor in the 1950s, you could get signed up for an experiment without knowing.
00:26:18.240And after the Nuremberg Code, which is like one of the most important medical ethics codes, we decided that we don't experiment on people without their consent.
00:26:27.100So if you want to carry out clinical experiments, you inform people about what you're doing.
00:26:32.840You get their consent and then you give them some kind of compensation.
00:26:36.520And we should do the same thing. And right now we're not doing it:
00:26:38.920we are being guinea pigs of algorithms all the time without even knowing it.
00:26:44.100And one of the things I found really fascinating about your book, and this is the bit that I
00:26:49.400mentioned I want to reference in mine, is the fact that people often don't understand that
00:26:54.700intentionality and malice and deliberate evil are actually completely unnecessary for harm to be
00:27:02.940caused. So the collection of data today, completely innocently or indeed for beneficial purposes,
00:27:09.740will often end up being useful to people who do want to use it for evil. Can you talk a little
00:27:15.920bit about that? Yeah, that's really important to have in mind, because
00:27:21.020I think that most people in tech and most people in finance just have
00:27:26.360good intentions, or at least not bad intentions. They just want to do well in life, they want to
00:27:30.140innovate and so on. But Hannah Arendt had this amazing term called the banality of evil.
00:27:36.860And the idea is when we think of evil, when we think of the Nazis, we think of
00:27:40.820the paradigm of somebody like Hitler, who, you know, you imagine somebody who hates Jews and
00:27:46.700wants to hurt people and wants to kill people. And that's our paradigm of evil. But in fact,
00:27:51.160most of the time, evil gets perpetrated by perfectly normal people who are just bureaucrats.
00:27:56.420And that was her conclusion with the Eichmann trial, that this guy was just a bureaucrat. He wasn't a monster. He was just following orders. He wasn't critical. He was just a cog in the system.
00:28:08.580And it's really important that we don't become cogs in a system that creates injustice and that erodes democracy and even creates evil.
00:28:17.960And it's not enough for us to think of ourselves as good people and have good intentions.
00:28:22.780We need to be a lot smarter than that and a lot more critical to avoid evil.
00:28:27.580And one of the examples you give on that very issue, speaking of the Nazis, is the fact that survival rates for Jews in different European countries were different because those countries had different practices of collecting data.
00:28:41.300And those that collected data by ethnicity, which may have been perfectly reasonable under a legitimate, democratic, sensible, non-discriminatory government, then fell into the hands of people who wanted to use it to discriminate, to kill, to murder, to imprison.
00:28:57.860And that innocent collection of data led to more people being killed because of it, right?
00:29:03.980Yeah, I think it's a perfect example to show that personal data is a ticking bomb.
00:29:08.060So the Dutch had a very good system of statistics.
00:29:11.320They had a guy called Lentz, who was one of the pioneers of statistics,
00:29:14.560and he wanted to build a system that followed people from cradle to grave.
00:29:18.240And in his census, there were a lot of questions,
00:29:21.020and there was data collection about your religious affiliation,
00:29:24.080but also your ancestry and things like where your grandparents lived.
00:29:26.900Now, in contrast, in France, they had made a decision back in 1872
00:29:32.120not to collect that kind of data, for privacy reasons.
00:29:35.800And so when the Nazis arrived in France and asked, you know, where are the Jews,
00:29:39.360they said: we have no idea how many Jews we have, let alone where they live.
00:33:58.880So in a survey I did with a colleague, Siân Brooke, we found that about 92% of people
00:34:04.060have had some kind of bad experience related to privacy online.
00:34:07.260Sometimes it's about getting your credit card number stolen.
00:34:09.460Sometimes it's about being exposed on Twitter.
00:34:11.820Sometimes it's about, you know, having an ex-boyfriend track you or something like that.
00:34:16.960And the more we have bad experiences, the more we learn that actually privacy was important after all.
00:34:21.540And the more we get angry that these companies are making us vulnerable.
00:34:25.240But you're right that we need a lot more consciousness.
00:34:28.100And in particular, we need to be much more aware of the political implications of privacy.
00:34:36.080This is not just an individual thing; or rather, it is an individual thing, but it's also much more than that.
00:34:36.080And do you think that, for instance, when you look at something like China, that is a warning for us all, particularly in this country?
00:34:43.980It's such a huge warning. It's such an interesting case for so many reasons.
00:34:48.040But one reason is that, you know, they claim that they are more ethical than we are because we're doing the same thing.
00:34:53.320We're scoring people, but we're just not telling them. And there's actually some truth to that.
00:34:57.780Of course, the flip side is that we have a lot more freedom.
00:35:02.360And even if you're being scored, say, as a consumer, that's not going to impact whether you get, I don't know, a place at a university or a job.
00:35:22.420And I am very worried that we're walking towards it.
00:35:24.920But now a really interesting lesson from China is coming through right now as we speak.
00:35:30.840And it's this: one argument from the West for not regulating data was that we have to have as much data as possible, because look at China, they're collecting so much data.
00:35:40.440And if we don't collect enough data, we will be at a disadvantage.
00:35:43.720And now it turns out that China is passing one of the strictest privacy laws in the world right now.
00:35:48.840That's very interesting. There's a lot of speculation about why exactly they're doing it.
00:35:52.060They're hurting their own stocks. So why exactly are they doing it?
00:35:54.620But one reason, I think, and it may not be the only reason, but one really important reason, is that they are realising how dangerous it is to have so much personal data stored, because it's a national security danger: the West is going to hack it sooner or later and make use of it.
00:36:12.300So if even China is regulating privacy, we really have to get our act together quickly.
00:36:18.460And how close do you think we are to that kind of dystopia, Carissa, where, you know, you go and apply for a loan and, for reasons unknown to you, you're refused? Your credit rating is excellent, you've never been in debt, you've never defaulted on a payment. Yet because you don't know that you have a genetic condition, but a company does, you're not going to get the money.
00:36:41.640It's really hard to say. Part of me thinks that it probably is already happening, but it's just
00:36:48.240very hard to tell, because everything happens underground and we can't see it. So it wouldn't
00:36:53.380surprise me if those things were already happening; they just haven't come to
00:36:57.600light. And it's hard to predict, because if we get one
00:37:02.760prime minister or another in a few years, it can really make a difference, and likewise if the U.S.
00:37:07.360gets a different president. Things can change so quickly, for better and for worse.
00:37:13.680Yeah, as we've seen in recent days. Carissa, moving on a little bit, I know that you are
00:37:18.680an associate professor at the Institute for Ethics and AI. Do you mind if we talk a little
00:37:23.100bit about AI more broadly? Yeah, sure. Because I remember I was listening to Elon Musk a few
00:37:29.980weeks ago. He was asked a bunch of broad questions, but one of them was:
00:37:36.720what terrifies you the most about the future?
00:38:22.920Yeah, we should definitely be worried.
00:38:25.160So research in the ethics of AI, one way to categorize it, and there are
00:38:30.200many ways, is that it broadly divides into two camps.
00:38:32.780The people who are really worried about what is often called superintelligence, the point at which AI becomes more intelligent than human beings: what are they going to do then? Are they going to get us hooked on heroin, or are they going to take over the world, and so on?
00:38:48.780And the people who are worried about more short-term problems, like the bias that algorithms are introducing, and so on.
00:38:56.360I think that those concerns about AI taking over the world and what are they going to do are legitimate and we should be thinking about it.
00:39:03.920But my own take is that the short-term risks are much more real, tangible, and they're just like, they're here.
00:39:11.220They're here and they can lead to a dystopia as bad as, you know, the AI that gets us hooked on heroin.
00:39:17.320So I think having this complacent attitude towards AI and data, and sliding into a totalitarian regime that we can't resist because we're being surveilled all the time, would be just as bad as the worry that robots might become advanced enough to take over the world.
00:39:31.700I think we're pretty far from that still.
00:39:34.540Give us some examples of the smaller, more short-term stuff that you're concerned about with artificial intelligence.
00:39:42.220So one concern is, you know, how is AI impacting things like equality?
00:39:48.680One thing to note is that data and AI are very closely related, because the most successful kind of AI at the moment uses a lot of data, and much of that is personal data.
00:39:57.240So it's hard to differentiate those two.
00:39:59.980So one question is, you know, is it OK for us not to be treated as equals anymore, but to be treated on the basis of our data?
00:40:05.760And what does that say about our society and what implications does that have?
00:40:09.200Another very important issue is bias. It's really hard to have an AI that is not biased, and it's really difficult to identify the bias, because often we're not aware of it. And there's a lot of evidence that shows that AI is intensifying inequalities that were already there. So, as a society, we've been sexist for a long, long time. We're pretty good at it. But AI can make us even more sexist without us realizing. And the same with racism.
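For context on how such bias gets surfaced in practice, here is one minimal, hypothetical kind of audit: compare the algorithm's decision rates across demographic groups. The group labels, data, and threshold below are illustrative assumptions, not anything specified in the interview; the 80% heuristic is the "four-fifths rule" used in US employment law.

```python
from collections import defaultdict

# Minimal disparate-impact check, with made-up group labels and data:
# compare an algorithm's approval rates across demographic groups.

def approval_rates(decisions):
    counts = defaultdict(lambda: [0, 0])       # group -> [approved, total]
    for group, approved in decisions:
        counts[group][0] += int(approved)
        counts[group][1] += 1
    return {g: a / t for g, (a, t) in counts.items()}

decisions = [("A", True), ("A", True), ("A", False),
             ("B", False), ("B", False), ("B", True)]
rates = approval_rates(decisions)
print(rates)  # roughly {'A': 0.67, 'B': 0.33}

# Four-fifths rule heuristic: flag a potential problem if any group's
# rate falls below 80% of the best-treated group's rate.
print("Potential disparate impact:", min(rates.values()) < 0.8 * max(rates.values()))
```

A check like this only catches one narrow kind of bias; proxies such as postcode, which the discussion returns to above, can produce unfairness that aggregate rates alone won't reveal.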
00:40:39.200Another issue that I'm worried about...
00:43:51.660And one reason it's unwise is that it creates societies that are very harsh, because they never forget, but also because it kind of stops progress.
00:43:59.840And what about the other argument? Sorry, Francis, just to finish on this very point.
00:44:03.680You know, we talk about equality, but you might say, from a kind of ruthless Russian mentality like mine, you could argue, couldn't you, that, well, look, these algorithms, and I take the point about the disparity when it comes to sex,
00:44:20.480in many cases are actually making better decisions. So, for example, if you are about to get a
00:44:26.840mortgage and you do actually have a condition which means you're highly likely to die in the
00:44:30.840next five years, shouldn't the lender know that before they give you a loan? Isn't this just a way
00:44:36.260of making better decisions and being fairer? Because we never really had equality. The
00:44:42.220fact is that if you don't pay back your loans, you shouldn't be treated as equal by your bank to
00:44:48.320someone who does. Do you see what I'm saying? The algorithm is just amplifying the ability to
00:44:52.900make accurate decisions, some people might argue. Yeah. So I'm reading an interesting book at the
00:44:57.800moment called The Tyranny of Merit, about the kinds of effects it has on society to
00:45:03.300think that people are self-made and that we should judge them accordingly,
00:45:09.720and I recommend it. But the first thing to say is: unless we have randomized controlled trials
00:45:15.820that show that these algorithms are actually making better decisions and getting
00:45:21.400better returns, then we shouldn't believe it. We shouldn't believe it just because
00:45:26.720somebody says it; we need randomized controlled trials. So that's the first thing. But secondly,
00:45:30.100let's say that they do work like that, and let's say that they are more effective. We also have to
00:45:34.320ask ourselves what kind of society we want to live in. Do we want to live in a society in which,
00:45:38.400because somebody gets cancer, they can't get a loan? Shouldn't we have rules for that, just...