A vaccine-passport surveillance state isn’t just sci-fi
Episode Stats
Words per Minute
186.31247
Summary
Ann Cavoukian was Ontario's Privacy Commissioner from 1997 to 2014. She's now Executive Director of the Global Privacy and Security by Design Centre, and she's a pioneer of the idea of privacy by design. In this episode, Ann talks about privacy in the digital world, what's at stake, and what we should be looking out for.
Transcript
00:00:11.800
Hello, I'm Anthony Furey. Thanks for joining us for the latest
00:00:21.260
episode of Full Comment. Today's episode is all about privacy. What are the big issues now? What
00:00:26.940
are the upcoming privacy concerns we have to watch out for? And what has privacy meant
00:00:31.220
during COVID-19? Our guest today is a leader in the field. Ann Cavoukian was Ontario's
00:00:36.540
Privacy Commissioner from 1997 to 2014. She's now Executive Director of the Global Privacy
00:00:42.760
and Security by Design Centre, and she's a pioneer of the idea of privacy by design. Ann joins
00:00:49.320
us now. Welcome to the podcast, Ann. Thank you, Anthony. Yeah, great to have you on. Thanks
00:00:54.200
very much. What are your top concerns right now about privacy? I know there's so much terrain
00:00:59.160
we can cover and different things we're going to go into, but right now, what are the things
00:01:02.820
that you are most focused on? I'm really concerned that we're returning
00:01:07.620
to a zero-sum mindset. By that I mean you can only have a positive gain in
00:01:13.740
one area, always to the detriment of the other area. So in this case, we're talking about the
00:01:19.320
pandemic. Fear is abounding. You have to get a vaccine. There is such pressure on people to get
00:01:26.080
a vaccine. And forget about privacy. Forget about people's sensitive medical data, this data that
00:01:32.960
should not be shared with the public. People shouldn't have access to it. It's all about
00:01:37.560
the pandemic interests, public safety versus privacy. Get rid of privacy. That's what concerns
00:01:45.820
me enormously. So you believe we could do both things at the same time? We could say,
00:01:50.620
all right, we have a virus here. It's serious for a lot of folks. Let's encourage people to
00:01:54.640
get vaccinated. But at the same time, we can deal with these concerns, and we should be,
00:01:59.360
but we haven't been. Absolutely. It's like that with all incidents over time. Think of terrorist
00:02:06.180
incidents. Think of 9-11. Obviously, there was enormous fear during that time, and measures were
00:02:12.700
introduced for surveillance purposes. But once the terrorist incident, the emergency ends, often
00:02:19.500
those measures that have been introduced to get rid of privacy, they continue. So what I'm so
00:02:26.860
concerned about, there's this pressure now, of course, on everyone getting a vaccine passport.
00:02:32.800
First and foremost, this should be only information you share with your physician.
00:02:36.660
It's very personal medical information. Health-related data deserves the strongest
00:02:42.340
protection possible. And we're just ignoring all of that, making this all public. And you see,
00:02:48.140
once your vaccine passport data is retained at a particular site, there will be information
00:02:53.920
associated with it relating to the geolocation data, where you posted it. And there are enormous
00:03:02.780
fears now that these vaccine passports are going to act, they're saying like a virtual bouncer,
00:03:08.520
that they will create a new inescapable web of geolocation tracking, surveillance,
00:03:14.360
which is the exact opposite of privacy and freedom.
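To make that tracking concern concrete, here is a minimal sketch in Python; the venues, timestamps, and passport IDs are entirely hypothetical. It shows how scan records that look harmless on their own can, once they share a stable identifier, be joined into a movement trail.

```python
# A minimal sketch of the linkage concern: no single record is alarming,
# but records joined on a stable identifier become a movement history.
# All venues, times, and IDs below are hypothetical.
from collections import defaultdict

scan_logs = [
    # (venue, timestamp, passport_id) as retained at each site
    ("cafe_downtown",  "2021-11-02T08:15", "QR-7f3a"),
    ("gym_eastside",   "2021-11-02T17:40", "QR-7f3a"),
    ("clinic_midtown", "2021-11-03T09:05", "QR-7f3a"),
]

trails = defaultdict(list)
for venue, ts, passport_id in scan_logs:
    trails[passport_id].append((ts, venue))

for passport_id, visits in trails.items():
    print(passport_id, sorted(visits))  # a reconstructed trail
```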
00:03:19.860
What are some of the things that we're doing right now that we could be doing completely differently?
00:03:26.000
Like, you know, when I have had to go to restaurants and they say, okay, scan your phone here to get the
00:03:31.280
menu because they wanted to do away with paper menus or so forth, or do the contact tracing,
00:03:35.020
scan your phone here. I mean, I don't know the technical backend stuff of all
00:03:39.380
that, but I was like, ah, this is kind of icky. I just write my name down on the form for
00:03:44.280
the contact tracing. Or with the menu thing, I was like, just give me the menu. You know,
00:03:47.940
I just want to look at the paper menu, like I've done, you know, throughout my life, that kind of
00:03:51.380
thing. And I know there's some people who just love doing the new digital stuff and I'm just not crazy
00:03:55.100
about it. And I'm glad at least for now where I am in Ontario and Toronto, I've been able to do that
00:03:59.380
stuff. Well, the digital is more likely to lead to surveillance because then they can retain that
00:04:07.320
information relating to you at a particular time, at a particular place. And I know people think
00:04:14.300
it's crazy when I say surveillance will abound, but in this day and age, surveillance is abounding.
00:04:21.240
If someone is interested in finding your whereabouts on a particular date and time and where you've been
00:04:26.200
going, it's not going to be difficult to do that. That's why they're talking about these
00:04:30.640
new scanning technologies, these digital measures as being this inescapable web of tracking,
00:04:37.460
because the potential for your information to be tracked and surveilled exists. And that's what
00:04:44.180
concerns us. You see, privacy forms the foundation of our freedom. You cannot have free
00:04:49.440
and open societies without a solid foundation of privacy. So we have to protect it and guard it whenever
00:04:55.280
we can. Now, I know normally when you speak about privacy, you're speaking about it in terms
00:05:00.800
of technological effects. I think that just invariably is the case in the 21st century. But
00:05:05.960
one thing that I was thinking about as you were just talking about vaccines and privacy, there's also
00:05:10.880
a change in, I don't know what the term is, like social privacy. And even if there are no iPhones
00:05:15.440
around, there's still the kind of expectation that people will be divulging personal information
00:05:21.100
about each other. I mean, the whole idea of pushing for your friends and family members to get
00:05:26.580
vaccinated is that then we hear these messages: well, don't have an unvaccinated person over for Thanksgiving
00:05:31.540
dinner or what have you. And of course, the entire way that you can fulfill that is by screening
00:05:37.620
the people who come into your house by asking them whether they're vaccinated or not. Now, we've taken to
00:05:43.280
just kind of assuming that everybody has had 100 conversations like that in the past few months. But would you
00:05:47.540
find that that is also a privacy concern? It's a huge privacy concern because you have to understand
00:05:52.540
there are many reasons why people may not get a vaccination. I mean, I'm not anti-vax. I'm pro-choice.
00:05:59.580
And some individuals are immunocompromised. They cannot get a vaccine. They may have allergies to it.
00:06:06.560
You may recall, or perhaps you won't, right at the beginning when vaccines were first coming out,
00:06:12.340
they said, if you have any kind of allergy that requires an EpiPen, for example,
00:06:16.660
don't get the vaccine, bad news. No one is saying that anymore. So people are very concerned that
00:06:22.700
there are reasons why they shouldn't get a vaccine, medical reasons, but no one is hearing about it.
00:06:27.840
These are the concerns that I have that will proliferate over time and will impact our social
00:06:34.000
interactions. There's no question. One thing that I found really odd and quite frankly, downright
00:06:38.820
creepy and disturbing, was when the news came out that two Ontario PC MPPs, two members of Doug Ford's
00:06:45.380
government caucus here in Ontario, were not vaccinated. The names of who those two people
00:06:50.320
were, those two young women, were made public. So we knew that. They did, I guess, discuss their
00:06:56.520
issue. And I think they eventually presented their reasons.
00:07:00.860
The Ontario Liberal leader, Steven Del Duca, at one point actually called for like some sort of
00:07:05.200
probe or investigation into this issue. And I was like, this is really kind of sketchy. Like you
00:07:11.260
can totally say everybody should be vaccinated, and then, oh, two people aren't. Okay. Well,
00:07:15.900
what's that about? Without saying, like, we need a full-blown commission to investigate why these
00:07:21.520
two young women made this choice, whether you agree or disagree with their choice.
00:07:25.680
It's crazy. It's like, you have to investigate these poor individuals who have made the choice,
00:07:30.780
I'm sure for very good reasons. And it's their business. This is your privacy. This isn't something
00:07:36.620
that should be investigated and debated. It's not like they've broken a law.
00:07:41.500
This is what concerns me enormously. It's getting to the state where law enforcement has to be invoked
00:07:47.640
and investigate these individuals. This is going to eradicate our privacy and lead to such a
00:07:55.220
surveillance society that my fear is we may not be able to get back out of. And that's what
00:08:01.440
always concerns me whenever there are these emergencies, that the ability to pull back from
00:08:07.800
this may be lost. See, vaccine passports are an infringement of our privacy associated with our sensitive medical
00:08:14.380
data. And they are leading to highly intrusive systems of surveillance associated
00:08:20.240
with your geolocation data collected at various sites, et cetera. And this is what
00:08:25.960
some people are calling vaccine surveillance. This is what we have to fear.
00:08:31.900
I guess part of the challenge is that pandemic measures are all about where people are going,
00:08:37.400
who's going where, and whether they should or shouldn't go there. I mean, almost
00:08:41.500
every single pandemic measure or restriction has some form of privacy invasion in terms of keeping
00:08:48.160
tabs on where people are going, does it not? No, I agree. And there are these, you know,
00:08:54.840
these types of passports globally. That's why the kind of surveillance we're talking about
00:09:00.280
is enormous. You know, New York has the Excelsior Pass, there are passes in the EU. Here we have ours.
00:09:08.040
I mean, these are the concerns that are mounting, that once the pandemic passes, and it will move on,
00:09:15.600
that these measures that have been put into place will continue and will continue to erode your
00:09:23.280
privacy and expand surveillance, vaccine surveillance. So, Anne, one of the complaints
00:09:28.860
that people have made who have followed emergency preparedness over the years is that the pandemic
00:09:33.880
response team should have never been led in any province in Canada by a chief medical officer.
00:09:39.720
Instead, it should have been an emergency planning head. And we have emergency planning
00:09:42.940
bureaus in every province in Canada. They should have been in charge. And then the chief medical
00:09:47.380
officer would have been one of the primary people involved, but not the singular person.
00:09:51.580
And I could also imagine the position that you held for almost 20 years, Ontario's
00:09:56.200
privacy commissioner would have also been at the table. I've heard a lot of complaints about how not
00:09:59.820
enough voices are at the table, sort of the command table. If you had been at the command table,
00:10:04.360
if the privacy commissioner had been there, because we've established that
00:10:09.200
there are so many privacy violations now, could we have done pretty much what we're doing right now,
00:10:14.380
but done it better? Absolutely. And I agree with you. Where are the privacy commissioners?
00:10:19.580
Why aren't we hearing from them? Why aren't they all over this? Because this is a huge issue and
00:10:26.720
we're not suggesting it's either-or. The reason I hate zero-sum models is that they're win-lose.
00:10:35.060
We're not suggesting that it should be privacy versus public health. Of course, you need both,
00:10:41.200
but you can have both. And we need solid measures and people in charge who are doing this, just like
00:10:48.480
you recommended in terms of who should be leading this. And my concern is there's also a lot of
00:10:54.880
documentation, articles from scientists, epidemiologists, doctors that take an opposing view to the views
00:11:01.860
you're hearing about in mainstream media, but you're not hearing about them. They've been
00:11:06.280
completely shut out. These are some of my concerns. One of the other things that has happened in Ontario
00:11:12.840
recently that I feel would be a much bigger deal if we were not preoccupied with talking about the
00:11:17.640
pandemic was the rollout of something called Ontario's digital ID, something that I know you're
00:11:23.160
very familiar with, these concepts of digital IDs. It's just been announced during the
00:11:27.980
pandemic. The rollout is beginning. I'm not going to try and explain what the digital ID is because I
00:11:33.620
might not do it as well. How would you explain what a provincial digital ID is?
00:11:37.760
What they're trying to do is replace, for example, the driver's license that you have in paper form
00:11:42.300
or your OHIP card, things of that nature, and make it digital so that if you lose the paper or
00:11:49.040
whatever, you don't have to worry about it. It is retained in digital form at all times. And I can
00:11:54.240
understand why people are doing that. But enormous, enormous protections have to be invoked. And in
00:12:00.480
fairness to the government, they have said, look, we know that this data is very sensitive. We're going
00:12:06.860
to encrypt it. We're not going to retain it in centralized form. They've introduced a number of
00:12:12.260
measures that will be privacy protective. But what I always say is trust, but verify. You have to verify
00:12:19.520
all of what is being claimed by the government is going to take place to make sure it's actually
00:12:26.040
being rolled out. So we have to have audits. Again, I don't know where the privacy commissioners are.
00:12:31.400
I would be looking under the hood all over the place to ensure that the claims being made about
00:12:37.340
the privacy protective measures that the government intends to introduce, that they're in fact in place
00:12:42.540
and being made responsibly. There are all kinds of encryption you can have. You can have very strong
00:12:48.380
encryption or weak encryption. You know, all that kind of thing. Someone has to investigate this.
00:12:52.720
We need to audit all of these activities. So I've got the Ontario government's digital
00:12:57.020
ID website up explaining all of this. And they say, as an individual, you can use the digital ID to prove
00:13:02.180
your identity when you, and they give some examples, make an age-sensitive purchase, like buy a lottery
00:13:06.880
ticket, apply for government assistance, access and use vaccination records, open a bank account. And it
00:13:13.320
says, as a business, you can use digital IDs for hiring new employees, proving identity,
00:13:17.480
verifying customers' identity. I go, well, hold on a second. How does this all interact with each
00:13:23.220
other? Because, you know, I want to say I don't want to be dystopian, but I guess I
00:13:28.540
do want to go there and do the worst-case scenario. And then you can tell me
00:13:31.820
how much of it is potential, how we safeguard against it. We're talking about this digital thing
00:13:38.040
that has my vaccine records, it has my financial info, and I guess it can all talk to each other. I mean,
00:13:44.880
an employer is going to access some of it, the banks can access some of it, the governments can
00:13:49.180
access others. There are hackers out there who are smarter than government programmers, I think. So
00:13:53.540
I've heard. So, like, this sounds a little tricky. It makes me very nervous, not because the intention
00:14:01.040
isn't good. The government is trying to make life simpler for people. But in rolling this out, and like
00:14:07.040
you said, all the complexity involved, there are so many measures, the potential for a hack seems
00:14:13.560
enormous. Look at what's happening in Newfoundland and Labrador. They've had, oh, horrible hacks of
00:14:21.120
their health data. Over 100,000 individuals' health information has been accessed by hackers,
00:14:28.180
as well as the Ministry of Health's employee data. This is not something that can easily be avoided.
00:14:35.800
So the enormous measures that have to be put into place before this goes live, in terms of securing
00:14:43.160
the data, this is off the charts. I mean, I would hire the white hat hackers to try to break into it,
00:14:49.960
to make sure that they can't. Because the odds are someone will break into it. And like you said,
00:14:55.180
it's the most sensitive information. It's health information, it's financial information,
00:14:59.460
it's employers accessing employee data, and on and on. This is huge.
00:15:03.400
Okay, but here's the problem. Say it's good intentions and we've got all of this together.
00:15:08.640
This is what it's to be used for now. But with a digital ID, the genie's not going back
00:15:13.160
in the bottle. It's going to continue forever. So the Ontario government only means that for these
00:15:17.520
three things right now, but five years, 10 years, 15 years, 20 years down the road, surely they're
00:15:24.200
going to slightly tweak the way it's being used. And presumably, they would be tweaking it in a more
00:15:29.880
expansive way, as opposed to a constricting way. Am I crazy to think that's possible?
00:15:35.300
No, not at all. That's a safe assumption to make. The hope is, and I'm challenging them on this,
00:15:41.760
they're saying that they're not going to retain this data in a centralized database,
00:15:45.000
which means that theoretically, each individual would have control of their own
00:15:51.320
digital ID, and would only reveal it if they consented to it. This is very
00:15:59.260
complicated, because people are going to say, oh, my God, I don't want to have to consent every time,
00:16:03.800
you know, just put it out there. There's going to be so much confusion as to how this will
00:16:08.860
operate, and the measures that need to be taken to keep your data protected. And this is what concerns
00:16:15.120
me, the government is saying the right things. But that's why we need someone to look under the hood
00:16:19.960
and make sure the measures they intend to put into place are being executed very,
00:16:25.740
very securely. We'll be back in just a moment after this message.
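Her point about strong versus weak encryption, and the audits she calls for, can be made concrete. Below is a minimal sketch, assuming Python's third-party cryptography package is installed; the record content is invented. The first scheme is vetted, authenticated encryption; the second is the kind of home-rolled cipher that looking under the hood should catch.

```python
# Requires: pip install cryptography. The record below is invented.
from cryptography.fernet import Fernet

record = b"OHIP:1234-567-890 DOB:1980-01-01"

# Strong: a vetted, authenticated scheme (AES with an integrity check).
key = Fernet.generate_key()
token = Fernet(key).encrypt(record)
assert Fernet(key).decrypt(token) == record  # recoverable only with the key

# Weak: a fixed single-byte XOR "cipher" -- trivially reversible by anyone,
# exactly the claims-versus-reality gap an audit should flag.
scrambled = bytes(b ^ 0x42 for b in record)
assert bytes(b ^ 0x42 for b in scrambled) == record  # no key needed at all
```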
00:18:05.620
And I described in the introduction that you're the pioneer of the idea of privacy by design.
00:18:10.540
I just want to take a moment to ask you, what is privacy by design? What does that mean? How does
00:18:15.460
it work? Privacy by design is all about being proactive. It's a model of
00:18:21.520
prevention, much like you have a medical model of prevention. You want to prevent the medical
00:18:25.620
harms from arising. I want to prevent the privacy harms from arising, not just offer remedies after
00:18:33.060
the fact. Privacy laws, which are very important, kick in after you've had the data breach or privacy
00:18:39.120
infraction and try to fix the problem. I want to prevent the problem by being proactive and baking
00:18:45.060
these measures into all your operations. Bake it into the code so it's an essential component.
00:18:51.220
How do the software developers, the corporations, the governments, how are they responding to you
00:18:57.460
and your like-minded colleagues who are doing these endeavors? Are you seen as a partner from the
00:19:01.980
ground floor? As I understand it, you're basically saying this idea should be built in from the start. How do they see you?
00:19:11.160
I'm delighted to say that it is being properly welcomed. Privacy by design has been translated
00:19:17.180
into 40 languages. Several years ago, there was a new law introduced in the European Union
00:19:23.020
called the General Data Protection Regulation. It's huge. It's very comprehensive. It embeds personal
00:19:29.580
control on the part of the data subject. But they also included my privacy by design in the law.
00:19:35.060
So it's taken off. I offer privacy by design certification because companies come to me
00:19:40.800
and they want to be certified for privacy by design. They want to tell their customers
00:19:45.140
the lengths they're going to in order to protect their privacy. It's growing by leaps and bounds. Thank God.
00:19:52.820
We broke down Ontario's digital ID. I'm hoping you can break down for me the internet of things and any
00:19:59.320
potential challenges we face with that. The internet of things, of course, being when we get 5G
00:20:04.820
internet everywhere such that all of our devices are increasingly connected to the net. A lot of us do this:
00:20:10.540
I have the ability to adjust the temperature and air conditioner from my phone right here. That's a
00:20:14.300
pretty common thing. But we know we're going to be seeing more of that with the refrigerator and the
00:20:18.400
toaster and you name it, everything else, which is really exciting and potentially nerve-wracking as well.
00:20:24.340
Well, it should be nerve-wracking because the internet of things, it's like the Wild West.
00:20:28.460
It's just running out the door at an amazing pace without any thought or very limited thought given
00:20:34.880
to privacy and security. That's what concerns me. I'm all for IoT, but you have to embed privacy and
00:20:41.840
security into these devices before they go running out. And right now they are running out,
00:20:47.520
which is why there are a lot of problems. There are lawsuits against the internet of things. It'll go
00:20:56.140
on and on for a while because in the excitement of developing these technologies, people aren't
00:21:03.180
devoting the time required to embed privacy and security into them. So stay tuned. You're going to
00:21:09.720
see a lot of problems with IoT before they disappear.
00:21:14.580
And to go all dystopian again, so if I have a device in my home that is connected to the internet
00:21:20.100
and it has either some form of a camera on it or some form of a microphone on it, is it conceivable that it could be listening in on me?
00:21:28.800
Oh, of course. Beware. I mean, you've all heard about Alexa. There's this story I heard a few months ago:
00:21:34.820
this woman calls her best friend and she says, well, what's wrong with your marriage? I didn't
00:21:40.540
know you and your husband were having problems. And the woman says to her, what are you talking
00:21:44.620
about? We haven't told anybody about our problems. And the woman said, oh, but I heard it through Alexa.
00:21:54.620
That's right. Alexa was on when they were having this heart to heart about their marital problems.
00:22:00.140
And that got conveyed to this other woman who has an Alexa. I mean, these are the nightmares.
00:22:07.340
That is, you have to watch your devices. It's like people say, oh, good thing the cat can't
00:22:12.660
talk or the dog can't talk or whatever. Yeah, but the toaster can.
00:22:15.940
That's it. You have to turn them off. I'm not kidding. It's crazy.
00:22:20.160
Well, you know, I know some people who have the little camera on top
00:22:25.000
of the laptop, and they have duct tape over it unless they need to use it. Do you recommend that?
00:22:30.520
I think it's a great idea. Whether it's necessary or not is questionable, but it cannot hurt just
00:22:37.800
in case it does collect your information. You want to preserve your privacy, and in
00:22:43.080
your home, that's your bastion of privacy. It should be.
00:22:46.220
Now there is one company that says this stuff's never going to happen. Everything's totally
00:22:51.640
hunky dory. That company is Huawei. Huawei would like to participate in the building of Canada's
00:22:57.460
5G infrastructure grid. I understand there are some back-channel bits of drama going on
00:23:03.860
between our Five Eyes partners. The United States and the UK are telling Canada: guys, please,
00:23:09.180
please don't do this. The basic idea being you need these companies to help build the 5G
00:23:14.480
infrastructure, which would go into the internet of things. Huawei says, Hey, we can do this. Let
00:23:18.600
us be a part of it. I believe Ericsson and a couple other companies are already doing that here in
00:23:23.040
Canada. But Justin Trudeau still doesn't want to make a decision on Huawei. What are your thoughts?
00:23:28.420
Because Huawei has some executives here in Canada and some lobbyists who I'm sure
00:23:32.920
will talk my ear off about how it's totally fine. Oh, Huawei does have amazing technology. There's no
00:23:38.580
question, but they also have the ability to have, you know, China intercept the communications. I mean,
00:23:45.180
they're connected. There's no question. So do you want to risk it? I mean, why are the UK and the U.S.
00:23:50.980
saying no to Huawei? For very good reason: the potential for surveillance is very strong.
00:23:58.280
Now that we're talking about China, a lot of people think that the things happening in China
00:24:03.460
are at least a canary in the coal mine for us in terms of, well, they don't just have a digital ID. They
00:24:08.980
have this thing called the social credit system, which is a digital ID kind of on steroids.
00:24:15.400
How would you describe the social credit system and what sort of concerns it brings
00:24:20.140
to people in China? It just makes me gag. They have social credit scores for everyone. Everyone
00:24:25.400
has a social credit score. And if you jaywalk too much, you have a lower score. I mean, it's
00:24:30.360
ridiculous. I heard this story. And how do they know you're jaywalking? Because I know you're not
00:24:35.240
really making that up. Like they do kind of pay attention to that stuff everywhere. Cameras exist
00:24:39.760
everywhere that pick up everything about you, your communications, everything. It's just,
00:24:45.460
there's no privacy, obviously in China. There was a story also, this is heartbreaking. This young
00:24:50.700
student finishes high school. He's brilliant. He aces everything. He wants to go to university.
00:24:55.880
He applies to universities in China. He gets rejected from them all. Why? Because the social
00:25:01.780
credit scores of his parents are unacceptable. They're too low. Can you imagine? You're punishing
00:25:07.200
the child because of the behavior of the parents? They jaywalked too much or whatever. It's ridiculous.
00:25:11.940
And I guess why I take this so seriously is that throughout the 1990s and other periods,
00:25:17.840
we were very optimistic that China was becoming more like us. They were becoming like the West,
00:25:23.420
welcome them into the WTO, let's do so many deals with them, open branches there and everything.
00:25:27.640
But I think it's reasonable for me to say that, if anything, that's been proven to be the flip side,
00:25:31.840
and we are becoming perhaps slightly more like China. Oh, God help us if we are. But you're
00:25:38.480
absolutely right. China's surveillance has grown dramatically. There is no ability to escape that.
00:25:45.220
And I pray that that doesn't happen here in Canada. What should regular individuals do to make sure
00:25:51.840
that these things, you know, don't happen? I mean, we're talking about, I don't even want to say worst
00:25:56.340
case scenarios because I know some of these things that you've warned about in previous years
00:25:59.660
have been proven, in part, to come true. How do people escalate the privacy issues to the level they
00:26:07.180
need to be? You know, one thing I do, and I suggest people do it too, whether you're buying something
00:26:13.300
online or in a real store where you're physically present. When I make the purchase, often they'll
00:26:19.980
ask you for your postal code or something. And I'll say, Oh, what do you do with the postal code?
00:26:26.080
Do you share it with third parties? I ask questions. And the person, the clerk won't have a clue what
00:26:34.000
to answer, but they'll go get the manager. And the manager will come and say, Oh, you're concerned
00:26:38.200
about your privacy. Oh, well, in that case, we can do this, we can do that, we can encrypt the data.
00:26:43.080
They have measures in place, but you need to express your interest in privacy to engage that. So I urge
00:26:49.180
everyone to do that. Turning location services off on your phone, is that a valuable thing?
00:26:55.240
Some people say it doesn't really matter, that apps or whatever can still get around it.
00:26:59.560
It's a good question. The only negative associated with it is there may be a need to
00:27:07.000
actually have your location services available to you. So you have to be careful, but it's probably
00:27:12.220
there anyway. I want to talk to you about a rather odd project that was planned for downtown Toronto.
00:27:18.520
It's not happening anymore. It was called Sidewalk Toronto, where Alphabet, Google's parent company, got, I guess,
00:27:24.920
what you could call the rights to a large bed of acreage, a piece of downtown land that had previously
00:27:31.340
not been developed because it was more industrial land. And they said, this is the great opportunity
00:27:35.480
to create a smart community, a community entirely plugged into a grid where everything from garbage
00:27:41.580
collection to temperature and precipitation, and, I think, who's standing where,
00:27:47.420
is all going to be on the grid and smart. And it can be sort of,
00:27:50.620
if not controlled, at least kind of monitored and directed from a central hub. And you were at the
00:27:55.840
forefront of individuals saying, okay, maybe it sounds kind of cool, but boy, there are some concerns with
00:28:02.680
having an entire neighborhood where every inch is like this.
00:28:07.080
Well, the interesting thing, Anthony, is they approached me and they wanted to hire me to embed privacy by
00:28:13.300
design into the smart city. And I said, great, I would love that, because I live in Toronto. I want
00:28:19.340
this to be as privacy protective as possible. And I studied it and I said, okay, what we have to do
00:28:24.740
is we have to de-identify data at source, because the sensors, the technology, are going to be
00:28:29.820
on 24/7. I said, the minute anything picks up anything that's personally identifiable, we have to
00:28:35.840
strip the personal identifiers from it. And they said, great, we'll do that. And everything went along
00:28:40.820
fine along those lines until they got criticized by Jim Balsillie about something. It wasn't even about
00:28:47.480
the privacy thing, but they got very, very concerned. And so I'll never forget the board
00:28:52.360
meeting at which I had to resign because at the board meeting, they said, and this is the only thing
00:28:56.600
they didn't consult with me on. They said, oh, we want all of you companies, the companies who are
00:29:02.940
participating in the development of the smart city to de-identify data at source and embed privacy by
00:29:08.280
design. But of course, we can't make you do it. It's up to you. And the minute they
00:29:13.360
said that, I thought, oh, for God's sake, they're going to do this on their own? I literally resigned
00:29:18.500
the next morning and they were shocked and horrified. Why are you resigning? And I said, because you didn't
00:29:23.060
talk to me about this and now we're not going to have any privacy at all. You're leaving this up to
00:29:27.160
people's goodwill. Are you kidding me? So unfortunately, it just ended up disappearing, and there is no
00:29:33.440
more Alphabet smart city happening in Toronto. It just fizzled away. So this is the
00:29:40.920
problem. You have to walk the talk always. When you say you're going to do something, you have to
00:29:46.520
follow through. And that's why I always say we have to look under the hood and make sure they're doing
00:29:51.320
what they said they're going to do. But you did believe that there was a way to do this properly.
00:29:55.860
I did because there always is. See, if you say no to everything, they're going to go around and go back
00:30:00.900
to the zero sum model and say, OK, forget about privacy. We need to have a smart city. Forget
00:30:04.860
about privacy. No, you don't forget about privacy. You figure out how to do both. That's why I hate
00:30:10.120
zero-sum, either-or models. We can have positive-sum, multiple positive gains at the same time.
00:30:16.800
You just have to devote some thought and attention to it. But I've got to be cynical here. If I'm the CEO of
00:30:22.480
a company involved in any project that is doing mass data collection, what is the point of me getting
00:30:28.540
that data if I cannot do data mining and I can't sell it and I can't turn it into marketing material?
00:30:34.560
You can do data mining. You can do an enormous amount. You just can't do it in a way that reveals
00:30:39.740
personal identifiers. And the reason you want to do that is because when you reveal those personal
00:30:45.780
identifiers, you're going to lose those customers. See, there's such a competitive advantage to embedding
00:30:50.620
privacy, privacy by design, because customers love it. It builds trusted business relationships
00:30:56.660
and there's such a trust deficit right now. So, you know, I talk to companies
00:31:02.380
all the time. I go into the boards of directors and they, you know, they're shaking their heads.
00:31:05.940
They don't want to see me. And I say, give me 10 minutes. Let me tell you how you can advance
00:31:10.280
your business interests and protect your customers' data in a way that will increase their loyalty
00:31:16.840
and will give you greater business. Then they're all in. They want to hear all about it. And we figure it out.
00:31:23.280
Now, I know there's a lot of people who aren't too concerned about their data getting everywhere
00:31:27.100
because they go, well, I know it's just corporations who want it to sort of micro-target me
00:31:31.260
for their goods and so forth. And to be honest, I'm kind of fine about that. It might tell me about
00:31:34.880
a few sales. I'm not that concerned. There are others who go, and I referenced the China social
00:31:39.840
credit system for a reason. Okay, fine. But this whole thing sets a framework and a stage for that.
00:31:46.820
Absolutely. And what I say to people is if you have no problem with companies and governments
00:31:53.160
accessing your personal information, great. Give it away. Be my guest as long as you make
00:31:58.240
the decision to do so. Privacy is all about control. Personal control on the part of the
00:32:03.880
individual, the data subject relating to the use and disclosure of their information. If they want to
00:32:09.120
risk it and have the gains from various companies, great, do it. But do it knowingly. And the thing is,
00:32:16.540
some people will do it and others won't. You have to have other measures for individuals who want to
00:32:22.000
have their privacy protected. It's not an either-or proposition.
00:32:26.120
Is there a way we can do the fine print better? Because I know a lot of people who don't even
00:32:31.240
read their mortgage documents, which is something that they're signing up for, you know, 500 grand
00:32:35.560
for 25 years of their life kind of thing. They don't even read those full documents. So they're
00:32:39.040
certainly not reading the fine print on some app, which may be, you know, turning on their camera.
00:32:44.620
Oh, Anthony, I talk to people all the time about how to make sure that they know what's going on in
00:32:56.700
terms of the fine print. But you see, I don't expect them anymore to review all the terms of
00:33:02.840
service and all the legalese and the privacy policy. Life is short, but it doesn't mean people don't care
00:33:08.320
about privacy. Concern for privacy has been at an all-time high for the past two years. It's come in
00:33:13.640
in the public opinion polls at the 90th percentile: 90 percent very concerned about their privacy,
00:33:18.820
92 percent concerned about loss of control over their personal information. The answer to this
00:33:23.680
is privacy by design, because one of the seven foundational principles of privacy by design
00:33:28.700
is called privacy as the default setting. What that means is you say to your customers,
00:33:34.500
you don't have to ask for privacy. We give it to you automatically. It's the default setting.
00:33:40.080
We are only permitted to use your personal information for the primary purpose of the
00:33:44.460
data collection that you consented to. Beyond that, we can't use it for anything. If a secondary use
00:33:49.540
arises down the road that we'd like to use your information for, we have to come back to you and
00:33:53.800
seek your additional consent. This is a win-win all around. It's a game changer. Customers love it
00:34:00.560
because it builds such trust and companies love it for that reason, too. We can do this.
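As a rough illustration of that default, here is a minimal sketch; the subject and purpose names are hypothetical, not any real system. Use of data is denied unless it matches the primary purpose that was consented to, so a secondary use forces a fresh consent request.

```python
# Minimal sketch of "privacy as the default setting": data may be used only
# for the consented primary purpose; anything else is denied until the
# data subject grants new consent. Names and purposes are hypothetical.

class ConsentLedger:
    def __init__(self):
        self.consents = set()  # (subject, purpose) pairs granted

    def grant(self, subject: str, purpose: str) -> None:
        self.consents.add((subject, purpose))

    def may_use(self, subject: str, purpose: str) -> bool:
        # The default is "no": absent consent means the use is denied.
        return (subject, purpose) in self.consents

ledger = ConsentLedger()
ledger.grant("alice", "order_fulfillment")           # primary purpose

print(ledger.may_use("alice", "order_fulfillment"))  # True
print(ledger.may_use("alice", "ad_targeting"))       # False: go back and ask
```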
00:34:06.040
And I wonder to what degree do we get the privacy we deserve? By which I mean, I remember when I was
00:34:11.840
a kid or, you know, not too long ago, of course, you go to a store and you say, I'd like to buy this
00:34:15.840
and you give them the cash or even the credit card and they take your money, because you'd think a
00:34:19.440
business would just be fine to take the money and be done with it. But now, as you said
00:34:23.600
about going to the grocery store, it's postal code or even address, please, or phone number. And at
00:34:27.960
Loblaws, because, you know, they do it every time now, I just say no. I just say no and they know and
00:34:31.440
then they don't do it. But at other places, you'll be at a small store. I'll be taking my kids
00:34:36.040
and I'll just say, what? Why do you need my phone number? Well, we need it to create a
00:34:39.660
profile. And I'm like, what? No, I'm here. You've got a service. I'm giving you the money. That's it.
00:34:45.440
That's what we're doing here. And yet most people, I think, just go, oh, okay. And they give
00:34:50.300
them the information. I think that's changing slowly, Anthony. It is a slow ride, but most people
00:34:57.980
just aren't aware of the potentially egregious uses of their information. And increasingly,
00:35:04.320
they are becoming concerned. In the past, I used to speak to public groups,
00:35:09.140
and explain why I think they should be concerned about their privacy. I don't have to do that
00:35:14.140
anymore. They already know and they're asking me questions about how to do it. So I think slowly
00:35:20.860
that's beginning to change. Are you optimistic, then, about the future of privacy?
00:35:27.120
Oh, Anthony, I'm the eternal optimist because you have to be. Otherwise, you're shaking your head
00:35:32.240
all the time. I'm not suggesting for a moment that it's getting easier. It's not. And surveillance
00:35:37.440
is abounding and we have to remain vigilant. But having said that, there are lots of groups
00:35:43.180
that are working on this. For example, there's something called the Decentralized Identity
00:35:48.020
Foundation, consisting of all the major companies, Microsoft, Intel, et cetera. And it's all about
00:35:54.340
decentralizing identity because that way everyone can't have access to your identifying information
00:36:00.100
in a centralized manner. So there are measures taking place and encryption is getting stronger.
00:36:05.620
There are ways to do this. There's something called synthetic data, which de-identifies your data.
00:36:10.740
So it's like a chess game. We always have to stay one step ahead.
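De-identification, whether applied at source as she required for the smart-city sensors or as a step toward synthetic data, can be sketched minimally: drop direct identifiers and coarsen what remains before a record is ever retained. The field names are hypothetical, and real pipelines need far more care against re-identification than this.

```python
# Minimal sketch of de-identifying at source: remove direct identifiers and
# coarsen the rest before the record is retained. Fields are hypothetical;
# real systems must also defend against re-identification attacks.

def deidentify(record: dict) -> dict:
    cleaned = dict(record)
    for field in ("name", "face_id", "device_mac"):  # direct identifiers
        cleaned.pop(field, None)
    # Coarsen the timestamp from minutes to the hour.
    cleaned["timestamp"] = cleaned["timestamp"][:13] + ":00"
    return cleaned

raw = {"name": "A. Example", "device_mac": "aa:bb:cc:dd:ee:ff",
       "timestamp": "2021-11-02T08:15", "zone": "waterfront-east"}
print(deidentify(raw))  # {'timestamp': '2021-11-02T08:00', 'zone': 'waterfront-east'}
```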
00:36:15.660
It's a crazy world out there for sure on the privacy front. Ann Cavoukian,
00:36:19.360
thanks so much for wading through it all with us, giving your expert insights. We really appreciate it.
00:36:23.180
Oh, of course. It's my pleasure, Anthony. Thank you.
00:36:27.400
Full Comment is a Postmedia podcast. I'm Anthony Furey. This episode was produced
00:36:32.800
by Andre Proulx, with theme music by Bryce Hall. Kevin Libin is the executive producer.
00:36:38.140
You can subscribe to Full Comment on Apple Podcasts, Google, Spotify, and Amazon Music.
00:36:43.220
You can listen through the app or your Alexa-enabled devices. You can help us by giving us a rating
00:36:47.380
or a review and by telling your friends about us. Thanks for listening.