00:55:01.260How much of it is on us to have the freedom to choose and make the right decisions?
00:55:05.960And how much of it is on the creator of the brand of Facebook?
00:55:08.640Well, I'm so happy that you asked that, actually, because I think it's a bit of a balance, you know?
00:55:14.460It's important to have informed consumers and informed citizens so that we know how to protect ourselves.
00:55:20.320It's also really imperative that the entire onus is not on us, that companies and even governments are giving us more transparency and awareness of what they're doing.
00:55:33.860And they're not putting us in a position where, if we are not well-educated, that we're being taken advantage of and that we can so easily be abused.
00:55:43.280So, right now, we have two massive problems. The first is that tech companies will not make the ethical decision without being forced to
00:55:50.320by laws and regulations that we don't yet have.
00:56:05.320And so, that's why we're in the position where Facebook, and companies like it, have so much power.
00:56:10.540And on the other hand, we have a population that is incredibly digitally illiterate.
00:56:17.180We do not understand what our data rights are, how to protect them.
00:56:20.380We don't understand basic cybersecurity protocols and how to keep our data private if we wanted it that way.
00:56:25.820We don't understand media literacy, you know, how to spot disinformation and fake news.
00:56:31.480You know, kids don't know how to understand cyberbullying and how to stop it.
00:56:34.980We don't know how to be ethical to each other online, especially when we're anonymous.
00:56:38.920These are all things that need to start being integrated into the education system, because we just have a population that's undereducated for how overexposed we are to, you know, our digital lives.
00:56:51.060How much of that are we teaching in high school right now?
00:56:53.140How much of that are we teaching in junior high school right now?
00:56:55.880I mean, you were 13, 14 years old, and 13 years old is, what, 8th grade, 7th grade?
00:57:22.300We do digital literacy training for kids in schools.
00:57:25.020We're starting with middle schools because we think that's really the first age group, like 8 to 12 years old, where your parents have probably given you a phone.
00:57:35.420Even if your parents haven't given you a phone, you might have a family computer, and you're probably using digital devices in school.
00:57:42.100So, you have your own accounts, whether it be social media or at least email accounts.
00:57:48.940You're surfing online for sure in order to do at least research projects.
00:57:53.320But, you know, a lot of kids have full exposure.
00:57:56.460They're on their phones all day, every day.
00:57:58.960There are some kids where if their parents do not stop them, they will actually be on it 24-7.
00:58:03.220And not having the awareness of all of the different issues that I just listed is really debilitating, and it's really harming the psyche of kids, and it's harming their chances to be successful.
00:58:44.160But in this movie, the story is about his kid being bullied online. These friends of his at school create a fake profile, and they say, hey, you share a picture of your privates with me,
00:58:55.240and I'll share a picture of mine. And they had taken the picture of the girl that they knew he was obsessed with, right?
00:59:03.440So he sends his picture to them, and then the next day, those bullies from school take the picture and send it to everybody in school.
00:59:10.460He goes to school, comes home, and one day the dad is coming home from work. The kid goes to his room, puts on loud heavy metal music, and the dad walks in as he's about to hang himself.
00:59:18.420Dad grabs him, holds him, asks what happened, and then they find out what the whole story was.
00:59:21.420That was a perfect example of the kind of bullying that's taking place right now if you don't teach your kids.
00:59:25.520One of the best movies to watch with your kids is that movie because I think it will show your kids what is possible.
00:59:32.160Anyways, I don't want to digress from it.
00:59:33.480Let's go back to what we were talking about.
00:59:34.920Would you consider yourself, I mean, I know politically I've heard you say you were for Bernie.
00:59:41.600You were not Hillary camp, you were Bernie, and you're a Democrat.
00:59:46.140Would you still position yourself as that today, or has it changed a little bit?
00:59:49.460I would say the way that I see American politics is more from an independent stance, especially because I spent my entire adult life living in the United Kingdom,
01:00:00.060where actually even the Democrats in the United States look quite conservative.
01:00:05.280In the United Kingdom and in a lot of other countries in Europe, it's expected and taken for granted that everybody has free access to health care,
01:00:19.980that if you become homeless, you get a government house, that you can have a weekly or monthly stipend from the government that will cover the needs of you and your family if you fall on hard times.
01:00:32.920So, a lot of the policies in America, even on the Democratic side, I find to be actually shockingly inhumane.
01:03:31.700And then you move on and you go to a different relationship and a different relationship and a different relationship.
01:03:35.820And the older you get, the tougher it becomes to go experience what you once experienced with that puppy love, right?
01:03:43.680So for you, you're in a family, your grandpa's an MI person, 27 years military, you know, and then from there you're inspired to work on the Barack Obama campaign when he was a senator, and you have breakfast with him and Rahm Emanuel.
01:04:01.800And you're like, oh my gosh, I can't believe this person exists.
01:04:59.700It's always good to have a dose of, instead of cynicism, I would say skepticism, to make sure that you are actually questioning what people are telling you.
01:05:10.000I think I spent too many years believing people at face value that what they were telling me was true.
01:05:16.820And that they actually had an intention to do something good for the world when they didn't.
01:05:21.140And so now I'm a lot more skeptical of what I'm told.
01:05:26.000I do more due diligence, definitely, than I did before, before I think about working with people or thinking that what they say to the public is actually what they believe behind closed doors.
01:05:39.360And, you know, that's why I think right now in the presidential fields, when I think about, you know, who represents my true beliefs, some people have some good things to say.
01:05:52.760But, you know, I haven't thrown my support behind anybody specifically because there's nobody that really speaks to everything that I'm talking about.
01:06:02.020I mean, the only candidates that we have that even have technology policy are Elizabeth Warren and Andrew Yang.
01:06:08.460Andrew Yang's fantastic, but, you know, he would make a really great, you know, CTO of America.
01:06:15.320But I think we really have been shown over the past few years that we need someone with a lot of foreign policy experience, someone that can go out and do diplomacy.
01:06:26.140Maybe Elizabeth Warren is one of those people, maybe she's not.
01:06:28.900But the rest of the political field hasn't even thought about data or privacy policy or technology regulation.
01:06:41.920And, you know, obviously that's my number one issue at the moment because I believe it underpins so many of our other problems that we have in society that it needs to be taken care of and it's not being addressed.
01:06:55.880I wonder who is going to do it, though.
01:06:57.700I wonder who is going to actually be talking about it, because when you look at it, I haven't heard Joe Biden talk about it.
01:07:03.100I haven't heard Sanders talk about it.
01:07:04.920Yang will talk about it, and people resonate with him.
01:07:08.720For Elizabeth Warren, it's part of her message.
01:07:32.100I think the Democratic side is spending too much time tearing each other apart as opposed to actually building a unified message that can get people to care about politics again and actually get people out to the polls.
01:07:51.420I think right now the DNC is terribly disorganized, and that's unfortunate.
01:07:57.080Now, there's another topic of what's going on right now, which is impeachment.
01:08:07.760The articles have not even been sent over to the Senate.
01:08:09.980I have a strong feeling, after watching the Republican members of the House of Representatives give their testimony, that it is very unlikely that the Senate will proceed.
01:08:25.940Two other people voted present, right?
01:08:27.520And 100% of Republicans voted against it.
01:08:30.840And in the Senate, you already heard what he said he's going to do.
01:08:35.640It's going to be dead on day one, right?
01:08:37.400But do you think – here's a curious question for you since you've worked in a marketing world and messaging is critical.
01:08:45.000Do you think sometimes a lot of these candidates are in the shadow of Nancy Pelosi and the impeachment campaign that they have, that they're driving, where it's taking –
01:08:57.440like last night, nobody knew the debate was taking place last night.
01:09:00.080I was like, oh, shit, we've got a Democratic debate, right?
01:09:05.060So do you think, in a strategic way, it's actually hurting the camp? Because it's almost like a father that cannot take his shadow away so his son can shine,
01:09:18.960you know, it's like the DNC cannot take the attention away from impeachment to say, listen, let Biden, let Bernie, let Warren, let these guys get there.
01:09:25.300Because the media should be talking about them 24-7, not be talking about impeachment, knowing you're not going to win two-thirds on Senate.
01:09:34.580You think it's kind of hurting a little bit of the Democratic candidates?
01:09:37.040Well, I think technically it could be, but we can't think of it that way.
01:09:40.580Because when laws are broken and when our Constitution is violated, people need to be held to account.
01:09:47.860And I'm sorry, but I actually believe that when this president is no longer immune, when he's no longer in this seat, that he will be indicted for many different crimes, actually.
01:10:16.380So when we're going through Cambridge Analytica and I'm looking at, okay, it's very obvious, you know, the different kind of persuadables, the deterrence, you know, the possible pro-Trump, the absolute anti-Hillary, great, great strategy.
01:10:30.600But somebody could say, well, Brittany, I mean, let's not be naive.
01:10:34.920This has been going on for a long time.
01:10:36.500It just happens to be that today's tool is this.
01:10:38.440Somebody could say, you know, there used to be a time when we did it by bullying people, like literally bullying people, preventing some communities from being able to vote, knowing who to target, and putting fear into some communities so they wouldn't even go vote, by bringing in some powerful people, like in the 1800s and the 1900s.
01:10:57.760Hey, making sure people were fed to vote, you know, just throw some food at them.
01:11:02.080They went to some poor areas to win their votes over.
01:11:04.860Well, then, you know, it could be we used the mob a little bit, because the mob helped Kennedy with his election.
01:11:10.240And, you know, the mob was involved with that election that took place.
01:11:12.800And then, well, you know, what helped with some of these other guys is radio, whoever was better on radio.
01:13:13.300And, unfortunately, Facebook refuses to enforce these laws on its platform either, even though it is the world's largest communications platform.
01:13:22.300I'm not talking about censorship versus free speech.
01:13:25.660And I hate that people always bring it to that because my free speech is not unfettered.
01:13:31.520My free speech ends when your human rights begin.
01:13:35.560So, I am not allowed to discriminate against you.
01:13:38.200I am not allowed to incite violence upon you.
01:13:40.580I'm not allowed to suppress your vote.
01:13:42.680And yet, somehow, I'm allowed to do that on Facebook, or politicians are allowed to.
01:13:46.360I'm not allowed to because I'm a common person.
01:13:48.480And this is the problem that I'm talking about.
01:15:10.520Like, what I'm saying to you is, I honestly think the DNC needs to hire a legitimate marketing firm to help with the languaging.
01:15:21.700And MSM, mainstream media, needs to collectively come together and change their messaging, or else, the way they're going right now, it's going to be bad for a long time.
01:15:31.180Because the more you run anti messaging about how bad of a person I am, the more you're constantly building me up.
01:15:35.900I don't think that's an effective strategy.
01:15:37.540No, I totally agree. And something that, I don't know if you'll find it funny or horrible, but after Donald Trump won the election, and we at Cambridge Analytica's commercial division were going out to pitch advertising campaigns,
01:15:53.220when we went to go pitch big media companies, they would say, you know, we'd get a meeting with CNN, for example, and we'd go in and we'd be like,
01:16:05.320oh, well, we thought it was going to be really hard to get this meeting because, you know, you don't like our biggest client.
01:16:11.400And they'd say, oh, no, we made so much money off of covering Donald Trump.
01:19:49.920What is life like right now for you as a whistleblower, career-wise, your personal life, your comfort, your level of comfort of feeling safe, your fears?
01:20:05.260What is life of a whistleblower today?
01:20:08.340Well, it's definitely the scariest thing I've ever done.
01:20:11.280I'm not going to pretend anything other than that.
01:20:13.960But it's been something where I've been so lucky that what I said resonated with people.
01:20:25.900They care about their use of technology.
01:20:28.000They care about being abused and taken advantage of by big tech.
01:20:31.100They care about owning their data and actually having control over the value that they produce every day and over their private information if they want to keep it private.
01:21:11.580How do I support the Own Your Data campaign?
01:21:14.140And these are people that are calling their legislators.
01:21:16.980These are people that are going out and getting active.
01:21:18.980Some of these are people that are working in big advertising companies that are now working on data protection policy and working on data ownership mechanisms for their consumers where that concept didn't even exist in their companies before.
01:21:36.580I've been, again, lucky and honored to be a part of it because a lot of other whistleblowers don't get that.
01:21:44.320Earlier, I think it was before this interview, we were chatting about National Whistleblower Day, which just started this year in Congress.
01:22:07.840And, you know, at first it was really disheartening for me because I saw some people in this room who had a really important story to tell.
01:22:18.100They had managed to find evidence of corruption in government agencies or within important companies.
01:22:24.440And they didn't know how to talk to the media or they had been trying for years and no one wanted to tell their story.
01:22:32.080And I'm sitting there with, you know, five million press pieces about me and my story, and a book deal and a film.
01:22:40.240And I'm just like, wow, it's so amazing that some of these people persist and keep on going, even though people are not listening.
01:22:49.940And the fact that I was so lucky that people wanted to listen just blows my mind.
01:22:55.480And I want to address, you know, the safety question that you had, which is that, you know, no, I don't feel completely safe.
01:23:03.520It's not like I get threats every day.
01:23:05.840But there are definitely a lot of powerful people that would prefer if I stopped doing interviews like this every day and would prefer if I stopped pushing data privacy legislation in Congress.
01:23:21.040The threats are not interesting to me.
01:23:22.940So I think that it's just important to recognize that becoming a whistleblower is not easy and it's something that should be encouraged in order to force transparency and to weed out corruption.
01:23:34.640You know, one day, hopefully, whistleblowers are protected enough that it's a lot easier for us to stop corruption before it becomes a really big problem.
01:23:43.260Would you say Julian Assange is a friend or somebody you admire?
01:23:47.960Well, he was someone that I admired for a very long time.
01:23:52.460I think whatever role that he had in the hacking of the DNC or not, that's not something that I support, obviously.
01:24:02.440But the work that he did in the beginning and what he stood for, for full transparency and for holding power to account is something that I will always support.
01:24:16.120His dropping of the Iraq war files to show the crimes against humanity that were committed or war crimes that were committed had such an effect on what I did for the rest of my life and the way that I viewed the world and the way that I view my own government and the way that I question things.
01:24:34.940Yes. So I think, you know, he's someone that is in a very sad situation right now, and it's specifically because whistleblowing laws are not strong enough and they need to be.
01:24:47.800How was it when you met with him? Because I know you and him had a meeting together. How was that experience?
01:24:51.700It was so sad to see someone who has been basically in solitary confinement for seven years. He was nearly see-through, and I hardly got to have a conversation with him.
01:25:04.160I mean, you could tell he was obviously psychologically affected by being in there because it was almost like he was just talking at me for the whole, like, 20 minutes that I was there.
01:25:12.520You know, everything that was inside his head because he doesn't really get that much human contact. So everything he was thinking about, he just rattled off, and, you know, it's really sad to see that, especially when I worked in human rights, working with prisoners of conscience and working with political prisoners was something that I specialized in, and, you know, at the time I really did kind of see him as a political prisoner.
01:25:39.240I mean, his life is done. I mean, what are you going to do? Married, kids, public life, going out, seeing things, movies, just the day-to-day. I want to go to a restaurant and have dinner. He can never do that for the rest of his life.
01:25:53.940Australia's fighting to get him back, but Australia versus America doesn't usually end up with Australia on top.
01:25:59.900No, it's not one that you're going to win too often.
01:26:06.940What are some of the biggest threats we have today?
01:26:08.880Now, obviously, you know, data is one. And by the way, you know what's so weird is, I'm the CEO of a financial firm. I go to a lot of these conferences with these big 50, 100, 200 billion auto insurance companies.
01:26:18.080And in the last 18 months, the most common conversation that's been coming up is cybersecurity.
01:26:23.400I mean, we've never seen this much of it before, and it keeps coming up right now.
01:26:28.180Based on what you see and what you know and being on the inside on many of these things, what do you see as the biggest threat we are facing today?
01:26:35.420Is it data, cybersecurity? Is it China? Is it Russia? Is it internal? Is it companies the size of Facebook, Amazon?
01:26:46.640You know, what do you see as the biggest threat to the average person?
01:26:50.960I would say the biggest threat to the average person is the fact that there is a complete and utter lack of data protection.
01:26:58.920And data protection has a lot of different parts to it, right?
01:27:04.100That is everything from, yes, cybersecurity fending off attacks that could come from anywhere in the world from bad actors.
01:27:13.020That is also securing the value that we as individuals produce, the tracking and traceability of where data goes, who has it, where it's held, what it's being used for,
01:27:26.400and actually having any sort of opt-in or permission structures for my personal information to be used by certain people for certain purposes,
01:27:35.160and the right to actually monetize my own value for myself as opposed to being exploited.
01:27:41.100Data and the way that it is used means that anybody in the world can buy my time, my attention, and my privacy.
01:27:54.440And right now, because of our lack of legislation and regulation, because of our lack of technology that can actually manage and track and trace data in a reliable way,
01:28:05.200it means that our democracy is up to the highest bidder.
01:28:08.440And that's what scares me the most, which is why I do what I do every day.
01:28:32.400I just had a friend that did a study with, you know, a set of diabetes researchers and pharma companies
01:28:38.940and said, for somebody that qualifies for a diabetes study, how much do you pay for that medical data?
01:28:47.860Because, you know, most medical data comes from young 18 to 35-year-old white men in college who go to medical trials for extra beer money, right?
01:28:59.700And so to find someone that qualifies for a diabetes study, it costs them $28,000 for six to eight weeks of data
01:29:08.220in order to find the people, in order to get people to become part of a study, in order to complete the study.
01:29:16.480For a couple months of medical data, which is probably just blood tests, urine samples, they go in a couple times, right?
01:29:23.340So if we are to build a future where we own our data and where we are actually able to profit from our own human value,
01:29:34.400we need to have systems where we can share our data anonymously and securely,
01:29:40.120where we have laws and regulations that allow us to own it as our property,
01:29:44.720and that, okay, I'm fine if I produce data with a pharma company or I produce data with Facebook,
01:29:49.600they can have part ownership and I can have part ownership.
01:29:52.220I wouldn't have produced that data without them.
01:39:59.480I'm just curious to know how far he can go with this working out.
01:40:02.900First of all, Brittany, thank you for coming out.
01:40:05.440What I do want to say is, guys, if you haven't yet purchased the book, buy a copy and start reading it.
01:40:10.380I highly recommend you read this type of content because it's good for you to be in the know, especially today, knowing data is now a multi-trillion-dollar industry.
01:40:18.820For you to know how to protect yourself, your business, and your family.
01:40:21.220Brittany, thank you so much for coming out.