Valuetainment - February 21, 2020


Episode 431: Cambridge Analytica Whistleblower Opens Up


Episode Stats

Length: 1 hour and 40 minutes
Words per Minute: 178.03
Word Count: 17,963
Sentence Count: 1,316
Misogynist Sentences: 20
Hate Speech Sentences: 9


Summary


Transcript

00:00:00.000 30 seconds.
00:00:01.800 Did you ever think you would make it?
00:00:04.220 I feel I'm so close I could taste sweet victory.
00:00:07.620 I know this life meant for me.
00:00:10.760 Yeah, why would you bet on Goliath when we got Bet David?
00:00:14.580 Valuetainment, giving values contagious.
00:00:16.420 This world of entrepreneurs, we get no value to haters.
00:00:19.160 How they run, homie?
00:00:20.140 Look what I become.
00:00:21.400 I'm the one.
00:00:22.520 I'm Patrick, good to be your host of Valuetainment.
00:00:24.100 Today I'm sitting down with Cambridge Analytica whistleblower, Brittany Kaiser.
00:00:27.360 That's all I need to tell you.
00:00:28.320 We went into a lot of different things.
00:00:30.040 Her campaign with Obama, with Trump, the insider, what's micro-targeting,
00:00:35.260 behavioral micro-targeting, the technical aspect of it,
00:00:38.620 her relationship with Alexander Nix, who was the founder of Cambridge Analytica.
00:00:41.480 All I'm going to say to you is if politics and the current micro-behavioral targeting
00:00:46.380 interest you and you want to know more about it, don't miss this podcast.
00:00:50.280 Brittany, thanks for coming out.
00:00:51.580 Thank you so much for having me.
00:00:53.000 Yes, so you and I were talking earlier and I was asking you,
00:00:56.600 what it's like to have the life that you've had because you've been all over the world.
00:00:59.920 And you told me the longest you've stayed in one place was what, the last 13 years?
00:01:05.240 For only 10 days.
00:01:07.120 So let me get this straight so everybody understands this.
00:01:08.920 In the last 13 years, you've only, the longest you've stayed in one place was 10 days.
00:01:14.060 Sometimes I don't stray too far.
00:01:15.760 I'll just go to a neighboring city somewhere else in the same state or the same country.
00:01:20.920 But sometimes I'm in a different country every day.
00:01:25.100 How much of that is because of your personality?
00:01:27.520 How much of that is because of where you're currently at?
00:01:30.900 A bit of both.
00:01:31.900 I've always been someone that loves to explore the world, find out everything I possibly can,
00:01:37.200 go and see and do everything I can get access to.
00:01:40.120 And sometimes that turns out very well.
00:01:43.080 And I suppose sometimes I get myself into a bit of mischief.
00:01:46.400 Who were you though?
00:01:47.280 Who were you?
00:01:47.660 If I was friends with you, like what kind of a 10-year-old kid were you?
00:01:50.260 Not even in high school.
00:01:51.160 I want to know how you were at 10 years old.
00:01:52.960 When I was 10 years old, I spent a lot of my time sitting in the corner of the playground
00:01:57.560 reading the biggest book I could find and not joining in the games of tag.
00:02:02.940 The biggest book you could find.
00:02:04.480 Biggest book I could find or I'd be at home working on my history fair, science fair project
00:02:10.420 or studying for mathletes or debate club.
00:02:15.280 How though?
00:02:16.020 Is it your wiring?
00:02:17.260 Was it inspired by parents?
00:02:18.340 Because I know your dad I think was like a real estate.
00:02:21.060 He was in real estate and mom was with Enron before.
00:02:24.600 Was there conversations about politics?
00:02:26.760 Was there conversations about like deeper issues or was it just more in your DNA?
00:02:30.760 My parents always raised me to work as hard as I possibly could and to be a very high
00:02:36.120 achiever.
00:02:37.020 They both grew up in households where they had, I would say they weren't completely motivated
00:02:43.200 to be the most academic students.
00:02:45.920 They got involved in a lot of things as kids.
00:02:48.620 Parents.
00:02:49.080 Yeah.
00:02:50.160 So they just felt that they wanted to give me opportunities that they didn't have.
00:02:55.820 So did they kind of flip, meaning they didn't have the highest standard of expectation and
00:03:01.700 they said we're going to have a higher standard of expectation from you?
00:03:04.180 Exactly.
00:03:04.700 It's interesting how that goes because it's like, oh, it's too high.
I'm going to be a little bit more liberal raising my kids on this thing and kids come
00:03:10.640 back and say, I want a little bit more discipline.
00:03:12.600 Well, I wouldn't say a little bit more discipline.
00:03:15.600 My mom was raised military.
00:03:17.340 So she had a lot of discipline.
00:03:19.020 So she had that.
00:03:19.440 But she was the oldest of six kids, raising them on a military base.
00:03:24.160 So she was almost like a second mom and didn't really get to concentrate on her studies.
00:03:28.140 That's tough.
00:03:28.700 And you also, when you're in that situation sometimes, you also are not able to be a kid
00:03:34.620 or a teenager because so much has relied on you to lead.
00:03:38.160 So you almost skip a generation of your life and you sometimes look back and say, I don't
00:03:42.440 even know what it is to be a kid.
00:03:43.480 Yeah, my mom's expressed exactly that to me many times.
00:03:47.400 What kind of conversation was it?
00:03:48.820 If I'm in your house, we got four or five cameras at your dinner table, six o'clock, Thursday
00:03:55.120 night, you're sitting having dinner, you're 11, 12 years old, mom, dad's sitting there.
00:03:58.720 What are you guys talking about?
00:04:00.520 I would say probably what I did in school that day.
00:04:03.900 I was always very interested to talk about what I had learned, what I was going to do with
00:04:08.140 it, what I had done after school.
00:04:09.880 I was very lucky to be put in, you know, art lessons and photography lessons and be playing
00:04:16.080 sports after school.
00:04:17.600 My parents really spent all of their lives trying to make sure that me and my sister had
00:04:22.580 as many opportunities as possible.
00:04:24.760 And obviously you did.
00:04:25.800 I mean, to see what you ended up doing yourself and where you're at today, there was a lot
00:04:30.340 of that growing up.
00:04:31.380 Absolutely.
00:04:31.980 Was there conversations about politics and different issues or not really?
00:04:35.280 So interestingly enough, my parents have considered themselves independent for a lot of their
00:04:40.700 lives.
00:04:41.320 My dad's side of the family, although they're from Chicago, they've always leaned conservative,
00:04:46.400 but they voted for both Republicans and Democrats.
00:04:49.700 My mom's side of the family are military Republicans that grew up all over the world.
00:04:56.220 So they do understand a worldview, but they've lived by a military base and in the South for
00:05:02.440 a while and have always voted Republican.
00:05:05.040 So my parents really didn't have a kind of a strong, you know, party stance and they never
00:05:13.020 really instilled that in me or my sister.
00:05:15.180 Even when we were little, they wanted us to feel out what our religion should be, what our
00:05:21.100 political views should be without enforcing that on us.
00:05:23.920 So that kind of was left on you, which is great because it allows you to think for yourself.
00:05:29.180 I grew up in a family, my mother said they were communists, my dad said they were imperialists.
00:05:32.760 So for me, it was like, oh, yeah, it was crazy, you know?
00:05:35.860 Yeah.
00:05:36.300 It's like if MSNBC and Fox News got married and had a baby, it's me.
00:05:41.220 So I'm the baby of MSNBC and Fox News.
00:05:43.960 That's kind of amazing.
00:05:44.820 That's kind of amazing, right?
00:05:46.620 It's kind of confusing because you love these people and they have such different worldviews,
00:05:51.940 but it first makes you just not want to have anything to do with it.
00:05:55.400 And then later on, a little bit of the itch comes back and says, you know, I'm just curious
00:05:58.100 to know why my mom thought this way, why my dad thought this way.
00:06:00.300 That's why I asked to see where you were at.
00:06:01.700 So you're 14 years old.
00:06:03.800 I think you're 14 years old.
00:06:05.160 Are you in Scotland where you get this inspiration to go support Obama, present on Obama's campaign?
00:06:12.160 Is that right?
00:06:13.060 Did that really happen?
00:06:14.040 So 14 was when I joined Howard Dean's campaign, his primary.
00:06:19.400 That was 2003, I suppose.
00:06:22.540 I think I was 15 or 16 when I first met Barack Obama.
00:06:26.640 That was right after Howard Dean had lost the primary to John Kerry.
00:06:30.720 And then I was supporting John Kerry as a volunteer.
00:06:33.320 And I go to the DNC, the Democratic National Convention in Boston in 2004.
00:06:40.360 And the young state senator, Barack Obama.
00:06:43.640 So you were there.
00:06:44.440 Absolutely.
00:06:45.920 Yeah, I was at boarding school and there was a summer program that I was invited to participate
00:06:51.520 in called Lead America.
00:06:53.360 And they taught kids how to run political campaigns.
00:06:57.100 So we did a mock political campaign.
00:06:59.060 I actually ran for president.
00:07:01.080 Did you really?
00:07:01.920 I had a campaign manager and a press team and all of this different stuff.
00:07:05.920 It was really interesting to learn all of the mechanics like that while you're at the DNC.
00:07:11.040 So they would take us to, you know, different caucus meetings and different rallies.
00:07:16.260 And I went to this very small environmental rally.
00:07:19.340 There were about 30 people there.
00:07:21.960 And state senator Barack Obama was the keynote speaker with only 29 other people competing with me
00:07:28.540 for his time when he stepped off the stage.
00:07:31.700 How was that?
00:07:33.720 The first time I saw him speak, it was awe-inspiring, actually.
00:07:37.780 I had heard of him before, but I didn't really know much about him, obviously.
00:07:41.800 State senators don't get a lot of press or fame, but I'm from Chicago.
00:07:45.640 So, you know, I knew of some of my politicians.
00:07:48.160 Lincoln Park.
00:07:48.460 Are you in what part of Chicago?
00:07:49.580 Yeah, Lincoln Park is where I grew up.
00:07:51.720 And I was just so excited about what he was talking about, supporting Senator Dick Durbin
00:07:58.940 in blocking British petroleum from dumping into the streams and waterways that were polluting Lake Michigan.
00:08:05.900 And I thought that was something I wanted to get involved with.
00:08:09.900 I thought that I should support him in any way I could.
00:08:12.740 And I asked, how do I do that?
00:08:15.380 He said, volunteer for my campaign for U.S. Senate.
00:08:19.100 I'm going to be running.
00:08:21.480 And come to breakfast with me tomorrow morning if you're from Illinois.
00:08:26.140 So I did that.
00:08:27.540 And at breakfast the next morning, he said, by the way, I'm making a speech tonight.
00:08:32.660 And he told Jan Schakowsky, why don't you get her a ticket?
00:08:34.860 I was sitting there having breakfast with him and Rahm Emanuel.
00:08:37.860 I guess I'm, yeah, I'm 15 or 16 years old.
00:08:40.420 And I get this ticket.
00:08:42.920 And I go to see his famous speech about how we're not the red states and the blue states.
00:08:47.700 We're the United States.
00:08:49.100 And that's when I knew that he was going to be president.
00:08:53.060 It's the moment, you know.
00:08:54.360 Some people have the moment.
00:08:55.360 That was the moment where you heard him speak.
00:08:58.020 You said, this guy could be a president.
00:08:59.600 Now, at the moment you're sitting there having breakfast with him, do you have any idea who
00:09:04.540 he is going to be?
00:09:05.680 Like, are you getting a feeling of this is a very special guy.
00:09:08.520 He could go places.
00:09:09.380 Did you know that already?
00:09:10.340 Absolutely.
00:09:10.820 How did you know that?
00:09:11.580 You can feel it from the second that you meet him.
00:09:13.900 I mean, a lot of people only have the opportunity to see him through the TV screen, which is
00:09:18.900 unfortunate because when you meet him, he just has this aura about him where you can
00:09:23.480 tell that he is genuine, that he's powerful, and that he has true intentions.
00:09:29.280 Genuine, powerful, and true intentions.
00:09:31.680 That's a good combination.
00:09:32.980 Yeah, it is.
00:09:33.560 That's a good combination to be president one day.
00:09:35.540 Absolutely.
00:09:36.140 A two-term president one day, right?
00:09:37.640 Exactly.
00:09:38.460 So from there, what happens?
00:09:40.100 So you do that.
00:09:41.180 You work with him.
00:09:41.940 Then what happens next?
00:09:42.860 So I went to go finish high school.
00:09:45.780 I left the United States when George Bush got his second term.
00:09:52.740 That was my last year, and it was my first year going to Scotland.
00:09:56.600 So I started college there at Edinburgh University, and it was in my second year there when Barack
00:10:04.320 announced that he was going to be running for president, or Senator Obama, I should say.
00:10:08.980 And I thought, well, you know, university is great and all, but it's not as important as
00:10:15.640 this guy becoming president of the United States.
00:10:17.860 That's what you were thinking.
00:10:19.140 Yeah.
00:10:19.720 So I left university.
00:10:22.480 Actually, I told my professors that if I was going to leave, I was doing something important.
00:10:29.460 I'm going to go work on the U.S. presidential elections.
00:10:32.140 Aren't you so excited for me?
00:10:33.880 They said, oh, great.
00:10:34.600 You're going to work for Hillary Clinton.
00:10:37.380 It's like, no, no.
00:10:39.860 Senator Barack Obama, he's my senator.
00:10:42.340 I'm from the state of Illinois.
00:10:44.420 And they didn't know who he was.
00:10:46.100 So they said, well, it's really not a good idea for you to be leaving university in order
00:10:52.900 to, you know, just go work for someone that's not going to win.
00:10:57.600 They told you this?
00:10:58.480 Yes.
00:10:59.700 More than one professor.
00:11:01.000 They said, you know, if you leave now, the best grade you can get is, you know, a 60
00:11:05.040 or 70 percent because you'll be failing all your exams.
00:11:08.760 I said, okay, fail me then.
00:11:11.160 I'm leaving.
00:11:14.540 Was it almost like a proving a point thing to you or no?
00:11:17.900 Like, did you get that chip saying, what do you mean this guy's not going to win?
00:11:20.800 Did you get that feeling yourself or no?
00:11:22.740 I realized that it was going to be something that they looked back upon and regretted,
00:11:29.020 not me.
00:11:29.600 Did you ever go back and see them or no?
00:11:31.160 Yeah, I went back and finished my degree, of course.
00:11:33.360 So you had a conversation after he got elected.
00:11:35.980 Yes.
00:11:36.500 What was that conversation like?
00:11:38.940 It was more like, well, congratulations.
00:11:43.500 What are you going to say?
00:11:44.740 Yeah.
00:11:45.160 It happened.
00:11:46.300 Hillary's not the president, you know.
00:11:48.520 And the first one was McCain, right?
00:11:51.600 McCain and Palin.
00:11:52.960 You're doing your school and then what happens next?
00:11:55.920 I was in school while I got the opportunity to go study abroad in Hong Kong.
00:12:03.240 And I was very excited about this.
00:12:07.460 This was actually why I didn't apply for a job in the Obama White House.
00:12:11.960 Most of us that had worked on the campaign were offered that opportunity to continue and
00:12:18.200 move to the White House once we won.
00:12:19.780 And I said, well, I've been studying Chinese for so many years.
00:12:24.940 I got this amazing opportunity to go study in Hong Kong.
00:12:27.700 I've never been there.
00:12:29.040 It's really important for me to go and explore Asia after studying Mandarin for so long and
00:12:34.860 Asian history and Asian religion.
00:12:37.160 It's actually much more important to me to go do that.
00:12:40.060 You believe that, that it was much more important to you?
00:12:43.180 Yes.
00:12:43.460 Why is that?
00:12:45.980 Going out and seeing the world and experiencing new things was always kind of the core of
00:12:52.600 what I wanted to achieve in my life.
00:12:54.180 Makes sense.
00:12:54.560 See as much and explore as much as I could in my lifetime.
00:12:57.980 But also, I really thought, you know, if I'm going to come back and do the American
00:13:03.800 government any service, it's probably going to be more in a diplomatic role, ambassador.
00:13:08.980 Me doing communications for the White House is not a good use of my time, actually.
00:13:15.820 So I did go to Hong Kong and very quickly got wrapped up in the human rights world there,
00:13:22.420 which is very topical now, of course.
00:13:24.780 But this was 2008.
00:13:28.040 Well, 11 years ago.
00:13:29.240 Now it's all they talk about.
00:13:30.860 Yeah, exactly.
00:13:31.600 And in 2008, the Chinese government was trying to push something called Article 8, which would
00:13:38.760 have given the Chinese Communist Party a veto over anything that was decided in the Hong
00:13:43.420 Kong parliament.
00:13:44.900 And this obviously put the people in Hong Kong in an uproar because the entire point of them
00:13:50.400 being a special administrative region is that they are allowed to administrate their own
00:13:54.720 region.
00:13:55.240 They have their own parliament.
00:13:56.500 They make their own decisions.
00:13:57.440 And I would spend a lot of time marching in the streets.
00:14:02.240 It looks kind of similar to the things that you've seen recently, the big protests, you
00:14:07.100 know, out with umbrellas so the cameras can't see your face, people wearing masks, and protesting
00:14:12.700 and demanding that the parliament get rid of this or the Chinese Communist Party to stop
00:14:18.040 pushing that.
00:14:19.240 It's not as bad as some of the bills that they tried to put through this past year, which
00:14:26.800 is why the protests now are a lot worse.
00:14:28.960 But it definitely became a core part of what I did while I was a student there.
00:14:32.980 And I started really getting into human rights and luckily met some incredible people that
00:14:39.820 did a lot of work at the United Nations and at the European parliament.
00:14:42.920 And I thought, well, you know, I'm studying international relations.
00:14:48.000 Actually, human rights is what I want to do because the people that I'm meeting here and
00:14:52.680 working with every day are very inspiring.
00:14:54.980 Are you, when you're going through that with being inspired to do work with human rights,
00:15:01.320 the deeper you get, is it getting more troubling to you?
00:15:03.980 And is it more unsafe to talk about it openly that we need to do something about this?
00:15:09.040 Did you get any of that feeling there or no?
00:15:10.820 The more you find out, the worse it is, definitely.
00:15:14.040 That's, yeah.
00:15:15.180 Definitely.
00:15:16.120 In China specifically?
00:15:17.800 China specifically has always been a place that I've concentrated on in terms of my research
00:15:22.420 and my work originally because I thought it was one of the greatest civilizations on earth
00:15:27.880 and I wanted to go live in China and know everything about it.
00:15:31.280 And then when I started understanding the politics there and the way that people are treated
00:15:35.840 and the way that minorities are targeted, I thought, this is a place I don't want to be,
00:15:43.240 but it's a place in need of reform and maybe I can help that.
00:15:46.660 What made you believe it's one of the greatest civilizations?
00:15:49.900 Was it the education?
00:15:51.080 Was it the school that you went to?
00:15:52.320 Was it a professor that painted China to be a great place and America's not as good as China?
00:15:57.060 Was there any influence there or was it yourself?
00:15:58.940 I, well, originally it was actually my grandfather who spent 27 years in military intelligence
00:16:06.440 and originally in the infantry in Korea and in Vietnam.
00:16:14.400 He was a paratrooper and then went into military intelligence for a long time.
00:16:18.900 And when I was in eighth grade preparing to go to boarding school, I told him,
00:16:23.620 Gramps, so I want to take Japanese.
00:16:27.800 I'm given this amazing choice of all these different languages
00:16:30.540 and I want to take something that most of my friends are not going to get the opportunity to take.
00:16:35.140 And he said, no, no, that's not what you want to take.
00:16:38.440 I see they also have Chinese as a choice.
00:16:40.460 You should take Chinese.
00:16:42.780 Like, okay, explain to me why.
00:16:44.760 He goes, this is one of the most powerful civilizations on earth
00:16:48.180 and by the time you are in the working world,
00:16:52.660 probably at least a fifth or even a quarter of the world is going to be speaking this language
00:16:57.260 and it's going to be much more important than Japanese.
00:17:00.120 And you need to know that now.
00:17:01.760 This was 2000, I suppose.
00:17:04.180 Was he MI active at that time or no?
00:17:06.360 Was he military intelligence?
00:17:07.620 Was he still active or no?
00:17:08.820 Are you ever inactive?
00:17:11.120 Well, I'm saying the 27 years.
00:17:11.980 Once you joined intelligence.
00:17:13.500 Once the 27 years.
00:17:14.820 Was he still during his 27 years or is it post?
00:17:17.520 No, technically retired, but he would still get calls when assistance was required.
00:17:24.540 I mean, 27 years of experience.
00:17:26.040 They're going to be able to use the intel and experience you got,
00:17:28.440 so they're going to call you.
00:17:29.400 That makes sense.
00:17:30.340 So it's very, so was your grandpa a big inspiration into being who you are today?
00:17:36.320 Definitely.
00:17:36.920 He was.
00:17:37.860 So very close.
00:17:38.560 That's your mom's dad.
00:17:39.600 Yes, of course.
00:17:40.600 Makes sense.
00:17:41.020 So, okay, so now you're in China.
00:17:43.000 You're kind of seeing what's going on.
00:17:44.280 The deeper you get, the more concerned you are.
00:17:46.920 What do you do next?
00:17:48.780 So I had to go finish the fourth year of my degree at Edinburgh University.
00:17:54.240 This was my third year abroad in Hong Kong.
00:17:57.100 So I came back to finish my, it was technically a master's with honors in international relations,
00:18:03.040 and I was doing international law and Chinese as the two kind of other main components besides
00:18:09.800 just international relations.
00:18:11.200 And so I came back and I ended up writing my entire master's thesis on human rights in
00:18:16.180 China, on Falun Gong persecution, on the illegal organ harvesting trade, on persecution of
00:18:22.800 Uyghurs, you know, Muslim minorities, and the political prison camps.
00:18:27.900 Which is not just something I learned in Hong Kong.
00:18:33.060 The people that I met there gave me accreditation to go participate as a human rights lobbyist
00:18:38.520 at the European Parliament and at the United Nations in Geneva.
00:18:41.380 So I spent a lot of time there with a lot of the world's top experts, a lot of people who
00:18:46.820 were considered defectors who had escaped political prison camps or re-education through
00:18:52.280 labor camps, they call it.
00:18:54.040 And I was just kind of blown away by some people's stories and the gravity of the problem because
00:19:00.620 it looked very much like, well, you know, I grew up Jewish and going to a Jewish school
00:19:06.860 and it looked very much like a lot of the stuff that I had studied about the Holocaust growing
00:19:10.680 up.
00:19:11.520 And I was shocked to find out that there were things like that still going on in the modern
00:19:15.760 world.
00:19:16.380 You know, I'm still very young, a teenager, and starting to figure all of this out.
00:19:21.360 And I thought, well, this is a topic that the world needs to know about.
00:19:25.940 What do people think about when you compare that?
00:19:29.260 I mean, I'm sure it's not the first time you've said it had signs of a Holocaust type of
00:19:33.400 tendencies.
00:19:34.560 What do people tell you when you share that with them?
00:19:38.020 This was, I don't want to say a taboo topic, but still a very minority group of researchers
00:19:45.680 and human rights activists that were involved in exposing this type of, these types of abuses
00:19:52.740 back then.
00:19:54.340 Now, it's a little bit more mainstream.
00:19:57.700 More people understand it.
00:19:59.520 It comes up a lot more in the press.
00:20:01.540 It comes up a lot more in diplomatic conversations when countries like the United States are thinking
00:20:08.520 about their relationship with China.
00:20:10.940 And it's no longer, I would say, a topic that you would have to be in a tiny meeting room
00:20:18.920 in some far-flung wing of the European Parliament building to know about.
00:20:24.180 Now, it's a lot bigger.
00:20:26.260 So, I'm happy that so many people have followed that through and done something and put pressure.
00:20:33.080 But I definitely think we're hopefully starting to live in a world where things like that cannot
00:20:40.140 go unseen anymore.
00:20:42.060 What do you think about when politicians protect China?
00:20:44.400 You know, sometimes people are too careful.
00:20:45.920 Like, even today, like, there's, I'll have guests that'll come here and I'll interview
00:20:50.220 them.
00:20:50.740 And they'll say, there's only one thing I can't talk about.
00:20:53.540 What's that?
00:20:54.400 China.
00:20:55.440 This is the one topic I can't talk about.
00:20:57.360 China.
00:20:57.740 What is it with so many politicians, business folks, you know, anywhere that has any kind
00:21:03.500 of link to China that's not the one topic they want to talk about?
00:21:05.740 It's because they're afraid.
00:21:07.200 Afraid of what, though?
00:21:08.100 Afraid of Chinese money being pulled out of their businesses.
00:21:11.980 Afraid of Chinese companies not wanting to do business with them.
00:21:15.340 Afraid of Chinese spies following them around.
00:21:18.560 Trust me, I've had that happen to me.
00:21:20.020 Chinese spies following you around.
00:21:21.680 Absolutely.
00:21:22.740 Definitely.
00:21:23.800 Especially at all of the meetings at the, you know, European Parliament and the UN where
00:21:28.640 China is discussed.
00:21:30.000 There's plenty of people there where they're wearing badges that aren't even their real name.
00:21:33.980 You don't know where they came from.
00:21:35.200 They're not from a human rights organization.
00:21:36.820 It's pretty obvious.
00:21:38.980 But a lot of people are afraid of that.
00:21:40.840 And I, you know, I totally understand.
00:21:43.800 You know, I totally understand.
00:21:46.600 But if you're not willing to stand for something, then you stand for nothing.
00:21:51.280 I agree with that.
00:21:52.780 And it's, in a way, empowering them to know that you're afraid of them.
00:21:59.500 To me, it's exactly what they want.
00:22:01.580 They want to impose the fear so you stop talking about it.
00:22:06.000 Because they'll threaten you in their own ways.
00:22:09.000 I mean, you've got sports.
00:22:10.180 You've got NBA being hit by it.
00:22:11.560 You've got business people being hit by it.
00:22:12.800 You've got politicians being hit by it.
00:22:14.220 Yeah.
00:22:14.440 Very interesting what's going on right now with them.
00:22:16.060 So how does this whole thing come about?
00:22:19.140 Let's go right into Cambridge Analytica.
00:22:21.100 That's been your experience.
00:22:22.020 You've done a lot of different things.
00:22:23.100 You've worked on a lot of different projects.
00:22:24.320 How was the transition from where you're at to all of a sudden saying, I'm going to work
00:22:28.140 with Alexander.
00:22:28.760 I'm going to go and work with Cambridge Analytica.
00:22:31.320 How did that happen?
00:22:32.520 Right.
00:22:32.820 So I was still going back and forth between my academic work and my activism, which was
00:22:38.840 all, you know, pro bono.
00:22:40.780 Spending my student loan to get myself to Geneva, to the UN.
00:22:43.760 And I'm in the third year of my PhD.
00:22:47.360 I'm writing a doctorate on something called preventive diplomacy, which means that the
00:22:54.060 people in the world that hold the most diplomatic power, so heads of state, presidents, prime
00:23:00.160 ministers, ambassadors, preventive diplomacy is how they can intervene in situations before
00:23:08.700 massive atrocities happen.
00:23:10.840 So how do you stop war before it happens?
00:23:13.160 How do you, you know, intervene in a country's economy before there's massive inflation or
00:23:20.140 a famine?
00:23:20.880 There's all of these different factors.
00:23:23.900 And somehow my entire doctorate, or at least the third chapter that I was working on when
00:23:28.900 I met Alexander, was all about how you could use big data and predictive algorithms in order
00:23:34.980 to predict the future and find out that any of these atrocities or war or violence or any
00:23:43.520 other problems in society, find out that they're going to happen before they do so that you can
00:23:48.200 intervene early on and prevent it.
00:23:50.200 Now, no one at my law school could teach me about big data or predictive algorithms.
00:23:58.380 So one of my friends introduced me to the CEO, Alexander Nix.
00:24:04.100 That's actually the first chapter of my book.
00:24:06.740 And I thought, hey, this is very interesting.
00:24:11.900 He's using data for defense.
00:24:14.360 He's using data for politics.
00:24:16.780 He's using data for humanitarian purposes.
00:24:19.160 Okay, this is the type of stuff that I need to learn if I'm ever going to finish my PhD.
00:24:26.400 So let's see if I can get a bit of consultancy work and get an income besides my student loan
00:24:33.100 payments and see where this goes.
00:24:36.620 How is he pitching you?
00:24:37.920 When you first meet him, how impressive of a guy was he?
00:24:40.980 I know you talked about when you met President Obama, you know, presence, character, power,
00:24:45.920 all of that, you know, combined together.
00:24:47.620 What was Alexander Nix's personality the first time you met him?
00:24:52.000 I really thought that he was someone that was, I think I'd describe him as, you know, very posh.
00:25:03.360 The type of Englishman that didn't usually describe the friends that I had on a day-to-day basis,
00:25:10.380 but someone that was so incredibly privileged that he had extreme expertise.
00:25:16.140 He knew about things in the world because of his access to powerful people,
00:25:21.260 because of probably tons of projects that he had undertaken around the world.
00:25:26.900 And he had power.
00:25:29.560 So it wasn't that kind of awe-inspiring Barack Obama feeling.
00:25:35.480 No, nothing like that.
00:25:37.000 But it was definitely that feeling where you know that this is a person that really knows
00:25:42.780 what they're talking about, that they are very powerful, and they have the network to
00:25:46.440 do what they say that they can do.
00:25:48.260 And that has a different sort of aura.
00:25:52.880 That has a different sort of pull, I suppose.
00:25:56.560 It is magnetic, just in a different way.
00:25:59.860 So he recruits you, and you decide to go run with him.
00:26:02.060 What happens next?
00:26:02.800 So I joined this company, and immediately I'm introduced to some of the people that I suppose
00:26:11.980 I would have considered my mentors in most situations.
00:27:16.540 Cambridge Analytica, or the SCL Group, Strategic Communication Laboratories, which was what it
00:26:22.080 was called at the time, the parent company.
00:26:24.560 They had a lot of other people that had spent their life in human rights work and in humanitarian
00:26:31.980 operations, people from the International Rescue Committee and the United Nations and diplomats
00:26:38.660 from the Commonwealth.
00:26:41.300 And wow, I thought, these people are amazing.
00:26:44.000 I can't wait to work with them.
00:26:46.400 We started working on all of these different projects that were both humanitarian, some were
00:26:51.260 defense, and some of them were politics in countries that I had never been to before, or I didn't
00:26:57.380 know much about.
00:26:58.380 So it was a very steep learning curve when I first got there.
00:27:02.020 And what are you doing?
00:27:02.700 What is your role?
00:27:04.080 Is it biz dev?
00:27:05.140 Are you mainly going building relationships?
00:27:07.780 You're not necessarily doing the data analytics.
00:27:10.220 What are you doing within Cambridge Analytica?
00:27:12.000 Right.
00:27:12.360 So I had no data experience before, really, not much.
00:27:16.540 I mean, obviously, working on the Obama campaign, I had a digital and social media strategy
00:27:21.360 experience, but not data analytics and data science.
00:27:25.620 So my role in the company was business development.
00:27:28.720 So I would go out and build relationships with people who could use data analytics or were
00:27:35.940 approaching us that they wanted to use data analytics and figure out what their goals were,
00:27:40.380 what capacity they had, what they're trying to achieve, and help design a program.
00:27:46.680 So that usually included me helping design a proposal of how they were going to use data,
00:27:52.980 how that would help them achieve their goals, and then moving that through to contract and
00:27:57.940 helping build the team that would go do that for them.
00:28:00.520 So are you selling or are you pitching somebody else's selling and closing?
00:28:04.020 I'm usually, well, in the very beginning, I was working with Alexander or Dr. Alex Tayler,
00:28:14.060 who was the chief data scientist at the time, and they would do the main pitching in the
00:28:19.100 beginning until I was at the company for long enough to start pitches on my own.
00:28:23.320 But that took me about six months to learn enough from every department to actually be
00:28:28.340 able to go do that by myself.
00:28:30.340 Eventually, the goal-
00:28:31.020 That's still impressive, though, six months later for you to start pitching and sitting
00:28:33.560 down with these high-profile people.
00:28:35.920 That takes a lot to be able to do.
00:28:37.760 How old are you at that time when you're doing this?
00:28:39.560 This is-
00:28:40.240 26.
00:28:41.440 You're 26 when you're doing this.
00:28:42.620 Yes.
00:28:43.000 And you're sitting with some of the most powerful people in the world.
00:28:45.220 Yes.
00:28:45.660 And you're getting them to agree to do business with you guys and spend millions of dollars
00:28:50.520 on advertising.
00:28:51.440 Yes.
00:28:52.840 It's pretty impressive to be able to do that.
00:28:55.360 So you're going, you're building relationships, you're bringing them in, they're signing the contract.
00:28:59.100 Is it purely a marketing strategy pitch that you're talking to them about?
00:29:05.360 Or is it, we're the best of the best, we're the only game in town?
00:29:08.180 Like, did you guys almost have a monopoly in that play or no?
00:29:10.740 So in some respects, the SCL group had so much global experience in running political campaigns
00:29:20.100 that that was the big pitch at the time.
00:29:24.280 Oh, look at what we've done all around the world, all of these huge campaigns in countries
00:29:29.600 with up to hundreds of millions of people or, you know, tiny island nations.
00:29:34.500 Any size of political campaign, we can figure out a strategy, we can execute, and we can
00:29:39.700 win if you give us enough time and funding.
00:29:42.540 And that was kind of the big play because on the data analytics side, that was very new.
00:29:47.700 I joined the company in 2014.
00:29:49.560 And only in 2013 did they birth Cambridge Analytica, which was to be the North American subsidiary
00:29:59.700 of the SCL group, specifically because in the United States, there weren't any data laws
00:30:05.300 or regulations.
00:30:06.540 And therefore, the amount of data that you could purchase and license to do predictive analytics
00:30:12.460 was unmatched anywhere else.
00:30:15.580 So let's unpack the company.
00:30:18.060 Who is behind it?
00:30:19.180 Because you see a lot of names that come behind Cambridge Analytica.
00:30:23.540 Who was the power, the money behind the brand?
00:30:27.360 So for Cambridge Analytica, specifically, it was the Mercer family and Steve Bannon.
00:30:33.140 And they came in in 2013 in order to basically have a separate but like wholly owned subsidiary
00:30:42.420 of the SCL group that would just operate in North America.
00:30:45.800 And they were really specifically concerned with the United States, obviously.
00:30:50.220 So Alexander had pitched them, here's all of my political experience from all around the
00:30:55.220 world.
00:30:55.580 Here's what my company can do.
00:30:57.280 I want to start a data analytics company.
00:30:59.340 And he had decided that because Sophie Schmidt, the daughter of Eric Schmidt of Google, had
00:31:06.360 interned for him in 2010, I believe.
00:31:09.520 Or maybe it was 2013.
00:31:10.700 Sorry, I'll have to double check in my book.
00:31:13.740 But she had interned for him and every day had shown him what Google Analytics was doing.
00:31:19.240 And that predictive analytics was the future.
00:31:22.160 So he thought, as soon as she left, I'm going to go build my own data analytics company, not
00:31:28.660 just in the eyes of Google, but I'm going to combine data and politics and supercharge
00:31:34.860 everything that I'm already doing.
00:31:36.480 Now, where does Facebook come into play?
00:31:38.680 When does the conversation of Facebook come in where the ability to get 5,000 different
00:31:45.920 points of tracking of this person's known for this and that person's known for this?
00:31:52.020 When does Facebook come into play?
00:31:53.920 Right.
00:31:54.280 So, again, that's also in 2013.
00:31:56.620 It was part of the founding strategy of Cambridge Analytica, which was to begin to build one
00:32:02.560 of the largest databases that anyone had ever seen.
00:32:06.740 And this included harvesting data off of Facebook, which at the time was pretty easy to do.
00:32:12.780 Facebook had started a program, I believe, in 2010 where you could, as a developer, pay
00:32:18.860 for access to any of the data for Facebook users.
00:32:23.000 You just had to create an application.
00:32:25.040 You could create something like a game, like Candy Crush or FarmVille, or you could create a quiz.
00:32:31.580 The famous one is, this is your digital life, which is one of the first ones that Cambridge
00:32:36.100 made.
00:32:36.620 But you probably also would recognize ones like, what country should you really be living
00:32:42.080 in?
00:32:42.880 Or, who's your favorite Disney princess?
00:32:46.060 And so those...
00:32:47.220 This has given you data.
00:32:48.540 This has given you information.
00:32:49.880 Absolutely.
00:32:51.020 I'm sure most people never read the terms and conditions of those apps.
00:32:55.280 But if you did, you would have seen that it gave the developer of that application, not
00:33:02.380 only your data, but the data of everybody else in your network, which unfortunately is
00:33:09.460 not actually legally possible.
00:33:12.080 It's not legally possible.
00:33:13.620 It's technically possible, but not legally.
00:33:16.400 I cannot consent on behalf of another able-bodied adult.
00:33:19.920 They have to consent on their own behalf.
00:33:21.600 How does that make it legal, though?
00:33:23.200 It's not.
00:33:23.680 So how were they able to do it?
00:33:25.620 I mean, Facebook's not a small company when they're doing that.
00:33:28.080 Facebook created the Friends API, and the Friends API allowed people to do that.
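The asymmetry she is describing, one user's consent fanning out to their entire friend network, can be sketched as a toy model. All function and field names below are hypothetical; this is not Facebook's actual API, just an illustration of how a Friends-API-style program could work:

```python
# Toy model of the pre-2015 Friends API asymmetry described above.
# All names here are hypothetical, not Facebook's real endpoints.

def harvest(installers, friends_of, profile_of):
    """Collect every profile reachable from users who installed the app.

    installers: ids of users who took the quiz (the only ones who consented)
    friends_of: id -> set of friend ids
    profile_of: id -> profile record
    """
    collected = {}
    for user in installers:
        collected[user] = profile_of[user]
        # The Friends API also exposed the friend network, so one
        # consent swept in every connected profile:
        for friend in friends_of[user]:
            collected[friend] = profile_of[friend]
    return collected
```

With average friend counts in the hundreds, a few hundred thousand quiz-takers could plausibly yield tens of millions of profiles, which is roughly the scale later reported.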
00:33:33.020 How much money did that make them?
00:33:35.880 It's hard to estimate that.
00:33:38.160 It's hard to estimate that.
00:33:39.520 I actually don't have those numbers.
00:33:41.740 But originally, they thought this is, in order to have a developer program, you know, they
00:33:47.520 dangle a carrot in front of people.
00:33:49.840 Oh, you get access to all of this data in order to improve your product and get more users
00:33:54.580 and whatever it is.
00:33:56.200 And, you know, you pay us a nominal fee for that.
00:33:59.460 They had at least 40,000 developers participating in this program.
00:34:03.300 So whatever the fees were from 40,000 different companies, at the time, that was good enough
00:34:08.620 for Facebook.
00:34:09.620 But again, they were still a bit young.
00:34:12.740 It was only a couple years later where they realized our data that we're collecting off
00:34:18.200 our users is so incredibly valuable that if we don't give anyone access to it and they
00:34:24.060 have to advertise in Facebook, then we're going to make a lot more money.
00:34:27.700 And that's why they closed it off in April 2015.
00:34:31.680 April 2015.
00:34:32.760 How long did it stay open?
00:34:35.020 About five years, I think.
00:34:36.680 Five years.
00:34:37.720 And a lot of people took advantage of it.
00:34:39.700 Absolutely.
00:34:40.040 So is Cambridge Analytica possible to rebuild today?
00:34:46.200 Not with the same Facebook data sets.
00:34:50.360 Obviously, those data sets are still out there.
00:34:52.460 They're all over the world and we can never get our privacy back because of it.
00:34:56.240 Or at least if you had a Facebook account before April 2015.
00:35:00.600 But you can build something very similar because the majority of Cambridge's data was not from
00:35:07.540 Facebook.
00:35:08.460 It was from big data aggregators like Experian and Infogroup, Magellan, Acxiom, and L2, which
00:35:17.020 is Labels & Lists.
00:35:18.560 It's a political data company.
00:35:19.860 Anyone, anywhere in the world, whether you're American or Russian or from anywhere, you can
00:35:28.400 buy that data.
00:35:30.540 You can just buy that data.
00:35:32.820 Those are, that's the lack of regulation in this country.
00:35:36.020 Anybody can buy that data.
00:35:37.860 I saw the number that the data is a trillion dollar a year industry.
00:35:41.220 Multi-trillion.
00:35:42.060 Multi-trillion dollar a year industry.
00:35:44.020 Absolutely.
00:35:44.220 Data is the world's most valuable asset now.
00:35:47.820 It runs all decision making and all user experience and all communications for every
00:35:55.400 organization, for-profit, non-profit, governmental.
00:35:59.780 You know, 10 years ago, TEDx did their convention.
00:36:02.360 I think it was in either Seattle or Canada.
00:36:04.960 This was like maybe 11 years ago or 10 years ago.
00:36:08.120 And one of the speakers got up and said, the future of business, anything you do, it's
00:36:13.220 all about data.
00:36:14.020 If you have data, you have value.
00:36:15.520 Absolutely.
00:36:15.960 Everybody's saying, what are you talking about?
00:36:17.220 You got oil, you got this, you got that.
00:36:18.700 Nope.
00:36:19.060 Data is the future of everything.
00:36:20.900 People that make a lot of money, it's going to be data companies.
00:36:23.820 So Facebook gets fined $5 billion.
00:36:25.920 What's $5 billion for Facebook when they pay $5 billion fine?
00:36:29.460 It's nothing to them because their stock price spiked from all the press and they made the
00:36:35.180 money back.
00:36:36.320 In no time.
00:36:37.060 Yeah.
00:36:37.620 $5 billion to Facebook is pretty much nothing.
00:36:40.000 But this is a point that I really want to make, which is that it's nothing to Facebook,
00:36:45.180 but it's a really big deal for the Federal Trade Commission.
00:36:49.440 Why is that?
00:36:50.500 Because the FTC has never had that type of budget and they're responsible for protecting
00:36:56.740 consumers, right?
00:36:58.000 So now they have $5 billion to play with.
00:37:00.320 Yeah.
00:37:01.380 I think that's a very good thing.
00:37:03.120 Interesting, the way you put it.
00:37:04.260 It might not be good for, it might not mean anything to Facebook, but it means a lot to
00:37:08.500 the government.
00:37:09.080 Yeah.
00:37:09.240 So what they do with it, now it allows them to go a little bit deeper with some other
00:37:14.180 companies that are maybe doing what they're doing.
00:37:15.820 I hope it doesn't just go to investigations.
00:37:18.580 I hope it goes into investing into technologies that can protect consumers.
00:37:23.300 Well, government's not famous for doing good with money, so we'll see what they're going
00:37:26.380 to do.
00:37:26.500 Hopefully they can stretch that $5 billion because you can do some damage with that.
00:37:29.940 Right.
00:37:30.160 Who is the modern-day Cambridge Analytica today?
00:37:32.960 Is there one?
00:37:33.960 I wish it was just one.
00:37:35.840 Many of them.
00:37:36.740 So many of them.
00:37:38.140 Is there a dominant one?
00:37:40.040 Not really.
00:37:41.420 I would say a lot of them are still quite small.
00:37:45.600 There's even a lot of them that came out of Cambridge Analytica.
00:37:48.620 A lot of my former colleagues just started new political consultancies.
00:37:52.320 And recently, I think it was two, maybe three months ago, the University of Oxford put out
00:38:00.140 this really scathing report that showed companies even worse than Cambridge Analytica popping up
00:38:06.900 all over the world, companies that could be described as propaganda as a service, which
00:38:12.960 I could say some of that is very relatable to what I saw at Cambridge Analytica, but some
00:38:17.840 of it is worse.
00:38:18.960 They have disinformation and fake news as a service.
00:38:24.400 They have that as a service.
00:38:25.740 Yeah.
00:38:26.500 Disinformation and fake news.
00:38:28.140 Yeah.
00:38:28.460 Disinformation and fake news.
00:38:30.560 Definitely saw some disinformation at Cambridge Analytica, but it wasn't really a core competency.
00:38:37.600 But it was used in the Trump campaign for sure.
00:38:41.000 And also, a lot of these companies now offer bot farms and making mass amounts of fake accounts
00:38:48.900 and troll factories and all of these different things that, you know, Cambridge Analytica,
00:38:56.340 to my knowledge, never did at all.
00:38:58.720 Cambridge Analytica never did at all.
00:39:00.380 No.
00:39:00.680 So you guys didn't say this.
00:39:01.820 You didn't say, I'll have some fake news for $2 million.
00:39:05.340 I'll have some disinformation for $5 million.
00:39:07.760 And give me some of these troll bots to help me out.
00:39:11.760 No.
00:39:12.240 A lot of people, unfortunately, confused what was coming out of Russia and what was coming
00:39:16.940 out of Cambridge Analytica.
00:39:18.580 So that was coming from them.
00:39:20.460 It wasn't coming out of Cambridge Analytica.
00:39:22.260 Absolutely.
00:39:22.700 So can you explain to the rest of us what is behavioral micro-targeting, traditional micro-targeting,
00:39:29.240 and demographic polling?
00:39:30.700 Of course.
00:39:31.900 So let me start with demographic polling.
00:39:35.100 Sure.
00:39:35.460 So demographic polling will mean that I'm going to go out and ask questions of the population,
00:39:43.740 and I'm going to make sure that my polling addresses the correct amount of men and women,
00:39:51.640 different age groups, different ethnicities, different belief structures, different parts
00:39:58.220 of the country or states that people live in according to the population.
00:40:03.300 So it's weighted properly.
00:40:04.980 And I'm going to ask opinion questions like, do you like Donald Trump?
00:40:09.480 Right?
00:40:10.440 And that's all well and good, but that doesn't give you that much information.
00:40:17.880 I'll give you a basic poll.
00:40:19.180 Okay, you know, less than 50% of the country likes Donald Trump right now.
00:40:23.160 Okay, what do you do with that for political communications?
00:40:28.160 That doesn't help anything.
00:40:29.540 It kind of just tests the water to see where people are at.
00:40:34.120 What is the national feeling, right?
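The weighting she mentions ("so it's weighted properly") can be sketched as simple post-stratification. The groups and shares below are invented for illustration; real polls weight across many demographic dimensions at once:

```python
from collections import Counter

def weighted_approval(respondents, population_share):
    """Post-stratification sketch: re-weight each answer so the sample
    matches known population shares before computing an approval rate.

    respondents: list of (demographic_group, answered_yes) pairs
    population_share: demographic_group -> share of the real population
    """
    n = len(respondents)
    sample_count = Counter(group for group, _ in respondents)
    total = 0.0
    for group, answered_yes in respondents:
        # Each respondent counts for pop_share / sample_share of a vote.
        weight = population_share[group] / (sample_count[group] / n)
        if answered_yes:
            total += weight
    return total / n
```

So if one group is 75% of the sample but only 50% of the population, each of its answers is down-weighted to two-thirds of a vote, and the under-sampled group's answers are up-weighted to compensate.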
00:40:36.660 And so if you're going to go into, what did you ask for the next level?
00:40:41.900 Traditional micro-targeting or behavioral micro-targeting.
00:40:44.340 So traditional micro-targeting is going to take all of these different people and put them
00:40:51.980 into groups.
00:40:52.880 And it's not just going to be, you know, all right, I'm going to talk to all of the women.
00:40:59.200 I'm going to talk to all of the youth.
00:41:01.420 But sometimes it does look like that.
00:41:03.280 It might just be like, okay, I'm going to talk to all of the conservative women.
00:41:07.560 I'm going to talk to all of the young people who care about the environment.
00:41:10.420 I'm going to talk to all of the conservative Hispanics in this state.
00:41:16.480 And so it's putting together basic demographic categories and going a little bit further.
00:41:25.300 And then you will have a handful of different campaigns that will come out of a political
00:41:29.980 campaign targeted at those different types of people.
00:41:33.560 Such as. Give me an example.
00:41:34.060 So again, like, we're going to talk to youth about the environment.
00:41:38.680 Okay, great.
00:41:39.560 Well, that might mean that I'm not going to make any other environmental campaigns because
00:41:44.220 I've seen that only youth are active or are talking about these topics when I've done
00:41:49.080 polling.
00:41:50.100 So we're going to talk just about environmental initiatives to the youth because they're going
00:41:54.480 to be the people that are actually going to get active.
00:41:56.740 They're going to come out to events.
00:41:57.980 They're going to go vote for me because of my environmental policy, right?
00:42:01.940 And so that's what that will look like.
00:42:05.420 Now, when you go into behavioral micro-targeting or, you know, I would say real micro-targeting,
00:42:12.920 this is when you don't have to pick and choose your campaigns here and there.
00:42:17.920 You don't just have to choose, you know, okay, five or ten big topics targeted at different
00:42:22.980 groups of people.
00:42:24.040 You can target every single person in America if you want to.
00:42:28.540 And you're going to target everybody according to how they view the world.
00:42:33.740 So instead of just the youth campaign on the environment, I'm going to have a different
00:42:39.860 campaign towards youth who are open-minded and extroverted.
00:42:44.940 They're going to have a campaign about how you can get involved, how you can help stop
00:42:49.180 climate change because they're going to go and they're going to take an action.
00:42:53.600 And they're going to go and they're going to share that on social media.
00:42:56.140 And you can find something very specific for those people to get active that is about our
00:43:01.380 hope for the future and how the, you know, the planet for our children can be better than
00:43:05.460 it is for us today.
00:43:07.240 Now, there's going to be a whole different group of other youth that also care about the
00:43:13.520 environment, but they're introverted and neurotic.
00:43:16.860 So you're going to play instead on their fears about the environment.
00:43:20.920 You're going to show them those pictures of a polar bear standing on a tiny piece of
00:43:25.460 ice with melting ice caps in the background.
00:43:28.960 You're going to show them the giant floating island of plastic and all of the dead birds
00:43:35.340 and fish around it.
00:43:36.660 You're going to use fear-based messaging in order to drive them to action or to care about
00:43:42.520 your environmental policy.
00:43:43.900 And when you're trying to do something positive in the world, I guess that doesn't sound as
00:43:50.880 bad when I phrase it on a let's save the environment type of platform.
00:43:57.140 But it starts to get really bad when you talk about a different topic.
00:44:01.280 Let's talk about registering to vote.
00:44:03.100 What if I saw that the open-minded and extroverted people could be encouraged to go vote, but
00:44:11.200 the neurotic people could really easily be encouraged to not vote at all?
00:44:17.360 Because I could instead make them afraid of politics.
00:44:22.060 I could make them disengage with government.
00:44:26.140 I could make them feel like their government has never done anything for them, so why should
00:44:30.460 they care?
00:44:30.960 So you could flip and forget, like, let's win by not getting the guys to come out and
00:44:37.400 vote and impose fear that this may not even happen or there's not even a chance or don't
00:44:43.520 even worry about coming out and voting.
00:44:46.080 That's horrible.
00:44:46.940 So you guys went that deep into it?
00:44:49.300 That's what I was shown in the two-day debrief that I talk about in my book that I was given
00:44:55.660 a month after the Trump campaign.
00:44:57.060 All my colleagues that had worked on the Trump campaign and the Trump Super PAC showed me
00:45:02.440 the strategies that they used when we were wondering, okay, what did you guys do?
00:45:07.420 How did you win this?
00:45:09.160 This is a crazy political upset that no one in history is ever going to forget.
00:45:13.020 What did you do?
00:45:13.760 And we thought we'd see some pretty cool stuff with numbers, engagement.
00:45:21.200 How did you get there?
00:45:22.480 What were they clicking on?
00:45:23.640 What were the tools that you used?
00:45:26.380 There was a little bit of that.
00:45:27.820 But there was also some really dark stuff that they showed us.
00:45:31.860 And voter suppression tactics were one of them.
00:45:34.920 And they showed us these charts of how they labeled different groups of people.
00:45:39.920 You know, like I said, you know, the neurotic youth environmentalist or, you know, the conservative
00:45:47.220 Hispanic Texan.
00:45:50.400 There's, you know, these different groups.
00:45:52.840 Great.
00:45:53.180 But they found groups of people who were going to vote for Hillary Clinton.
00:45:59.060 They couldn't ever be convinced to vote for Donald Trump because the way that modeling
00:46:04.200 works in politics is you're going to show how likely people are to vote, how likely they
00:46:09.240 are to support certain candidates, what issues are most important to them.
00:46:12.740 Those are kind of the main crux of political modeling.
00:46:17.340 So if you find the group of people that might vote if they're shown the right message or they
00:46:24.780 might not if they're shown the wrong message and they're on the Hillary Clinton side,
00:46:29.060 then the cheapest way that you can win with those people is by getting them not to go
00:46:34.460 to the polls at all, because no amount of money is ever going to get them to vote for
00:46:38.720 Donald Trump.
00:46:39.800 The cheapest way is to get them to stay home.
00:46:42.420 Yes.
00:46:44.000 That's the way that these tools are designed.
00:46:46.860 How much did the campaign folks know that these were some of the tactics being used?
00:46:53.480 Oh, they were very aware.
00:46:56.080 The target group was called Deterrents.
00:46:59.060 To deter people.
00:47:00.960 For this part.
00:47:02.460 Yeah.
00:47:02.820 To prevent them from coming out.
00:47:04.040 Yeah.
00:47:04.500 Deterrents.
00:47:05.160 The charts that were used in the campaign headquarters, there was this big group of people on this
00:47:12.660 chart that were called Deterrents.
00:47:12.660 It's like an X and Y axis.
00:47:14.440 You know, the Trump people are over here.
00:47:16.540 Hillary people over here.
00:47:18.520 Very likely voters at the top and people who will never vote in their lives at the bottom.
00:47:22.840 So if you can find people who are in the middle, who may or may not vote, and they're definitely
00:47:27.360 on the Hillary Clinton side, there's only one way you're going to use your money.
00:47:31.660 So, and then you guys had also a group that you called Persuadables.
00:47:35.520 Yeah.
00:47:35.820 Is that a whole different category?
00:47:37.740 No.
00:47:38.580 Well, it's all on the same chart.
00:47:40.780 Okay.
00:47:40.920 So, the Persuadables are people that are in between Hillary and Trump.
00:47:46.660 Independents, libertarians.
00:47:48.000 I can go either way.
00:47:49.040 I can go Hillary.
00:47:49.680 I can go Trump.
00:47:50.220 Yeah.
00:47:50.500 Swing voters.
00:47:51.320 Right.
00:47:51.580 In brand advertising, they're called the switchers.
00:47:54.020 You can really easily get them to try a different brand.
00:47:57.520 That's the same thing in politics, except Persuadability is something that, you know,
00:48:02.900 it's very nuanced to measure.
00:48:04.620 Everyone's persuadable on some topics, but Persuadability on a presidential candidate is
00:48:12.360 a very specific type of person, right?
00:48:15.420 And finding those people and finding the Persuadables that are very likely to vote is where the majority
00:48:22.080 of the money always gets spent.
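The X-and-Y-axis chart described above maps onto two model scores per voter: candidate support on one axis, turnout likelihood on the other. The thresholds below are illustrative, not the campaign's actual cutoffs:

```python
def segment_voter(trump_support, turnout_likelihood):
    """Place a voter on the support (x) / turnout (y) chart described above.

    trump_support: 0.0 (solid Clinton) to 1.0 (solid Trump)
    turnout_likelihood: 0.0 (never votes) to 1.0 (always votes)
    """
    if 0.4 <= trump_support <= 0.6 and turnout_likelihood >= 0.5:
        return "persuadable"   # swing voter, likely to show up
    if trump_support < 0.4 and 0.3 <= turnout_likelihood <= 0.7:
        return "deterrent"     # opposition leaner who might stay home
    return "low priority"      # solid base or habitual non-voter
```

The economics she describes fall out of this segmentation: persuadables get persuasion spend, deterrents get demobilization messaging, and everyone else is too expensive to move.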
00:48:24.180 Now, which one of those groups was the most effective?
00:48:27.920 If you were to say, you know how, like, you got a market, and so what's your number one
00:48:33.300 selling product?
00:48:33.980 Milk.
00:48:34.220 Milk, okay.
00:48:34.860 And you know it's like the main thing.
00:48:36.480 You got In-N-Out.
00:48:37.200 What's the number one combo I sell is one?
00:48:38.920 Or, you know, McDonald's is Big Mac.
00:48:41.000 Right.
00:48:41.420 Which one was the most effective out of all those different groups?
00:48:45.180 From all of the case studies that I saw, the most effective were increasing intent to
00:48:52.640 vote for Donald Trump with Persuadables and decreasing intent to vote for Hillary.
00:49:01.020 Decreasing intent to vote for Hillary.
00:49:03.080 So the deterrents weren't necessarily the biggest ones that helped you out.
00:49:06.860 I mean, it's a bit of both.
00:49:09.580 When you're winning an election with tens of thousands of votes in some states, every
00:49:16.160 little bit is something that needs to be considered.
00:49:18.100 Now, let me ask you, what other clients did you guys have at that time?
00:49:23.060 I know I saw Brexit.
00:49:24.320 I think you guys were working with Brexit as well.
00:49:26.060 The Leave.EU campaign, yes.
00:49:27.480 Who else?
00:49:28.140 What other major campaigns?
00:49:29.380 I mean, there were campaigns all over the world.
00:49:32.140 I mean, Uhuru Kenyatta in Kenya, working on the last presidential elections in Mexico,
00:49:40.080 working on, I mean, even in the state of Texas for Senator Ted Cruz's primary, working in Romania,
00:49:49.340 working for other parties in the United Kingdom, working for, I mean, really about 50 different
00:49:57.000 countries over my time there.
00:49:59.440 Simultaneously?
00:50:00.500 Yeah.
00:50:00.940 How big is the team at the time?
00:50:03.260 By the time we reached our peak, we were about 120 people full-time and another 30 to 50 consultants
00:50:13.860 around the world that would come on for kind of ad hoc projects.
00:50:17.760 That's not a lot.
00:50:18.780 No.
00:50:19.020 That's a small group.
00:50:20.680 And you guys were about to be, but I think the CFO, Julian, said you guys were about to
00:50:25.280 be a billion-dollar company.
00:50:27.240 That's what everyone was aiming for.
00:50:30.420 Was that in the talks?
00:50:31.420 Was that the conversations behind closed doors?
00:50:33.300 Or not really?
00:50:33.880 Yes.
00:50:34.420 Alexander talked about that every day.
00:50:36.120 We're building a billion-dollar company.
00:50:38.380 Don't you want to be a part of it?
00:50:40.240 So he was a visionary.
00:50:41.940 Absolutely.
00:50:42.340 Now, you also said in an interview that three and a half years of experience working with
00:50:49.220 him, you didn't have a bad experience.
00:50:51.480 You said you actually enjoyed working with him.
00:50:53.420 Like, it wasn't like it was, you know, something where you said, this guy was my friend.
00:50:58.120 He was my mentor at one point, right?
00:51:00.220 And then things turned.
00:51:01.320 I thought he was.
00:51:02.320 Was there a dark side of working with him or no?
00:51:04.720 Yes.
00:51:05.520 What was that?
00:51:05.960 He was very volatile.
00:51:07.420 Okay.
00:51:07.700 So he'd be the type of person where out of one side of his mouth, he's saying, you know,
00:51:13.640 what bar are we going to tonight after our last meeting?
00:51:17.080 And out of the other side of his mouth, he's screaming at you because we lost a contract
00:51:21.800 to somebody else.
00:51:24.020 But he would also be the type of person that would say, oh, I only, like, yell or get upset
00:51:28.860 to make a point.
00:51:30.020 And then almost like Jekyll and Hyde, he would switch his personality.
00:51:34.340 And then he's like, okay, let's go out to dinner.
00:51:36.860 And so that was always kind of jarring and like an emotional roller coaster.
00:51:42.120 And I always thought, okay, well, you know, I've never worked for, you know, a for-profit
00:51:47.280 company really before.
00:51:49.220 Maybe this is what it's like because it's a bit ruthless.
00:51:52.640 It's cutthroat.
00:51:53.520 You're trying to make money.
00:51:56.260 You know, I had never done that.
00:51:58.020 I had spent my whole life in academia or working for nonprofits and advocacy organizations
00:52:04.620 and charities.
00:52:05.400 So I was like, okay, well, he's building a billion-dollar company.
00:52:09.440 This must be what it's like.
00:52:11.520 I don't know.
00:52:12.620 Like, this is supposed to be normal, like working with somebody like this.
00:52:15.720 Right.
00:52:16.240 Yeah.
00:52:16.620 And by the way, many of them are wired like that.
00:52:20.200 Right.
00:52:20.460 Many of them are wired like that.
00:52:21.860 But, you know, you had a lot of people talk about the fact that behind closed doors, Hillary
00:52:26.800 had a little bit of that herself.
00:52:28.980 And he had a lot of people talking about behind closed doors that Trump has a little bit of
00:52:32.660 that.
00:52:32.880 So almost-
00:52:33.360 Well, not behind closed doors.
00:52:34.640 Well, for him, it's like open doors.
00:52:35.820 He had a whole TV show about it.
00:52:39.700 You're fired.
00:52:40.500 That's right.
00:52:41.000 And by the way, even right now, when he gives a talk, get her out of here.
00:52:45.360 Absolutely.
00:52:45.680 Get her out of here.
00:52:46.640 See all that fake news?
00:52:47.900 I bet they're not going to show the middle finger she put up.
00:52:50.180 Get her out of here.
00:52:51.040 I mean, he just doesn't even, even with the cameras on, he's got no filters.
00:52:55.960 Exactly.
00:52:56.520 Which in a way, you know, you got to appreciate that because you know he's, he can't help
00:53:00.540 himself.
00:53:01.020 He is who?
00:53:01.560 He can't even help himself on Twitter.
00:53:02.960 You can at least say, let me think about this tweet for an hour before I send it out.
00:53:07.220 No, it's going out.
00:53:08.360 Yeah, exactly.
00:53:09.380 So you guys have a lot of different clients.
00:53:11.280 David Carroll, obviously, the guy who, in the documentary The Great Hack,
00:53:16.440 he asked a very good question.
00:53:17.400 He said, where did you guys get our data?
00:53:19.240 How did they process it?
00:53:21.080 Who did they share it with?
00:53:22.840 Do we have a right to opt out?
00:55:24.740 Now, your necklace right there, that's the organization you founded, Own Your Own Data.
00:53:28.560 You're the founder of Own Your Own Data.
00:53:30.420 Is it ownyourowndata.org?
00:53:32.400 Am I saying it right?
00:53:33.160 So, ownyourdata.foundation is our new nonprofit.
00:53:38.240 But I actually started the campaign last April, right, you know, the week after I became a
00:53:43.040 whistleblower, specifically to start raising public awareness, starting to open up people's
00:53:49.240 minds to the fact that your data is important.
00:53:52.780 Yep.
00:53:52.960 Your data has contributed to one of the world's biggest industries and is now the world's
00:53:58.540 most valuable asset.
00:54:00.560 It surpassed oil and gas in 2017 in its value.
00:54:04.260 Yet, the entire time you've been producing data on digital devices, which for some people
00:54:09.560 is their entire lives, you've never had any rights to that data.
00:54:14.100 But, you know, a part of that, this is what I'll come back to you and challenge me on it.
00:54:21.620 I actually want to hear your argument.
00:54:23.020 So, how many times you hear in the music industry, you know, oh my gosh, I signed a contract.
00:54:27.860 I didn't know I gave up my, you know, rights of this forever to you.
00:54:32.220 How many times have you seen it in movies, you know, these stories?
00:54:35.220 Well, it wasn't a contract.
00:54:36.880 I never read it. Well, you should have read the contract, right?
00:54:39.580 Or, you know, yeah, I own this thing myself, 51%.
00:54:42.640 I thought you wanted 30%.
00:54:43.740 No, this is how I set it up, right?
00:54:45.100 So, how much of that are we responsible for to just say, oh, the new site is called what?
00:54:51.940 The Facebook?
00:54:53.040 Okay, sign me up.
00:54:54.880 What information do I need to put in?
00:54:56.640 My date of birth, where I was born, my relationship is complicated.
00:55:00.380 Go ahead and put it.
00:55:01.260 How much of it is on us to have the freedom to choose and make the right decisions?
00:55:05.960 And how much of it is on the creator of the brand of Facebook?
00:55:08.640 Well, I'm so happy that you asked that, actually, because I think it's a bit of a balance, you know?
00:55:14.460 It's important to have informed consumers and informed citizens so that we know how to protect ourselves.
00:55:20.320 It's also really imperative that the entire onus is not on us, that companies and even governments are giving us more transparency and awareness of what they're doing.
00:55:33.860 And they're not putting us in a position where, if we are not well-educated, that we're being taken advantage of and that we can so easily be abused.
00:55:43.280 So, right now, we have two massive problems, which is that tech companies will not make the ethical decision without being forced to.
00:55:50.320 By laws and regulations that we don't yet have.
00:55:53.960 Can you say that one more time?
00:55:55.040 Of course.
00:55:56.300 So, tech companies will not make the ethical decision without being forced to by laws and regulations that we don't yet have.
00:56:04.860 Right?
00:56:05.320 And so, that's why we're in the position where Facebook has so much power and companies like Facebook.
00:56:10.540 And on the other hand, we have a population that is incredibly digitally illiterate.
00:56:17.180 We do not understand what our data rights are, how to protect them.
00:56:20.380 We don't understand basic cybersecurity protocols and how to keep our data private if we wanted it that way.
00:56:25.820 We don't understand media literacy, you know, how to spot disinformation and fake news.
00:56:31.480 You know, kids don't know how to understand cyberbullying and how to stop it.
00:56:34.980 We don't know how to be ethical to each other online, especially when we're anonymous.
00:56:38.920 These are all things that need to start being integrated into the education system because we just have an undereducated population for our overexposure to, you know, our digital life.
00:56:51.060 How much of that are we teaching in high school right now?
00:56:53.140 How much of that are we teaching in junior high school right now?
00:56:55.880 I mean, you were 13, 14 years old, and 13 years old is, what, 8th grade, 7th grade?
00:57:00.560 I think that's what it is, right?
00:57:01.600 So, how much of that is our educational system right now saying, be careful texting this person, messaging this person?
00:57:09.200 If you get a profile like this, let me show you five different examples.
00:57:11.800 Are we actually, I actually don't know if we're doing this or not.
00:57:13.960 Right now, it's not taught in schools, but it's just starting.
00:57:19.300 And that's why I started the Own Your Data Foundation.
00:57:21.380 That's what we actually do.
00:57:22.300 We do digital literacy training for kids in schools.
00:57:25.020 We're starting with middle schools because we think that's really the first age group, like 8 to 12 years old, where your parents have probably given you a phone.
00:57:35.420 Even if your parents haven't given you a phone, you might have a family computer, and you're probably using digital devices in school.
00:57:42.100 So, you have your own accounts, whether it be social media or at least email accounts.
00:57:48.940 You're surfing online for sure in order to do at least research projects.
00:57:53.320 But, you know, a lot of kids have full exposure.
00:57:56.460 They're on their phones all day, every day.
00:57:58.960 There are some kids where if their parents do not stop them, they will actually be on it 24-7.
00:58:03.220 And not having the awareness of all of the different issues that I just listed is really debilitating, and it's really harming the psyche of kids, and it's harming their chances to be successful.
00:58:17.560 I fully agree with you.
00:58:18.560 Fully agree with you.
00:58:19.280 I think there's a lot of things that we don't teach enough of.
00:58:20.920 I think that's one of them that we've got to be more involved talking.
00:58:24.360 There was a movie I watched.
00:58:25.420 I think it was called Connected.
00:58:27.360 Is it Connected, by the guy from Horrible Bosses?
00:58:29.740 Who's the guy?
00:58:30.860 The guy that Jennifer Aniston was, no, no, the other guy, Horrible Bosses.
00:58:35.560 Kevin Spacey was his boss in the movie.
00:58:37.440 What's that guy's name?
00:58:38.320 He was a great comedian.
00:58:39.540 Oh, I know who you're talking about.
00:58:40.820 That was a really funny movie.
00:58:42.400 Phenomenal actor, right?
00:58:43.520 So good.
00:58:44.160 But in this movie, the story is about his kid is being bullied online, and these friends of his in school create a profile, and they say, hey, you share a picture of your privates with me,
00:58:55.240 and I'm going to share a picture of mine, and they took the picture of the girl that they knew he was obsessed with, right?
00:59:03.440 And he sends it to him, and then the next day, those bullies from school take the picture and send it to everybody in school.
00:59:10.320 Right.
00:59:10.460 He goes to school, comes home, and one day the dad is coming home from work, he goes to his room, he's playing loud heavy metal music, and he goes in, he's about to hang himself.
00:59:18.420 Dad grabs him, pulls him down, and asks what happened, and then they find out what the whole story was.
00:59:21.420 That was a perfect example of what kind of bullying is going to be taking place right now if you don't teach your kids.
00:59:25.520 One of the best movies to watch with your kids is that movie because I think it will show your kids what is possible.
00:59:32.160 Anyways, I don't want to digress from it.
00:59:33.480 Let's go back to what we were talking about.
00:59:34.920 Would you consider yourself, I mean, I know politically I've heard you say you were for Bernie.
00:59:40.460 Yes.
00:59:41.600 You were not Hillary camp, you were Bernie, and you're a Democrat.
00:59:46.140 Would you still position yourself as that today, or has it changed a little bit?
00:59:49.460 I would say the way that I see American politics is more from an independent stance, especially because I spent my entire adult life living in the United Kingdom,
01:00:00.060 where actually even the Democrats in the United States look quite conservative.
01:00:05.280 In the United Kingdom and in a lot of other countries in Europe, it's an expectation and taken for granted that everybody has free access to health care,
01:00:19.980 that if you become homeless, you get a government house, that you can have a weekly or monthly stipend from the government that will cover the needs of you and your family if you fall on hard times.
01:00:32.920 So, a lot of the policies in America, even on the Democrat side, I find to be actually shockingly unhumanitarian.
01:00:43.940 So, it's really hard for me.
01:00:45.360 Strong statement you're making.
01:00:46.580 Yeah.
01:00:47.580 Somebody listening to this could say you could be semi-socialist.
01:00:51.780 They can say whatever they want.
01:00:53.240 I care about human rights.
01:00:55.160 Okay.
01:00:55.720 But economically, you're comfortable more leaning towards the socialist side if it comes down to programs to take care of people.
01:01:03.960 Would you say you put yourself in that position a little bit?
01:01:06.020 Absolutely.
01:01:06.500 Okay.
01:01:06.660 Which is why I have always been a Democrat, because I do believe in social programs.
01:01:10.960 I've never voted for a Republican before, although I would consider it, if they had policies that made sense to me.
01:01:20.160 Any Republican that's been attractive to you, anybody, that you say, you know what, that guy could have worked?
01:01:25.240 Marco Rubio.
01:01:26.500 Really?
01:01:27.100 Yeah.
01:01:27.520 Why Marco Rubio?
01:01:29.600 Again, it was that crazy moment the first time that I ever saw him speak in person,
01:01:34.700 where I was just so attracted to his personality and to some of his policies and the way that he talks about unifying people
01:01:45.460 and, you know, including people that have usually been left behind.
01:01:51.180 That makes sense to me.
01:01:53.400 And he's a fantastic speaker.
01:01:55.120 Oh, my gosh.
01:01:56.360 Yeah.
01:01:56.620 I mean, right now I'll take anyone that even knows how to speak in full sentences in the White House.
01:02:00.540 So it's fair to say that the gift we had outside for you of President Donald Trump's poster signed,
01:02:06.580 we had that as a gift for you.
01:02:07.720 We weren't going to give it to you to go home with it.
01:02:09.880 Oh, perfect.
01:02:11.340 Thank you so much.
01:02:12.280 Didn't you get, like, the two MAGA hats?
01:02:13.780 We got two different sides for you, so we have a waiting outside for you.
01:02:16.440 We got MAGA candy, MAGA steak.
01:02:18.300 We got everything for you outside.
01:02:19.480 Oh, good.
01:02:19.720 I've seen some really interesting takes on the MAGA hats recently, which say,
01:02:24.260 Make racism bad again.
01:02:26.200 Make racism bad again.
01:02:28.520 Make racism bad again.
01:02:30.160 Yeah.
01:02:31.440 MRBA.
01:02:32.200 Uh-huh.
01:02:32.760 Unfortunately, I think this president has made people think that it's acceptable.
01:02:39.360 You think so?
01:02:40.160 Yes.
01:02:40.480 You fully believe that?
01:02:41.660 Mm-hmm.
01:02:42.020 100%.
01:02:42.760 Yep.
01:02:43.280 Okay.
01:02:44.000 Would you position yourself as a true believer?
01:02:47.240 In what?
01:02:48.060 In your beliefs.
01:02:49.040 In my beliefs, yes.
01:02:50.460 Absolutely.
01:02:50.880 Like you said earlier, if you don't stand for something, you'll fall for anything.
01:02:56.820 Yeah, I'm back to that.
01:02:58.820 Okay.
01:02:59.020 I had a few years where I was steered down the wrong path.
01:03:02.640 So the reason why I ask this question is the following reason.
01:03:05.200 Because you know how at first, anything we do, we're naively in love.
01:03:11.900 You know, like you and I are dating.
01:03:13.800 We're 13 years old.
01:03:14.980 And I'm the first guy.
01:03:16.360 You've kissed.
01:03:16.960 You're the first girl I've kissed.
01:03:18.520 I'm like, oh my gosh.
01:03:19.480 I love her for the rest of my life.
01:03:21.260 She's my mommy.
01:03:21.960 I love her so much.
01:03:22.960 And we're inseparable, right?
01:03:24.280 Yeah.
01:03:24.440 Puppy love, right?
01:03:25.600 Yeah.
01:03:25.900 I had that when I was a kid.
01:03:27.240 Yeah, me too.
01:03:28.000 Believe me.
01:03:28.400 I had that in Germany at the refugee camp I was staying at.
01:03:31.140 Wow.
01:03:31.700 And then you move on and you go to a different relationship and a different relationship and a different relationship.
01:03:35.820 And the older you get, the tougher it becomes to go experience what you once experienced with that puppy love, right?
01:03:43.680 So for you, you're in a family, your grandpa's an MI person, 27 years military, you know, then from there you go and you're inspired to work on the Barack Obama campaign, senator, you have breakfast with him, with him and Rahm Emanuel.
01:04:01.800 And you're like, oh my gosh, I can't believe this person exists.
01:04:04.900 And then you see him win.
01:04:06.240 You go back to school.
01:04:07.060 You talk to your professors.
01:04:08.620 And then from there you come out and you start working on a couple different campaigns.
01:04:12.180 You go to China, you're kind of seeing the human rights that kind of moves you.
01:04:15.620 You come over here, you write your Ph.D. thesis on it.
01:04:18.240 Then you get involved with Cambridge Analytica.
01:04:20.800 Then you're getting involved.
01:04:22.160 And then you're seeing what happened with this campaign.
01:04:24.280 President Trump gets elected.
01:04:25.440 He becomes president.
01:04:26.800 Then you become a whistleblower.
01:04:28.480 Then you step away.
01:04:30.200 And then, but you also said sometime where, you know, even you have to make money.
01:04:34.160 So sometimes to make money, you've got to take some jobs that you don't know what happens.
01:04:37.900 Are you at a point where you still have that naive, innocent love of wanting to correct an injustice?
01:04:47.840 Or have you gone from that to skeptic to a little bit cynic?
01:04:52.080 Have you gone to that part where politics got you to be a cynic a little bit right now or not yet?
01:04:56.820 Well, I think cynicism has its place.
01:04:59.700 It's always good to have a dose of, instead of cynicism, I would say skepticism, to make sure that you are actually questioning what people are telling you.
01:05:10.000 I think I spent too many years believing people at face value that what they were telling me was true.
01:05:15.920 That's why I'm asking that, yeah.
01:05:16.820 And that they actually had an intention to do something good for the world when they didn't.
01:05:21.140 And so now I'm a lot more skeptical of what I'm told.
01:05:26.000 I do more due diligence, definitely, than I did before, before I think about working with people or thinking that what they say to the public is actually what they believe behind closed doors.
01:05:39.360 And, you know, that's why I think right now in the presidential fields, when I think about, you know, who represents my true beliefs, some people have some good things to say.
01:05:52.760 But, you know, I haven't thrown my support behind anybody specifically because there's nobody that really speaks to everything that I'm talking about.
01:06:02.020 I mean, the only candidates that we have that even have technology policy are Elizabeth Warren and Andrew Yang.
01:06:08.460 Andrew Yang's fantastic, but, you know, he would make a really great, you know, CTO of America.
01:06:15.320 But I think we really have been shown over the past few years that we need someone with a lot of foreign policy experience, someone that can go out and do diplomacy.
01:06:26.140 Maybe Elizabeth Warren is one of those people, maybe she's not.
01:06:28.900 But the rest of the political field hasn't even thought about data or privacy policy or technology regulation.
01:06:39.840 And they don't even talk about it.
01:06:41.920 And, you know, obviously that's my number one issue at the moment because I believe it underpins so many of our other problems that we have in society that it needs to be taken care of and it's not being addressed.
01:06:55.880 I wonder who is going to do it, though.
01:06:57.700 I wonder who is going to actually be talking about it, because when you look at it, I haven't heard Joe Biden talk about it.
01:07:03.100 I haven't heard Sanders talk about it.
01:07:04.920 Yang will talk about it and people will resonate with them.
01:07:08.720 Elizabeth Warren is part of her message.
01:07:10.340 Tulsi Gabbard, I haven't heard.
01:07:12.940 So do you think, what do you think are the chances of anybody being able to beat Trump right now the way it's going?
01:07:18.380 No. I would say right now, if it continues to go as it is right now, no one can beat him.
01:07:26.700 No one can beat him?
01:07:28.200 Not if it continues to go as it's going right now.
01:07:30.980 What do you mean by that?
01:07:32.100 I think the Democratic side is spending too much time tearing each other apart as opposed to actually building a unified message that can get people to care about politics again and actually get people out to the polls.
01:07:51.420 I think right now the DNC is terribly disorganized, and that's unfortunate.
01:07:57.080 Now, there's another topic of what's going on right now, which is impeachment.
01:08:02.900 So where is that going to go?
01:08:05.220 Is he going to be impeached by the Senate?
01:08:07.140 We don't know.
01:08:07.760 The articles have not even been sent over to the Senate.
01:08:09.980 I have a strong feeling after watching the Republican members of the House of Representatives making their testimony that it is very unlikely that the Senate will proceed.
01:08:19.340 Two-thirds?
01:08:20.300 Two-thirds?
01:08:21.180 Yeah.
01:08:21.920 I mean, so who voted present?
01:08:24.560 Tulsi Gabbard voted present.
01:08:25.940 Two other people voted present, right?
01:08:27.520 And 100% Republicans voted against it.
01:08:30.840 And in the Senate, you already heard what he said that he's going to do.
01:08:35.640 It's going to die day one, right?
01:08:37.400 But do you think – here's a curious question for you since you've worked in a marketing world and messaging is critical.
01:08:45.000 Do you think sometimes a lot of these candidates are in the shadow of Nancy Pelosi and the impeachment campaign that they have, that they're driving, where it's taking –
01:08:57.440 like last night, nobody knew the debate was taking place last night.
01:09:00.080 I was like, oh, shit, we've got a Democratic debate, right?
01:09:02.040 Do you think –
01:09:02.500 I also forgot.
01:09:03.800 That's what I'm saying.
01:09:04.160 And I actually watched that.
01:09:05.060 So do you think in a strategic way it's actually hurting the camp because, like, it's almost like a father that cannot help take the shadow away so his son can –
01:09:18.960 you know, it's like the DNC cannot take the attention away to say, listen, let Biden, let Bernie, let Warren, let these guys get there.
01:09:25.300 Because the media should be talking about them 24-7, not be talking about impeachment, knowing you're not going to win two-thirds on Senate.
01:09:32.580 So what are your thoughts on that?
01:09:34.580 You think it's kind of hurting a little bit of the Democratic candidates?
01:09:37.040 Well, I think technically it could be, but we can't think of it that way.
01:09:40.580 Because when laws are broken and when our Constitution is violated, people need to be held to account.
01:09:47.860 And I'm sorry, but I actually believe that when this president is no longer immune, when he's no longer in this seat, that he will be indicted for many different crimes, actually.
01:09:57.440 You think so?
01:09:58.060 Absolutely.
01:09:58.780 How certain are you of that?
01:09:59.680 I'm pretty sure there's sealed indictments waiting for him.
01:10:02.040 You think so?
01:10:02.720 Yes.
01:10:03.000 So let me ask you, so I mean, the same can be said, because for me, when we're going through this, here's how I process it.
01:10:08.760 And again, I prefer you challenge me nonstop the entire time, okay?
01:10:14.000 So this is how I process it.
01:10:16.380 So when we're going through Cambridge Analytica and I'm looking at, okay, it's very obvious, you know, the different kind of persuadables, the deterrence, you know, the possible pro-Trump, the absolute anti-Hillary, great, great strategy.
01:10:30.080 I get it.
01:10:30.600 But somebody could say, well, Brittany, I mean, let's not be naive.
01:10:34.920 This has been going on for a long time.
01:10:36.500 It just happens to be that today's tool is this.
01:10:38.440 Somebody could say, you know, there used to be time we used it by bullying people, like literally bullying people and preventing some communities from being able to vote and knowing who to target and putting the fear into some communities to not even going to vote by bringing some power people, like in the 1800s and the 1900s.
01:10:57.760 Hey, making sure people were fed to vote, you know, just throw some food at them.
01:11:02.080 They needed some poor areas to win their votes over.
01:11:04.860 Well, then, you know, it could be we used the mob a little bit because a mob helps with Kennedy to help them with election.
01:11:10.240 And, you know, the mobs involved with that election that took place.
01:11:12.800 And then, well, you know, what helped with some of these other guys is radio, whoever was better on radio.
01:11:20.020 Oh, it's not fair.
01:11:20.860 Nixon wasn't good on TV, but Nixon was better on radio.
01:11:23.420 But the reason why JFK won is because JFK is better on TV and Nixon wasn't good on TV.
01:11:27.180 He was sweating.
01:11:27.860 He hadn't shaved.
01:11:28.540 He had a five o'clock shadow, you know.
01:11:29.880 So, hey, but that's not fair because JFK is better looking.
01:11:32.880 You don't need to vote for a president because he's better looking.
01:11:35.800 I mean, because Nixon's not as handsome and doesn't speak as well.
01:11:38.820 So, or somebody could say, well, if you look at media today, 99% of media except for one TV station is on the liberal side.
01:11:47.520 Somebody could say mainstream media is all liberal.
01:11:49.340 You can say MSNBC, NBC, CBS, ABC, you know, CNN.
01:11:56.140 You can go Time Magazine, Fortune Magazine, Money Magazine, New York Times, LA Times.
01:12:01.760 I mean, you could say New York Post right, Breitbart right, Drudge Report right, Washington Times right, but Washington Post left.
01:12:09.960 Somebody could say, well, this is great.
01:12:12.120 I don't know why you guys are so upset because liberals have been doing this for a very, very long time with the media.
01:12:17.140 And the only opposition you have is Fox.
01:12:19.940 So, maybe, just maybe, again, I want to hear your argument on this.
01:12:24.100 Just maybe, you are getting the taste of your own medicine.
01:12:27.680 Why are you upset now?
01:12:28.500 Because Trump won.
01:12:30.120 Because if Hillary would have won, would you have come out and said that she used some tactics?
01:12:35.060 Or maybe Trump was involved?
01:12:36.260 Or, you know, Russia was involved?
01:12:37.680 So, again, I want to hear your response to that.
01:12:39.780 So, I'm just asking, at minimum, that we obey the laws that we've already agreed to uphold.
01:12:49.340 Voter suppression is illegal.
01:12:51.920 Discrimination using racism and sexism in order to gain power, incitement of violence, all of those things are definitely illegal.
01:13:00.520 Yet, somehow, the Trump campaign and the Trump super PAC were allowed to use these tactics in order to get him into the White House.
01:13:06.640 And ever since he's been there, he's used some of the same tactics to stay there.
01:13:11.080 That's a big problem for me.
01:13:13.300 And, unfortunately, Facebook refuses to enforce these laws on its platform either, even though it is the world's largest communications platform.
01:13:22.300 I'm not talking about censorship versus free speech.
01:13:25.660 And I hate that people always bring it to that because my free speech is not unfettered.
01:13:31.520 My free speech ends when your human rights begin.
01:13:35.560 So, I am not allowed to discriminate against you.
01:13:38.200 I am not allowed to incite violence upon you.
01:13:40.580 I'm not allowed to suppress your vote.
01:13:42.680 But yet, somehow, I'm allowed to do that on Facebook or politicians are allowed to.
01:13:46.360 I'm not allowed to because I'm a common person.
01:13:48.480 And this is the problem that I'm talking about.
01:13:51.880 No, I'm with you on that part.
01:13:53.360 I mean, are you kidding me?
01:13:54.020 Absolutely not.
01:13:54.600 But what are your thoughts on mainstream media?
01:13:56.940 Because mainstream media, I mean, you could say, Brittany, you're a smart cookie.
01:14:01.620 It's not like you're lightweight.
01:14:02.820 You're brilliant yourself.
01:14:04.380 I mean, mainstream media was 100% Team Hillary.
01:14:08.220 Mainstream media, unfortunately, was 100% Team Trump because they covered him disproportionately.
01:14:14.320 And name recognition is everything at the polling booth.
01:14:16.900 But that's a different story.
01:14:17.880 That's called being dumb.
01:14:19.380 That's called being dumb.
01:14:20.360 That's not called being Team Trump.
01:14:22.300 That's called the more you talk about him, the more attention you give him versus the more you talk about.
01:14:27.900 So, what I mean by this is the following.
01:14:29.860 So, let's just say if you and I break up, okay?
01:14:32.440 And I'm going out there and say, but you don't understand.
01:14:34.840 She's also this and she's also, she was this, she was that, she was this.
01:14:38.040 And, you know, you're kind of like, but look, here's what I wanted.
01:14:41.860 And that's not what he wanted.
01:14:42.880 I wanted to have kids.
01:14:43.760 He didn't want to have kids.
01:14:44.540 And so, you're talking about why we had a different.
01:14:47.860 I'm talking about how bad of a person you were.
01:14:50.880 Guess who's winning?
01:14:53.040 You're winning.
01:14:54.100 I know.
01:14:54.500 I'm mainstream media.
01:14:55.900 Right.
01:14:57.160 You kind of are like, listen, we need to build a wall.
01:14:59.480 I hate the wall.
01:15:00.380 That's what I want to do.
01:15:02.040 But if you keep on saying wall, wall, wall, wall, wall, people think about the wall.
01:15:05.800 Exactly.
01:15:06.480 No one thought about the wall before he brought up the wall.
01:15:08.440 No, not at all.
01:15:09.100 I'm not sitting here telling you.
01:15:10.520 Like, what I'm saying to you is, I honestly think the DNC needs to hire a legitimate marketing firm to help with the languaging.
01:15:21.700 And MSM, mainstream media, needs to collectively come together and change their messaging, or else, the way they're going right now, it's going to be bad for a long time.
01:15:31.180 Because the more you give me an anti how bad of a person I am, you're constantly building me up.
01:15:35.900 I don't think that's an effective strategy.
01:15:37.540 No, I totally agree. And something happened that, I don't know if you'll find it funny or horrible, but after Donald Trump won the election and we as Cambridge Analytica Commercial were going out to pitch advertising campaigns,
01:15:53.220 when we went to go pitch big media companies, they would say, you know, we'd get a meeting with CNN, for example, and we'd go in and we'd be like,
01:16:05.320 oh, well, we thought it was going to be really hard to get this meeting because, you know, you don't like our biggest client.
01:16:11.400 And they'd say, oh, no, we made so much money off of covering Donald Trump.
01:16:17.500 You're forgiven.
01:16:18.520 Don't worry about it.
01:16:19.200 Get out of here.
01:16:20.080 Yeah.
01:16:20.380 So what does that tell you about their loyalty?
01:16:23.800 Is it a loyalty to money or is it a loyalty to their beliefs?
01:16:28.200 Indirectly, my interpretation of what you just said is they're not true believers.
01:16:32.060 No, but again, the entire point of a news agency is that they're supposed to do their research and present unbiased facts.
01:16:40.060 Come on.
01:16:40.600 Come on.
01:16:41.800 That's the point.
01:16:42.860 You think.
01:16:43.960 So they're supposed to keep themselves in business.
01:16:46.120 You think Sean Hannity is going to be unbiased?
01:16:48.320 You think Rachel Maddow is going to be unbiased?
01:16:51.200 Never.
01:16:51.680 You think a Cooper or a Watters' World, you think these guys are going to come on now.
01:16:58.080 But their entire business model is to stay in business.
01:17:00.680 I totally get it.
01:17:01.500 And continue to present what they see as news.
01:17:04.040 I get it.
01:17:05.680 It's going to be interesting who the next candidate is going to be.
01:17:07.560 And when I say next candidate, let's just say the next four years, you know, he's reelected.
01:17:11.200 Because if it goes like this, the strategy they're using, it's going to be disastrous if it goes this way.
01:17:18.020 And did you see the other day one of the congressmen said, don't buy a house in D.C., rent,
01:17:24.120 because many of you are going to lose your jobs with the way you're going right now because your community is not going to vote for you.
01:17:28.880 It's, again, United States of America, politicians, like if you and I sit down here, for instance, you and I,
01:17:37.480 we go have a drink together, we go for a cup of coffee, we go have dinner together, we're going to have differences.
01:17:42.900 But we're probably going to walk away saying, hey, really enjoy the conversation.
01:17:45.780 Of course.
01:17:45.920 This was phenomenal.
01:17:46.620 Totally.
01:17:46.860 Yeah, totally.
01:17:48.000 Like, you know, he came up and said, well, you know, and I'm like, listen, in my company, I have 50% Democrats.
01:17:54.060 We've got 12,000 agents, 50% Democrats, 50% Republicans.
01:17:57.060 And obviously when I say 50% Republicans, we've got the 10%, 15% in the middle that's independent, libertarian.
01:18:02.380 And obviously you've got a community that could care less about politics.
01:18:05.240 Yeah, totally.
01:18:05.660 Don't even bring up anything with politics.
01:18:07.940 I don't want to talk about it.
01:18:10.000 But you and I can have that civil conversation.
01:18:13.400 These guys in D.C. are having a hard time having that civil conversation.
01:18:16.140 And that is so sad, isn't it?
01:18:18.340 It's sad.
01:18:18.660 It's very sad.
01:18:19.680 It's extremely sad to see that taking place.
01:18:20.860 As we said when we started our conversation today, you know, unfortunately, who loses when there's arguments in politics?
01:18:30.760 It's the citizens.
01:18:31.800 No doubt about it.
01:18:32.600 The voters are losing.
01:18:34.220 I'm from the school of thought of believing that in every generation, like, I believe Trump got elected because of Obama.
01:18:45.620 I think Obama gave birth to Trump.
01:18:47.600 But I think Bush gave birth to Obama.
01:18:49.820 And I think Clinton gave birth to Bush.
01:18:51.680 Yeah, it's a pendulum swing.
01:18:52.800 And I think Carter gave birth to Reagan.
01:18:54.860 And I think, you know, I mean, I can go forward.
01:18:57.320 You know, we can go back and forth if you notice what happens.
01:18:59.740 We get sick of something saying, you know, I don't know.
01:19:01.880 I want this.
01:19:02.620 No, no, no.
01:19:03.060 I want it back again.
01:19:03.940 You know what?
01:19:04.280 It was better when it was Democrat.
01:19:05.400 I don't know.
01:19:05.800 Maybe we need a Republican.
01:19:06.740 Like, we keep going back and forth.
01:19:07.900 But I think the one thing that I'd love to see happen, which I haven't yet seen in America during my time of being here.
01:19:15.580 I was a Clinton fan.
01:19:17.000 Bill, I was a Clinton fan.
01:19:18.260 Obviously, forget about what he did with Monica.
01:19:20.200 But as somebody who would sit down and do his stuff, I was a fan of his.
01:19:24.500 It'd be very interesting if all of a sudden we get a synergist that's actually able to bring people together.
01:19:32.100 I know.
01:19:32.380 I don't know if it's going to happen or not.
01:19:34.080 Maybe I'm a little bit too optimistic that it's possible.
01:19:37.560 But, you know, I've seen these things happen.
01:19:39.980 You know, you see families that are divided and somebody comes in and brings the whole family together.
01:19:43.940 It's a beautiful thing.
01:19:45.040 I know.
01:19:45.460 Very, very interesting to have somebody come in here.
01:19:47.620 Let me ask you this question.
01:19:48.840 We've got like a few minutes left.
01:19:49.920 What is life like right now for you as a whistleblower, career-wise, your personal life, your comfort, your level of comfort of feeling safe, your fears?
01:20:05.260 What is life of a whistleblower today?
01:20:08.340 Well, it's definitely the scariest thing I've ever done.
01:20:11.280 I'm not going to pretend anything other than that.
01:20:13.960 But it's been something where I've been so lucky that what I said resonated with people.
01:20:23.420 People actually care now.
01:20:25.900 They care about their use of technology.
01:20:28.000 They care about being abused and taken advantage of by big tech.
01:20:31.100 They care about owning their data and actually having control over the value that they produce every day and over their private information if they want to keep it private.
01:20:40.820 But it should be all of our choice.
01:20:42.820 And so I've been lucky.
01:20:44.300 You know, I was given a platform.
01:20:46.960 I had the incredible opportunity to work with The Great Hack team.
01:20:52.820 And we've now been shortlisted for an Oscar this week, which is just so incredible.
01:20:56.740 And I was so lucky to work with HarperCollins as well to write this book, Targeted, and to get the word out there.
01:21:04.600 And now I have millions of supporters around the world who ask me every day, what can I do?
01:21:10.240 How do I protect myself?
01:21:11.580 How do I support the Own Your Data campaign?
01:21:14.140 And these are people that are calling their legislators.
01:21:16.980 These are people that are going out and getting active.
01:21:18.980 Some of these are people that are working in big advertising companies that are now working on data protection policy and working on data ownership mechanisms for their consumers where that concept didn't even exist in their companies before.
01:21:34.200 I mean, it's really a revolution.
01:21:36.580 I've been, again, lucky and honored to be a part of it because a lot of other whistleblowers don't get that.
01:21:44.320 Earlier, I think it was before this interview, we were chatting about National Whistleblower Day, which just started this year in Congress.
01:21:52.600 Yeah, July 30th, right?
01:21:53.900 July 30th.
01:21:54.640 Yeah.
01:21:55.220 And, you know, it was many days of different sessions in order to help whistleblowers.
01:22:00.520 And some of these sessions were, you know, is your story a book or how to tell your story to the media?
01:22:07.000 Was it therapeutic?
01:22:07.840 And, you know, at first it was really disheartening for me because I saw some people in this room who had a really important story to tell.
01:22:18.100 They had managed to find evidence of corruption in government agencies or within important companies.
01:22:24.440 And they didn't know how to talk to the media or they had been trying for years and no one wanted to tell their story.
01:22:32.080 And I'm sitting there with five million press, you know, press pieces about me and my story and a book deal and a film.
01:22:40.240 And I'm just like, wow, it's so amazing that some of these people persist and keep on going, even though people are not listening.
01:22:49.940 And the fact that I was so lucky that people wanted to listen just blows my mind.
01:22:55.480 And I want to address, you know, the safety question that you had, which is that, you know, no, I don't feel completely safe.
01:23:03.520 It's not like I get threats every day.
01:23:05.840 But there are definitely a lot of powerful people that would prefer if I stopped doing interviews like this every day and would prefer if I stopped pushing data privacy legislation in Congress.
01:23:18.820 But I'm not going to stop.
01:23:21.040 The threats are not interesting to me.
01:23:22.940 So I think that it's just important to recognize that becoming a whistleblower is not easy and it's something that should be encouraged in order to force transparency and to weed out corruption.
01:23:34.640 You know, one day, hopefully, whistleblowers are protected enough that it's a lot easier for us to stop corruption before it becomes a really big problem.
01:23:43.260 Would you say Julian Assange is a friend or somebody you admire?
01:23:47.960 Well, he was someone that I admired for a very long time.
01:23:52.460 I think whatever role that he had in the hacking of the DNC or not, that's not something that I support, obviously.
01:24:02.440 But the work that he did in the beginning and what he stood for, for full transparency and for holding power to account is something that I will always support.
01:24:16.120 His dropping of the Iraq war files to show the crimes against humanity that were committed or war crimes that were committed had such an effect on what I did for the rest of my life and the way that I viewed the world and the way that I view my own government and the way that I question things.
01:24:34.940 Yes. So I think, you know, he's someone that is in a very sad situation right now, and it's specifically because whistleblowing laws are not strong enough and they need to be.
01:24:47.800 How was it when you met with him? Because I know you and him had a meeting together. How was that experience?
01:24:51.700 It was so sad to see someone who has been basically in solitary confinement for seven years. He was nearly see-through, and I hardly got to have a conversation with him.
01:25:04.160 I mean, you could tell he was obviously psychologically affected by being in there because it was almost like he was just talking at me for the whole, like, 20 minutes that I was there.
01:25:12.520 You know, everything that was inside his head because he doesn't really get that much human contact. So everything he was thinking about, he just rattled off, and, you know, it's really sad to see that, especially when I worked in human rights, working with prisoners of conscience and working with political prisoners was something that I specialized in, and, you know, at the time I really did kind of see him as a political prisoner.
01:25:39.240 I mean, his life is done. I mean, what are you going to do? Married, kids, public life, going out, seeing things, movies, just the day-to-day. I want to go to a restaurant having dinner. He can never do that for the rest of his life.
01:25:53.940 Australia's fighting to get him back, but Australia versus America doesn't usually end up with Australia on top.
01:25:59.900 No, it's not one that you're going to win too often.
01:26:06.100 Right.
01:26:06.940 What are some of the biggest threats we have today?
01:26:08.880 Now, obviously, you know, data is one. And by the way, you know what's so weird is I'm a CEO of a financial firm. I go to a lot of these conferences with these big 50, 100, 200 billion auto insurance companies.
01:26:18.080 And in the last 18 months, the most common conversation that's been coming up is cybersecurity.
01:26:23.400 I mean, we've never seen this much of it before, and it continues to come up right now.
01:26:28.180 Based on what you see and what you know and being on the inside on many of these things, what do you see as the biggest threat we are facing today?
01:26:35.420 Is it data, cybersecurity? Is it China? Is it Russia? Is it internal? Is it companies the size of Facebook, Amazon?
01:26:46.640 You know, what do you see as the biggest threat to the average person?
01:26:50.960 I would say the biggest threat to the average person is the fact that there is a complete and utter lack of data protection.
01:26:58.920 And data protection has a lot of different parts to it, right?
01:27:04.100 That is everything from, yes, cybersecurity fending off attacks that could come from anywhere in the world from bad actors.
01:27:13.020 That is also securing the value that us as individuals produce, the tracking and traceability of where data goes, who has it, where it's held, what it's being used for,
01:27:26.400 and actually having any sort of opt-in or permission structures for my personal information to be used by certain people for certain purposes,
01:27:35.160 and the right to actually monetize my own value for myself as opposed to being exploited.
01:27:41.100 Data and the way that it is used means that anybody in the world can buy my time, my attention, and my privacy.
01:27:52.200 It goes to the highest bidder.
01:27:54.440 And right now, because of our lack of legislation and regulation, because of our lack of technology that can actually manage and track and trace data in a reliable way,
01:28:05.200 it means that our democracy is up to the highest bidder.
01:28:08.440 And that's what scares me the most, which is why I do what I do every day.
01:28:15.660 What is my data worth?
01:28:16.920 What is the average person's data worth?
01:28:19.120 Your data can be worth whatever you want it to be, depending on who you want to share it with.
01:28:23.580 I really hate that Mark Zuckerberg tells people it's worth, oh, $17 a quarter or something like that.
01:28:31.540 You know what?
01:28:32.400 I just had a friend that did a study with, you know, a set of diabetes researchers and pharma companies
01:28:38.940 and said, for somebody that qualifies for a diabetes study, how much do you pay for that medical data?
01:28:47.860 Because, you know, most medical data comes from young 18 to 35-year-old white men in college who go for extra beer money to medical trials, right?
01:28:59.700 And so to find someone that qualifies for a diabetes study, it costs them $28,000 for six to eight weeks of data
01:29:08.220 in order to find the people, in order to get people to become part of a study, in order to complete the study.
01:29:16.480 For a couple months of medical data, which is probably just blood tests, urine samples, they go in a couple times, right?
01:29:23.340 So if we are to build a future where we own our data and where we are actually able to profit from our own human value,
01:29:34.400 we need to have systems where we can share our data anonymously and securely,
01:29:40.120 where we have laws and regulations that allow us to own it as our property,
01:29:44.720 and that, okay, I'm fine if I produce data with a pharma company or I produce data with Facebook,
01:29:49.600 they can have part ownership and I can have part ownership.
01:29:52.220 I wouldn't have produced that data without them.
01:29:54.060 That's fine.
01:29:55.100 Fractionalized ownership is cool.
01:29:56.780 But we really need to recognize that we have rights to the information that we produce
01:30:02.240 in order to flip the switch and change these business models from being exploitative to empowering.
01:30:09.200 I think it's going to come soon, though.
01:30:10.840 I actually think it's going to come soon.
01:30:12.160 I don't think it's something that's, you know, every time some idea comes in,
01:30:15.740 there's a little bit of abuse, then there's a lot of abuse,
01:30:18.440 then there's incredible wealth made, and then there's some regulations.
01:30:21.540 And then you kind of work it out with the regulators and the entrepreneurs and they make it work.
01:30:26.720 I actually think, based on what you're saying, it's reasonable.
01:30:28.940 A business like Facebook may say, okay, I'll pay this much for your data,
01:30:31.900 I'll pay this much for your data, let's partner up on it.
01:30:34.480 Very interesting.
01:30:35.660 What's the cost of being a president today?
01:30:38.080 Here's what I mean by it.
01:30:39.620 Is there a dollar amount, because I hear the numbers saying it costs $2 billion to be a president today, right?
01:30:44.180 It costs a billion and a half to be a president today.
01:30:48.160 Can anybody get $2 billion behind them and be a president?
01:30:52.240 Or do you need a little bit of a pitch man, somebody that's going to be able to be doing the work?
01:30:56.800 What would you say is the number for someone to become a president?
01:30:59.420 Well, Donald Trump won the election with $600 million.
01:31:03.480 That's not a lot.
01:31:04.360 And Hillary spent $1.3 billion.
01:31:06.740 What does that tell you?
01:31:07.560 It tells you that the tools are incredibly important.
01:31:13.080 He definitely had a more sophisticated data strategy, that's for sure.
01:31:18.280 But I think if we had laws and regulations that prevented fake news, disinformation, voter suppression, racism, incitement of violence,
01:31:28.880 then he wouldn't have won, because that's how he used to get where he is today.
01:31:32.540 So someone who is a better pitch man won and saved himself $700 million,
01:31:39.980 or someone with a better marketing strategy won and saved himself $700 million?
01:31:46.340 Someone that was willing to break any laws in order to win.
01:31:50.300 You think that?
01:31:51.540 Won the campaign.
01:31:52.840 You firmly believe that?
01:31:53.900 Absolutely.
01:31:55.160 And you're saying you don't believe Hillary's camp at all broke any laws?
01:31:59.080 I don't know. I didn't work there.
01:32:00.460 Got it. Got it. So you're talking about from your POV of where you were at.
01:32:04.300 So to date, you think the campaign this year, it's going to be that much money spent as well?
01:32:08.880 Like that kind of money? We're going to go a little higher.
01:32:11.340 Yeah, thanks very much to David Bossie, head of Citizens United,
01:32:15.880 who now has, again, put democracy on an auction block,
01:32:22.280 and whoever wants to pay the most money is going to be the most powerful mouthpiece.
01:32:26.280 That's the biggest disaster that we've ever had in politics in America.
01:32:31.840 The rest of the world thinks it's insane that we allow super PACs to exist
01:32:37.140 where you can funnel hundreds of millions of dollars through them.
01:32:41.160 It's completely insane.
01:32:42.620 You can obfuscate where that money comes from.
01:32:44.880 You can put complex structures so you don't know who the donors actually are.
01:32:48.660 I mean, it's disgusting.
01:32:49.840 It is disgusting.
01:32:50.560 You think it's going to go away, though?
01:32:52.600 Well, the first ever tabling of the reversal of it has started to be discussed this year,
01:33:00.540 again, by Adam Schiff.
01:33:02.480 Thank God for him.
01:33:03.700 I think it's such an important conversation,
01:33:06.500 especially because David Bossie was also the head of the Defeat Crooked Hillary campaign,
01:33:11.720 which was the Make America No. 1 super PAC that was run by Cambridge Analytica
01:33:15.920 and spent all of their money targeting neurotic people with fear-based messaging and disinformation
01:33:21.780 to get them to not vote for Hillary Clinton.
01:33:25.180 What's Alex doing today?
01:33:26.900 Is he working today?
01:33:28.400 Absolutely.
01:33:29.200 He is.
01:33:29.800 I know from very many sources that he is still very active as a political consultant.
01:33:35.520 Apparently, after the recent conservative victory in the United Kingdom,
01:33:43.640 he was seen with the heads of the Leave EU Brexit campaign chugging champagne in Mayfair.
01:33:52.860 Still active.
01:33:54.460 He might have had something to do with it.
01:33:58.320 Not he might have.
01:33:59.400 I mean, at this point, anybody who knows who he is, he did.
01:34:02.720 Somebody asked me a question.
01:34:04.380 I posted a question.
01:34:05.220 Somebody said, tell us about your involvement with Phunware.
01:34:09.440 What's your, is it Phunware, Funware?
01:34:11.180 Am I saying it properly?
01:34:12.420 P-H-U-N, yeah.
01:34:13.820 P-H-U-N.
01:34:14.420 I stepped down from negotiations.
01:34:17.220 Okay, so you're not a part of it at all right now.
01:34:18.560 To join their board, yeah.
01:34:20.580 I was really looking forward to working with a data company
01:34:23.780 that actually wanted to give their data back to consumers.
01:34:28.540 They had designed a strategy, which I was helping them with,
01:34:32.720 where all of the data that they own on individuals would be put into a wallet,
01:34:38.180 so, an app on your phone.
01:34:39.480 You would log in, and you would be able to see all the data that they had on you,
01:34:45.200 how much money they had earned off of it,
01:34:47.140 and then it would be given back to you.
01:34:49.100 So you now own that.
01:34:50.180 You can decide.
01:34:50.660 That's what they were doing.
01:34:51.820 That was the plan.
01:34:53.540 And then I found out that they were working with the Trump campaign
01:34:56.340 from investigative journalists, not from the executives themselves.
01:35:01.260 So I ceased all negotiations, and I'm no longer involved with them.
01:35:05.840 Very quickly.
01:35:07.260 Oh, yeah.
01:35:07.780 I've never received a dollar, a stock, a token from those individuals,
01:35:11.980 and I will not pursue that in the future.
01:35:14.980 That's good to know.
01:35:15.460 That's very good to know, because I think it was a tweet by David Carroll
01:35:20.880 that said, you know, look at Brittany Kaiser.
01:35:23.440 She's back at it again, involved with these guys and helping out with Trump.
01:35:28.540 What are the chances of Zuckerberg or Dorsey?
01:35:31.040 If Zuckerberg or Dorsey ran,
01:35:33.320 how much of an advantage did they have for being a president?
01:35:36.880 Huh.
01:35:37.500 Well, that's an interesting one.
01:35:41.140 Because 2.2 billion followers, essentially.
01:35:43.420 Mark Zuckerberg was, from everything that I heard,
01:35:50.800 had designs on running for president
01:35:53.440 before the Cambridge Analytica, Facebook data crisis.
01:35:59.580 Now I don't think he would decide to pursue that
01:36:03.920 given the current atmosphere.
01:36:05.980 But considering he has the world's largest communications platform,
01:36:10.300 I think nearly anything that he wanted to do,
01:36:12.860 he could accomplish, if he were to abuse his own tools.
01:36:16.340 If he were to abuse his own tools.
01:36:18.880 Yes.
01:36:19.340 Got it.
01:36:19.720 If he were to abuse his own tools.
01:36:21.020 I mean, he lets any politician anywhere in the world abuse his tools, so...
01:36:25.460 Why wouldn't he let himself?
01:36:26.420 Is that kind of what you're thinking?
01:36:26.920 Why wouldn't he let himself?
01:36:28.000 I think at this point, the light is so much on.
01:36:30.480 He would have to step down, sell the majority of his shares,
01:36:33.560 et cetera, et cetera, to say, if I want to do it,
01:36:36.620 I don't know if he's going to be running.
01:36:37.840 Although I heard he gave a recommendation to Mayor Pete
01:36:40.800 to one of his best marketers at Facebook to help Mayor Pete,
01:36:44.640 and apparently it's working for him.
01:36:46.120 He was able to get some attention there.
01:36:47.700 Let's do a quick speed round, okay, to wrap this up.
01:36:50.100 I'll give a name and tell me the first word that comes to mind.
01:36:53.200 Okay?
01:36:53.660 This will be fun.
01:36:54.420 This will be fun.
01:36:54.980 First word that comes to mind.
01:36:57.080 Whatever it may be, tell me the first word that comes to mind.
01:36:59.420 Okay, Linda Tripp, do you know who she is?
01:37:02.380 Yeah.
01:37:03.200 She's a whistleblower.
01:37:04.240 I know, actually.
01:37:06.280 I was going to say old school.
01:37:07.480 Yeah, old school.
01:37:09.020 Nostalgia.
01:37:09.600 Yes, nostalgia.
01:37:10.820 Okay, cool.
01:37:11.720 How about Sherron Watkins?
01:37:14.040 Remind me.
01:37:14.860 Enron, whistleblower.
01:37:16.040 Oh, I met her.
01:37:17.100 Okay.
01:37:17.440 I met her for the first time at the congressional thing.
01:37:20.820 Yeah, hero.
01:37:22.020 Hero.
01:37:22.680 Okay, cool.
01:37:23.360 Mark Felt, Nixon scandal.
01:37:25.460 He was the FBI guy that came out.
01:37:26.800 Oh, yes, he was.
01:37:27.740 Sorry.
01:37:27.900 So, I'm studying whistleblowers to be prepared for you.
01:37:29.980 No, that's great.
01:37:30.840 I think of Daniel Ellsberg when I think of Watergate.
01:37:34.020 Absolutely.
01:37:34.660 Here's a good one.
01:37:35.640 Here's a good one here.
01:37:37.800 Jeffrey Wigand.
01:37:39.300 I may not be saying it right.
01:37:41.080 Jeffrey Wigand.
01:37:41.920 What was he involved with?
01:37:42.800 He was a former tobacco company executive that did the 60 Minutes.
01:37:46.840 The movie The Insider with Russell Crowe.
01:37:49.140 Oh, I never saw that, actually.
01:37:50.860 Well, let me put it to you this way.
01:37:51.880 Is this something I need to see?
01:37:53.100 A hundred percent.
01:37:54.280 Okay, great.
01:37:54.540 A hundred percent.
01:37:55.600 It's phenomenal.
01:37:56.200 I will.
01:37:57.040 It's on my list.
01:37:57.720 Julian Assange.
01:37:59.600 Oh, sad.
01:38:01.340 Sad.
01:38:02.020 Rachel Maddow.
01:38:05.460 Angry.
01:38:06.300 Okay, Trump.
01:38:07.840 Criminal.
01:38:09.060 Okay, Bernie.
01:38:12.180 Activist.
01:38:13.920 Biden.
01:38:16.760 Vice.
01:38:18.860 Hillary Clinton.
01:38:20.820 Secretary.
01:38:21.340 David Carroll.
01:38:21.880 Okay.
01:38:22.680 David Carroll.
01:38:25.960 Professor.
01:38:27.760 Robert Mueller.
01:38:29.600 That's hard.
01:38:31.320 So many words come to mind.
01:38:34.080 Integrity.
01:38:35.120 Okay.
01:38:36.460 A lobbyist for these big companies, as a profession, is what I'm asking.
01:38:42.580 So I'm a lobbyist, too.
01:38:45.260 You're a lobbyist, too.
01:38:46.700 But I'm a lobbyist pro bono for things that I believe in.
01:38:51.380 And that's a different story.
01:38:53.320 Lobbying is important.
01:38:54.200 Lobbying is important.
01:38:54.240 But it depends what you're lobbying for.
01:38:57.520 Okay.
01:38:57.840 Lobbying has a place.
01:39:00.140 So...
01:39:00.620 You do believe that it has a place?
01:39:02.100 Yeah.
01:39:02.560 Some lobbyists are very seedy.
01:39:04.220 Some of them are not.
01:39:05.320 Okay.
01:39:07.200 Robert Mercer.
01:39:08.360 Data scientist.
01:39:13.300 Ted Cruz.
01:39:14.840 Senator.
01:39:15.740 Chris Wiley.
01:39:17.120 Whistleblower.
01:39:18.100 Steve Bannon.
01:39:20.300 Populist.
01:39:22.740 Corey Lewandowski.
01:39:25.840 Used car salesman.
01:39:27.600 Used car salesman.
01:39:29.100 Alexander Nix.
01:39:34.300 Fugitive.
01:39:35.300 Fugitive.
01:39:36.500 Bloomberg.
01:39:37.480 Last one.
01:39:38.360 Candidate.
01:39:42.880 Candidate.
01:39:44.100 This guy's got $54 billion.
01:39:46.020 He's got all the money in the world to be able to make it really work.
01:39:49.900 He also has quite a media platform.
01:39:51.740 Massive media platform.
01:39:52.920 Are you kidding me?
01:39:53.460 It's a massive media platform.
01:39:54.880 It's going to be interesting to see if it's going to work.
01:39:58.120 As a case study.
01:39:59.480 I'm just curious to know how far he can go with this working out.
01:40:02.900 First of all, Brittany, thank you for coming out.
01:40:05.440 What I do want to say is, guys, if you haven't yet purchased a book, buy a copy and start reading it.
01:40:10.380 I highly recommend you read this type of content because it's good for you to be in the know, especially today, knowing data is now a multi-trillion-dollar industry.
01:40:18.820 For you to know how to protect yourself, your business, and your family.
01:40:21.220 Brittany, thank you so much for coming out.
01:40:24.180 I had so much fun with this.
01:40:26.080 Absolutely.
01:40:26.580 Thank you so much for having me.
01:40:28.240 It's a pleasure.
01:40:29.160 Appreciate you.
01:40:29.860 Thanks, everybody, for listening.
01:40:31.060 And by the way, if you haven't already subscribed to Valuetainment on iTunes, please do so.
01:40:35.680 Give us a five-star.
01:40:37.100 Write a review if you haven't already.
01:40:38.580 And if you have any questions for me that you may have, you can always find me on Snapchat, Instagram, Facebook, or YouTube.
01:40:44.520 Just search my name, Patrick Bet-David.
01:40:46.280 And I actually do respond back when they snap me or send me a message on Instagram.
01:40:51.280 With that being said, have a great day today.
01:40:53.080 Take care, everybody.
01:40:53.840 Bye-bye.