On this week's episode of the Gains For Girls Podcast, we're joined by former White House Chief Information Officer Theresa Payton to discuss human trafficking and the Take It Down Act, the new law President Trump signed this week.
00:05:18.060I believe what we're facing is a national emergency, which is why we are so glad the Take It Down Act was signed into law.
00:05:27.040But really insightful stuff and what people need to hear.
00:05:31.120We talked about the media largely ignoring it.
00:05:56.920I mentioned this to you prior to recording, but your passion and what you have been able to do, the impact that you have been able to have, the awareness that you have been able to help spread is incredible.
00:06:10.040I think it aligns very well with the fight that I have found myself fighting.
00:06:15.980And today, this week, it's a pretty big week, because we just saw President Trump sign Melania Trump's Take It Down Act into law.
00:06:39.980Before we get into any of that, which I want to talk to you about, you've had a remarkable career yourself from the White House to founding your own cybersecurity firm.
00:06:49.900Can you just tell us what initially drew you to the field of tech and security?
00:07:57.100At the time (he's out of the Navy now), we got stationed in Mayport, Florida, which was not really a technology hotbed back then.
00:08:05.540And so I had a really challenging time getting a job, and I got one at Barnett Bank, which is now part of Bank of America.
00:08:11.840And that's really where my eyes started to open to delivering cutting-edge technology that customers loved.
00:08:19.100Then all of a sudden, these fraudsters would show up, and they'd try to get in between Barnett Bank and our customers and their money.
00:08:25.660And that's when I realized, oh, there's really bad people out there, and there really is the opportunity for technology to be misused.
00:08:31.880So I spent 16 years in the financial services industry really trying to help combat fraudsters and cybercriminals while delivering elegant technical solutions for our customers.
00:08:45.720Then I had the opportunity to serve from 2006 to 2008 under President George W. Bush.
00:08:51.980And, again, I thought I knew a lot coming from banking.
00:08:58.320You know all those things the banks have to combat; you put so many processes in place so that it doesn't happen on your watch.
00:09:08.060And then I'd get to the White House and start getting briefed on the capabilities of nation-states, and realize, no, there's a lot more going on out there that's not discussed.
00:09:17.800So when I left the White House, I felt really called and led; I thought, I have to warn everybody about what's coming next.
00:09:26.440And that really defined for me what I feel like my calling is.
00:09:30.820I'm a faith-based person, so I feel like there's a plan.
00:09:34.820And I really feel like I'm being called and led to help people live their best lives, choosing to use technology where they want to, deciding when not to use technology, but really making sure that they are safe and secure in the process.
00:10:19.820Scamming them, basically enticing them to send over explicit photos.
00:10:24.980Again, these are high schoolers and middle schoolers being enticed to send explicit photos, and then ultimately being told, hey, you have to give us money or else we're going to release this photo everywhere.
00:10:35.140And again, of course, this ended horrifically for the families, for the friends, for the community.
00:10:40.040So now I think the harms are being exposed.
00:10:43.980And as I mentioned, we've seen this Take It Down Act now be signed into law.
00:10:49.840First of all, this was bipartisan, which, as you and I can both acknowledge, is incredible for anything in the political climate that we have, certainly with regard to elected representation at the federal level, or any level, really, for that matter.
00:11:04.880But you wrote about the need for urgency.
00:11:08.640So again, can you discuss the legislation, the importance behind it, what it does, and why specifically this is something that needs to be handled with swift and decisive action, as Congress and President Trump have now done?
00:11:21.520Yeah, and so my understanding around what got signed today, and you're right, it does have bipartisan support.
00:11:28.880And, you know, first of all, thank you, Senators Ted Cruz and Amy Klobuchar, for coming out and saying, hands across the aisle, this is not a partisan issue.
00:11:40.440So you may recall, there were actually people saying, this law is going to violate free speech.
00:11:47.040And I just want to tell people: maybe you are not related to or friends with somebody who has been a victim of non-consensual, illicit images being created and distributed about them, or of those terrible tragedies, like the ones you relate happening in your own hometown.
00:12:04.240You know, somebody who was victimized by pornographic images because they were tricked and enticed by fraudsters and, you know, horrible, evil people.
00:12:15.180If you've never met any of these victims, then you don't know what you're talking about.
00:12:19.180And this is not stifling freedom of speech.
00:12:22.720This is giving necessary remedies to victims, victims who, in many cases, do take their own life.
00:12:31.000And if we can do something about this, if we can create a bill that enforces swift takedown of these non-consensual images being created, being stolen, being displayed, if we can save one life, then this bill needs to happen.
00:12:52.060So here's what this bill, as I understand it, is going to create once it's enacted.
00:12:57.800Now that the bill is signed, it doesn't mean these reliefs come for victims tomorrow.
00:13:03.360There's still a time period where these organizations will have time to get ready, to come into compliance.
00:13:12.780But by the way, Riley, here's the thing.
00:13:14.680Many of the social media companies were actually looking forward to having this regulation.
00:13:19.000There's something to knowing what the yardstick is that they're going to be measured by, and standing up to it.
00:13:24.420So it's probably going to be another 30, maybe as long as 180 days for this law to be turned into sort of the compliance regulatory framework.
00:13:33.320But when it is, if you report that you see pornographic imagery on the platforms and you are a victim and it's non-consensual, within 48 hours of you filing your takedown request, they must comply or there will be heavy penalties.
00:14:22.440You mentioned they maybe wanted this done prior.
00:14:26.780Is this something they could have handled, or was it more of a situation where they were almost sitting idly by, sitting on their hands, waiting for someone else to act?
00:14:35.440Because I think that's been a common theme, whatever the topic is, especially over these past maybe five or six years, where there has been this level of fear, this level of, you know, you don't want to be canceled.
00:14:49.180You don't want to be censored, whatever the tactic is that works to manipulate you into silence, or into an unwillingness to do what needs to be done, to do the right thing.
00:15:00.520Do you think that's kind of the stance Meta took, or would it have always eventually needed federal action to be taken?
00:15:06.840We needed federal action, and the reason is actually the victims, because we help a lot of these victims.
00:15:15.580It's a maze of regulations and it's a maze of procedures to get these things taken down and it just feels like you're playing whack-a-mole.
00:15:28.580So having a federal umbrella basically says to every state, hey, you know, your state's attorney general and your state can pass whatever laws they want to pass, but at a minimum, this is the federal law.
00:15:41.560And then that now gives victims the ability to seek remedies within the federal court systems if they so desire.
00:15:50.140It gives law enforcement authorities the opportunity to seek criminal prosecutions within the federal court system, but it also gives the social media and big tech companies an out, if you will.
00:16:03.460So if they were to be sued for impinging on someone's freedom of speech and expression, they can now point to a law and say, no, we're actually in compliance with the regulatory framework.
00:16:16.360I think, you know, in fairness to Meta, because Meta can and should do more and really every social media company can and should do more.
00:16:26.020Let's not make the federal law the basement of what you get done.
00:16:29.700Let's make it, hey, this is where we start, but we're going to do a lot more than this.
00:16:34.740Candidly, I would love to see social media companies say, we treat this so seriously, we see it as a competitive advantage.
00:16:42.620You want to be on our platform because our platform is more responsive to victims.
00:16:47.120That, to me, I think could be a real competitive advantage.
00:16:49.780But it remains to be seen whether or not big tech and social media companies feel that they can be bold and be brave on that particular matter.
00:16:57.140But to be fair to Meta, one of the statistics I saw said that in just the first half of last year, so just six months, they removed about 1.2 million pieces of non-consensual illicit imagery.
00:17:13.720So you think about all the people involved: even if you automate all of that, there's still somebody looking at it, making sure they're not accidentally removing an image that isn't illicit imagery.
00:17:37.120When you phrase it like that, when you have the stats and the numbers to indicate that millions and millions of these photos or images or videos or clips or whatever it is are out there,
00:17:49.840it certainly is a full-time job and it definitely is a, what I would deem a national emergency, a crisis really, that this is even happening.
00:17:59.200It's a discussion that we have to have.
00:18:00.780This episode is brought to you by Defender.
00:18:06.840With its 626-horsepower twin-turbo V8 engine, the Defender OCTA is taking on the Dakar Rally, the ultimate off-road challenge.
00:18:18.960I mentioned AI to you, and I'll speak for myself, but probably for most Americans too: it's pretty over my head.
00:18:30.900Given its ability, what it can do and how it does it, I can type into ChatGPT on my phone, you know, give me a meal plan for this week that has this many grams of protein and this many grams of carbs, and it spits it out within seconds.
00:18:44.580Same thing with workouts, whatever the topic is, it's amazing.
00:18:48.300But as someone who's involved, of course, in cybersecurity, what do you see for its future in terms of both, I think, good and bad?
00:19:01.500There's this issue, and then there are some other issues as well.
00:19:04.480But, you know, let's stay on sort of the deepfake technology, and then we'll get into the chat conversation you had to build your meal plan.
00:19:13.380So that's the positive side of the technology, right?
00:19:19.780And, you know, I always tell people, check its homework, because it's like you're dealing with an eight-year-old when you get the response back.
00:19:26.140It's like, here's the first couple of things I found.
00:19:28.480And it's like, okay, well, why don't you check your work?
00:19:30.660So you definitely always check the work, even on what it suggests to you has protein, because it does have the ability to get things wrong.
00:19:37.020Many engineers build it to be helpful and to give you an answer as quickly as possible, which means it will sometimes get it wrong in its desire to respond to you as quickly as possible.
00:19:47.320That's why I say it's like not to insult eight-year-olds, but it's kind of like an eight-year-old.
00:19:50.460Like you give them an assignment and the first encyclopedia they come to, that's going to be your answer, and it might be wrong.
00:19:56.280So they call that hallucination.
00:19:58.740But think about the ability to create deepfakes, which actually can have some really cool uses.
00:20:04.680So, for example, you could use it to update training videos.
00:20:07.220So you could say, I already have a video where I've taught people end-user education awareness for cybersecurity, or I taught people how to code in Python, or I taught people how to develop a meal plan, you know, a day in the life of Riley: here's how I eat, here's how I work out, and here's how I do the things that I do.
00:20:25.280And you may say, you know what, I just need to change three things.
00:20:27.640And rather than going into like copy editing or hiring somebody or something, it's like really simple.
00:20:33.900So you could use deepfake technology to update your training videos.
00:20:37.160Like that's a cool use of the technology.
00:20:39.740The problem with deepfake technology is what researchers are estimating right now.
00:20:44.700One of the reports comes from DeepTrace Labs.
00:20:47.800There are a couple of other reports that are very similar.
00:20:49.940They say that out of all the deepfakes being generated today, about 98% are estimated to be pornographic, and 99% of those target women.
00:21:02.740So this is a real present danger for my daughter, for me, for you, Riley, for all the women in our lives that we care about.
00:21:11.020But we need to think more broadly about this because it's a real and present danger, not just to the victims, but children are getting exposed to this imagery.
00:22:09.080Like the social media stuff, how to protect your kids, what parents need to be doing, how they need to be involved, really incredible stuff.
00:22:17.000But one of the other conversations that John and I were having was about really the lack of media coverage around human trafficking.
00:22:25.780It was maybe two or three weeks ago at this point when RFK told the world, in this cabinet meeting, that Biden's HHS was a collaborator in child trafficking, for sex and for slavery, which resulted, again, in hundreds of thousands of still-missing children.
00:22:45.560And somehow it just has been largely ignored by virtually every single outlet.
00:22:51.480So I wanted to ask you, like, do you think the media coverage of what we're seeing, again, it's bipartisan support in Congress, but do you think the media coverage, certainly on both sides, has been fair and an accurate depiction of how big of an issue this really is?
00:23:09.080It has not been, and I can only share my experience on this.
00:23:13.920So we do a lot of work with the National Center for Missing and Exploited Children, which, you know, deals with runaway situations, kidnapping situations, but there's also an element of trafficking that comes with that.
00:23:28.740And, you know, as a parent of three kids, I'm very concerned about trafficking.
00:23:33.640And I also look at, you know, as people move to this country, whether they came here legally or didn't come through, you know, sort of the right process and came here illegally.
00:23:43.460It ends up putting people, the weakest among us, into a situation where they can be trafficked.
00:23:49.800And I will tell you, you know, so I was on the reality TV show, Hunted, which was on CBS.
00:23:56.940And so coming out of that, I met a lot of really great people who do documentaries and who do reality TV.
00:24:03.020And I've pitched several times over the years.
00:24:06.760We should do a story and help people understand what human trafficking looks like.
00:24:15.740We should see if we can solve some cases.
00:24:18.680And let's take the best and the brightest in our country and do a collaborative effort with groups like the U.S. Marshals, like the group in DHS that works on this.
00:24:29.740And let's do something to, one, highlight the work of the brave men and women who are doing this work today to end human trafficking and child trafficking.
00:24:37.640But, two, bring about awareness because we really have to have a community watch project around this.
00:24:44.360And the only way you can have a community watch project around this is you have to bring awareness.
00:24:48.680And every time I talk to different producers about this, they're like, oh, it's such a dark topic.
00:24:59.160And, you know, I keep trying to get the word out on this, trying to basically convince somebody who does documentary work or reality TV work.
00:25:09.260Like, there's something here we could do.
00:25:38.060Well, I think of people like you, of course, and I think of people like Tim Tebow, who was just featured in an episode with Shawn Ryan, where he showed, I think it was, 111,000 IP addresses that had downloaded child pornography of kids under the age of 12 in just the past 30 days.
00:26:52.580And it's been great that Tim Tebow has put sort of his time and his celebrity and his foundation behind this to really uncover this.
00:27:01.460And really, candidly, it's in order to have the public be more aware, to spot the challenges, to know, hey, I have an issue here.
00:27:12.400So one of the things that I learned in Protecting God's Children, which is a program that, at least in our diocese, all Catholic school parents have to go through if you even want to go on a field trip with your kid.
00:27:22.880And one of the things we learned in the Protecting God's Children classes is that people who traffic in children, especially, don't actually believe they're doing something wrong.
00:27:36.480And even though they've been arrested, they'll even say in these videos, which are very hard to watch as a parent, like, I've been told it's wrong, but I would most likely do it again.
00:27:58.720I was trained on how to talk to my children so that they would be comfortable telling me, even if somebody told them not to tell me.
00:28:05.740But not every parent goes through that training.
00:28:08.000And this is where, if we can find a way to tell this story and highlight, number one, the brave men and women in this country, both in the nonprofit organizations and in our law enforcement organizations, who are literally banging on doors and rescuing people from human trafficking, rescuing children from this, busting these horrible, evil people who are now creating the imagery.
00:28:32.520We always had bad people around us, but the misuse of technology allows them to automate their evil ways at a speed and scale none of us really want to think about.
00:28:44.260And so if we can shine a light on that and educate the public on what this looks like, you know, maybe, just maybe, somebody who's getting ready to do an evil, heinous thing because they think there's nothing wrong with it,
00:28:57.380maybe they would see some of this and say, gosh, there is something wrong with what I'm doing.
00:29:42.040Well, you know, obviously we love helping companies and individuals keep criminals out of their lives.
00:29:48.500And since we're on the topic of deepfake technology: we sadly had the passing of the Pope, but now we have the joy and the future of our new Pope.
00:30:02.340One of the things that we are seeing is that many faithful around the world, and they don't have to be Catholic,
00:30:09.320they can be of all faiths and persuasions, believing in a God and wanting to connect with different people, even, like, a Tim Tebow, for example.
00:30:20.280And we're starting to see fake social media accounts, imposter accounts.
00:30:26.240But the thing that's different is that they are able to create very convincing deepfake audio and deepfake video, and even create very convincing-looking passports and driver's licenses.
00:30:37.020And they are targeting people online and defrauding them of money.
00:30:42.100So they think they're helping the Pope or a Cardinal or a priest, or they think they're helping, you know, Mark Wahlberg, you name any sort of notable faith-based person who's out there.
00:30:53.520They probably have a charity tied to them.
00:30:55.980And so they think they're part of a fan club and they think they're actually engaging with either the Christian or the faith-based celebrity themselves or the organization.
00:31:06.040And they're sending their hard-earned money to criminal syndicates.
00:31:11.600And so, you know, if you could get the word out to your followers that, you know, there are ways to validate whether you are dealing with the legitimate person, but understand: you are a target.
00:31:40.460And so a lot of times I just do enough investigative work and I turn it over to the social media platform, and I turn it over to law enforcement.
00:31:48.280And I'm like, here's another one for you.
00:32:05.640That's if you think it's really flagrant, that they're violating a law and potentially defrauding people of money; human trafficking is another angle, the enticement being to get people to come meet them.
00:32:15.440So we've had a couple of victims actually call our firm and say they were told to buy plane tickets to fly somewhere to meet a faith-based person.
00:32:25.220And we're like, no, you cannot get on that plane.
00:32:31.180So if you could just kind of warn everybody, you know: just like we teach children about stranger danger, and no, you can't get in a car to help somebody find a lost puppy.
00:32:42.320And no, you can't take candy from people you don't know.
00:32:44.960And no, you can't get in a car to show somebody directions to get to their mother who had a heart attack.