Gaines for Girls with Riley Gaines - May 21, 2025


Protecting Privacy: The Power Behind the Take It Down Act


Episode Stats

Length

36 minutes

Words per Minute

166.17639

Word Count

5,988

Sentence Count

392

Misogynist Sentences

5

Hate Speech Sentences

2


Summary

On this week's episode of the Gaines for Girls podcast, we're joined by former White House Chief Information Officer Teresa Payton to discuss human trafficking and the Take It Down Act, the new law President Trump signed this week.


Transcript

00:00:00.000 Hey, New York sports fans, are you ready to level up your game?
00:00:03.460 Introducing Sports Plus from the New York Post,
00:00:05.880 your all-access pass to the very best in Big Apple sports coverage.
00:00:09.640 From insider columns to unfiltered opinions,
00:00:11.980 the Post's legendary voices, Joel Sherman, Jon Heyman,
00:00:15.380 Mike Vaccaro, Mollie Walker, and more,
00:00:17.400 bring you the stories you can't get anywhere else.
00:00:19.940 Daily newsletters? Check.
00:00:21.420 Team-by-team updates? You got it.
00:00:23.380 Text with Post writers? Oh yeah, you're that close to the action.
00:00:27.480 Why now?
00:00:28.060 Oh, just a little something called an electric baseball summer
00:00:31.340 with wall-to-wall Yankees and Mets coverage,
00:00:34.000 a monumental WNBA season following the defending champs,
00:00:37.840 the New York Liberty, like never before,
00:00:39.860 big moves brewing for the Rangers,
00:00:41.540 major shakeups coming for St. John's basketball.
00:00:44.120 It's all happening and you don't want to miss a minute.
00:00:47.180 And the best part? Your first 30 days are free.
00:00:50.260 That's right, free.
00:00:51.940 Head to newyorkpost.com slash get sports plus and sign up now.
00:00:55.260 Sports Plus from the New York Post, real New York sports.
00:01:05.720 Hello, everybody.
00:01:07.020 Welcome back to the Gaines for Girls podcast.
00:01:09.560 I hope you watched last week's episode.
00:01:11.480 We had John Rich on, and we talked about
00:01:14.160 several different topics,
00:01:15.900 but one of the topics that we discussed was human trafficking.
00:01:18.920 One thing specifically that still has me just totally mind blown
00:01:24.620 is how RFK Jr., again, the Secretary of Health and Human Services,
00:01:29.700 how he admitted publicly to the world in this cabinet meeting
00:01:34.080 that Biden's HHS was a collaborator in child trafficking,
00:01:39.160 sex trafficking, and slavery that ultimately resulted in hundreds of thousands
00:01:44.240 of missing children, still missing children, and it's largely been ignored.
00:01:49.600 So I wanted to follow up that conversation today with more cultural news.
00:01:53.580 One of those important acts being done just this past week,
00:01:57.680 where President Trump signed into law the Take It Down Act.
00:02:01.840 We've got a clip here.
00:02:02.700 Let's watch it.
00:02:04.280 Today, it's my honor to officially sign the Take It Down Act into law.
00:02:08.660 It's a big thing, very important.
00:02:10.340 It's so horrible what takes place.
00:02:13.120 This will be the first ever federal law to combat the distribution of explicit
00:02:18.320 imagery posted without subjects' consent.
00:02:22.640 They take horrible pictures, and I guess sometimes even make up the pictures
00:02:27.380 that they post without consent or anything else.
00:02:30.300 And very importantly, this includes forgeries generated by artificial intelligence
00:02:35.820 known as deepfakes.
00:02:37.420 We've all heard about deepfakes.
00:02:38.540 With the rise of AI image generation, countless women have been harassed
00:02:42.860 with deepfakes and other explicit images distributed against their will.
00:02:47.740 This is wrong, and it's just so horribly wrong.
00:02:51.640 And it's a very abusive situation, like in some cases people have never seen before.
00:02:57.780 And today we're making it totally illegal.
00:03:02.660 Incredible work that was done.
00:03:04.460 This was really championed by First Lady Melania.
00:03:06.580 Incredibly, she was there.
00:03:08.540 Of course, she got to sign this as well, which was pretty awesome to see her honored in this
00:03:12.380 way for her efforts on this piece of legislation specifically.
00:03:17.060 But before we get into any of that and our amazing guest that we have today, who is an expert
00:03:22.220 in the field, I want to tell you about our sponsor for today's episode.
00:03:27.220 It is Cozy Earth.
00:03:28.920 They have just been incredible.
00:03:30.720 I have their products, I tell you every time.
00:03:33.680 I have their pajamas, I have their sheets on our bed.
00:03:36.080 I love them.
00:03:36.840 My husband loves them.
00:03:38.240 My parents just moved.
00:03:39.580 They have their sheets.
00:03:40.680 They love them.
00:03:41.700 It has really helped create a sanctuary, if you will, within our own home, which is needed.
00:03:48.240 I travel so much.
00:03:49.300 So when I'm home, I want to enjoy sleeping in my bed.
00:03:52.760 And I get to do that with my sheets and my pajamas, my loungewear from Cozy Earth.
00:03:59.060 So I want you to go to CozyEarth.com.
00:04:01.560 You can use my code GAINES, G-A-I-N-E-S, for 40% off.
00:04:06.120 Again, best-selling sheets, pajamas, and more.
00:04:08.960 There's a 100-day return if you don't like it.
00:04:12.580 But trust me, you will.
00:04:13.760 You will not regret your purchase.
00:04:16.000 Luxury should not be out of reach for anyone.
00:04:18.900 You deserve it.
00:04:19.480 So check it out at CozyEarth.com.
00:04:21.260 Make sure you're following our show, Gaines for Girls.
00:04:24.320 You can do that at YouTube.
00:04:26.040 Go to YouTube.com slash Outkick.
00:04:28.460 That's where you can like and subscribe.
00:04:30.240 You never want to miss a moment of this show.
00:04:32.820 Your support means the world to me.
00:04:35.260 Today's guest.
00:04:36.200 We're talking to Teresa Payton.
00:04:38.880 She was the first female White House chief information officer.
00:04:42.380 So she's been a trailblazer in her field.
00:04:45.540 I mean, really across the board for women.
00:04:47.520 This was under President Bush, from 2006 to 2008.
00:04:50.500 She is an AI strategist.
00:04:53.020 She is a business and personal security expert.
00:04:57.620 What she has been able to expose specifically in the world of deepfakes as it pertains to human trafficking and beyond is incredible.
00:05:09.460 And I use that word loosely, because it's horrific,
00:05:12.180 the things that we will discuss in today's episode.
00:05:15.380 I mean, similar to President Trump.
00:05:18.060 I believe it's a national emergency what we're facing, which is why we are so glad that they were able to sign the Take It Down Act into law.
00:05:27.040 But really insightful stuff and what people need to hear.
00:05:31.120 We talked about the media largely ignoring it.
00:05:33.500 It's hard to hear.
00:05:34.280 It's a sensitive topic.
00:05:35.380 It's emotional.
00:05:36.100 It's heartbreaking to hear stories of victims, of families, these fraudsters who are monsters.
00:05:42.980 But we need to hear it.
00:05:44.180 Parents need to tell their kids about it.
00:05:46.020 They need to have the conversation.
00:05:47.920 So check out my conversation here with Teresa Payton.
00:05:50.380 Well, Teresa, thank you for joining the Gaines for Girls podcast.
00:05:55.160 I am just thrilled to have you on.
00:05:56.920 I mentioned this to you prior to recording, but your passion and what you have been able to do, the impact that you have been able to have, the awareness that you have been able to help spread is incredible.
00:06:10.040 I think it aligns very well with the fight that I have found myself fighting.
00:06:14.380 So just very grateful for you.
00:06:15.980 And today, this week, it's a pretty big week, because we just saw President Trump sign Melania Trump's Take It Down Act into law.
00:06:25.160 It was pretty awesome in the clip.
00:06:26.740 He even had Melania sign it too.
00:06:29.240 And of course, this bill requires big tech platforms to delete revenge porn within 48 hours and requires jail time for perpetrators.
00:06:38.520 So really incredible stuff.
00:06:39.980 Before we get into any of that, which I want to talk to you about, you've had a remarkable career yourself from the White House to founding your own cybersecurity firm.
00:06:49.900 Can you just tell us what initially drew you to the field of tech and security?
00:06:55.540 Sure.
00:06:55.740 Absolutely.
00:06:56.760 Coming out of graduate school at UVA, so I went to undergrad at Immaculata University.
00:07:00.980 And at the time, it was actually an all-women's college except for like the night school.
00:07:08.340 But the traditional college was an all-women's college at the time, played sports in college, played field hockey, and went to UVA.
00:07:15.620 And when I left UVA with my master's in hand, a Master of Science in the Management of Information Technology,
00:07:21.340 my focus was that I wanted to change the world with cutting-edge technology.
00:07:27.080 We weren't even calling it cybersecurity yet.
00:07:29.340 I wasn't even really thinking about criminals yet.
00:07:31.420 I just knew that we had great technology, and I just wanted to make people's lives better with technology.
00:07:38.320 And obviously, that was very naive on my part.
00:07:41.100 I wasn't even thinking about the misuse of technology and how dangerous things could get in the future.
00:07:46.220 I ended up – my first job was in the financial services industry, and that was, again, just sort of a – I sort of say it's a God thing.
00:07:54.960 My husband was in the Navy.
00:07:57.100 He's out of the Navy now, but at the time we got stationed in Mayport, Florida, which was not really a technology hotbed.
00:08:05.540 And so I had a really challenging time finding a job, and got a job at Barnett Bank, which is now part of Bank of America.
00:08:11.840 And that's really where my eyes started to get opened up with delivering cutting-edge technology that customers loved.
00:08:19.100 Then all of a sudden, these fraudsters would show up, and they'd try to get in between Barnett Bank and our customers and their money.
00:08:25.660 And that's when I realized, oh, there's really bad people out there, and there really is the opportunity for technology to be misused.
00:08:31.880 So I spent 16 years in the financial services industry really trying to help combat fraudsters and cybercriminals while delivering technically elegant solutions for our customers.
00:08:45.720 Then I had the opportunity to serve 2006 to 2008 for President George W. Bush.
00:08:51.980 And, again, I thought I knew a lot coming from banking.
00:08:54.940 I'd seen terrorist financing.
00:08:56.520 I'd seen money laundering.
00:08:58.320 You know, all those things that the banks have to combat, we have to put so many processes in place to not let that happen under your watch.
00:09:06.360 And I thought I had seen it all.
00:09:08.060 And then I'd get to the White House and start getting briefed on the capabilities of nation states and realize, no, there's a lot more going on out there that's not discussed.
00:09:17.800 So I felt really called and led, and when I left the White House, I thought, I have to warn everybody about what's coming next.
00:09:26.440 And that really defined for me what I feel like my calling is.
00:09:30.820 I'm a faith-based person, so I feel like there's a plan.
00:09:34.820 And I really feel like I'm being called and led to help people live their best lives, choosing to use technology where they want to, deciding when not to use technology, but really making sure that they are safe and secure in the process.
00:09:49.820 Yeah, incredible.
00:10:19.820 Scamming them, basically enticing them to send over explicit photos.
00:10:24.980 Again, these are high schoolers, middle schoolers, sending over explicit photos, and the scammers then ultimately saying, hey, you have to give us money or else we're going to release this photo everywhere.
00:10:35.140 And again, of course, this ended horrifically for the families, for the friends, for the community.
00:10:40.040 So now I think the harms are being exposed.
00:10:43.980 And as I mentioned, we've seen this Take It Down Act now be signed into law.
00:10:47.940 And you recently wrote about this.
00:10:49.840 First of all, this was bipartisan, which, as you and I can both acknowledge, is incredible; it's rare to have anything that is bipartisan in the political climate that we have, certainly in regard to elected representation at the federal level, any level, really, for that matter.
00:11:04.880 But you wrote about the need for urgency.
00:11:08.640 So again, can you discuss the legislation, the importance behind it, what it does, and why specifically this is something that needs to be handled with the swift and decisive action that Congress and President Trump have now taken?
00:11:21.520 Yeah, and so my understanding around what got signed today, and you're right, it does have bipartisan support.
00:11:28.880 And, you know, first of all, thank you, Senators Ted Cruz and Amy Klobuchar, for coming out and saying, hands across the aisle, this is not a partisan issue.
00:11:38.680 This is also not a free speech issue.
00:11:40.440 So you may recall, there were actually people saying, this law is going to violate free speech.
00:11:47.040 And I just want to tell people, if you are not related to or friends with somebody who has been a victim of non-consensual, illicit images being created and distributed about them, or of those terrible tragedies like the ones you related that happened in your own hometown.
00:12:04.240 You know, somebody who is victimized by pornographic images because they were tricked, enticed, you know, by fraudsters and, you know, horrible, evil people.
00:12:15.180 If you've never met any of these victims, then you don't know what you're talking about.
00:12:19.180 And this is not stifling freedom of speech.
00:12:22.720 This is giving necessary remedies to victims, victims who, in many cases, do take their own life.
00:12:31.000 And if we can do something about this, if we can create a bill that enforces swift takedown of these non-consensual images being created, being stolen, being displayed, if we can save one life, then this bill needs to happen.
00:12:52.060 So here's what this bill, as I understand it, is going to create once it's enacted.
00:12:57.800 Now that the bill is signed, it doesn't mean these reliefs come for victims tomorrow.
00:13:03.360 There's still a time period where these organizations will have time to get ready for this, to be in compliance.
00:13:12.780 But by the way, Riley, here's the thing.
00:13:14.680 Many of the social media companies, they were looking forward to having this regulation.
00:13:19.000 They wanted to know what the yardstick is that they're going to be measured by, and to stand up to that.
00:13:24.420 So it's probably going to be another 30, maybe as long as 180 days for this law to be turned into sort of the compliance regulatory framework.
00:13:33.320 But when it is, if you report that you see pornographic imagery on the platforms and you are a victim and it's non-consensual, within 48 hours of you filing your takedown request, they must comply or there will be heavy penalties.
00:13:52.440 This is huge.
00:14:22.440 You mentioned they maybe wanted this done prior.
00:14:26.780 Is this something they could have handled or was it more of a situation where they were almost sitting idly by, sitting on their hands, waiting for someone else?
00:14:35.440 Because I think that's been a common theme, whatever the topic is, especially over these past maybe five, six years, where there has been this level of fear, this level of, you know, you don't want to be canceled.
00:14:49.180 You don't want to be censored, whatever, you know, that tactic is that works to manipulate you into silence or a lack of willingness to do what needs to be done, to do the right thing.
00:15:00.520 Do you think that's kind of the stance meta took or would it have eventually always needed federal action to be taken?
00:15:06.840 We need federal action, and the reason is actually the victims, because we actually help a lot of these victims.
00:15:15.580 It's a maze of regulations and it's a maze of procedures to get these things taken down and it just feels like you're playing whack-a-mole.
00:15:28.580 So having a federal umbrella to basically say to every state, hey, you know, your state's attorney general and your state can pass whatever laws it wants to pass, but at a minimum, this is the federal law.
00:15:41.560 And then that now gives victims the ability to seek remedies within the federal court systems if they so desire.
00:15:50.140 It gives law enforcement authorities the opportunity to seek criminal prosecutions within the federal court system, but it also gives the social media and big tech companies an out, if you will.
00:16:03.460 So if they were to be sued by someone saying, you impinged upon my freedom of speech and expression, they now can point to a law and say, no, we're actually in compliance with the regulatory framework.
00:16:16.360 I think, you know, in fairness to Meta, because Meta can and should do more and really every social media company can and should do more.
00:16:26.020 Let's not make the federal law the basement of what you get done.
00:16:29.700 Let's make it, hey, this is where we start, but we're going to do a lot more than this.
00:16:34.740 Candidly, I would love to see social media companies say, we treat this so seriously, we see it as a competitive advantage.
00:16:42.620 You want to be on our platform because our platform is more responsive to victims.
00:16:47.120 That, to me, I think could be a real competitive advantage.
00:16:49.780 But it remains to be seen whether or not big tech and social media companies feel that they can be bold and be brave on that particular matter.
00:16:57.140 But to be fair to Meta, one of the statistics that I saw said that they removed, in just the first half of last year, so just six months, about 1.2 million pieces of non-consensual illicit imagery.
00:17:12.840 That's a lot.
00:17:13.720 So you think about all the people that takes; even if you automate all of that, there's still somebody looking at it and making sure they're not accidentally removing an image that's not illicit imagery.
00:17:26.280 That's a lot of work.
00:17:27.640 And so to be fair to them, can they do more?
00:17:29.680 Yes.
00:17:29.940 Should they do more?
00:17:30.780 Absolutely.
00:17:31.780 But they did have to remove a lot.
00:17:33.660 This is a real huge problem.
00:17:36.800 Yeah.
00:17:37.120 When you phrase it like that, when you have the stats and the numbers to indicate that millions and millions of these photos or images or videos or clips or whatever it is are out there,
00:17:49.840 it certainly is a full-time job, and it definitely is what I would deem a national emergency, a crisis really, that this is even happening.
00:17:59.200 It's a discussion that we have to have.
00:18:00.780 This episode is brought to you by Defender.
00:18:06.840 With its 626-horsepower twin-turbo V8 engine, the Defender OCTA is taking on the Dakar Rally, the ultimate off-road challenge.
00:18:16.640 Learn more at LandRover.ca.
00:18:18.960 I mentioned AI to you, and I'll speak for myself, but I'm probably speaking for most Americans: it's pretty over my head.
00:18:30.900 Given its ability, what it can do, how it does what it does, I can type into ChatGPT on my phone, you know, give me a meal plan for this week that has this many grams of protein, this many grams of carbs, and it spits it out like within seconds.
00:18:44.580 Same thing with workouts, whatever the topic is, it's amazing.
00:18:48.300 But as someone who's involved, of course, in cybersecurity, what do you see for its future in terms of both, I think, good and bad?
00:18:57.160 Yeah, we have some real challenges.
00:19:01.500 There are challenges on this issue, and then there are some other issues as well.
00:19:04.480 But, you know, let's stay on sort of the deepfake technology first, and then we'll get into sort of the ChatGPT conversation you had to build your meal plan.
00:19:13.380 So that's the positive side of the technology, right?
00:19:15.640 It's like, hey, I'm on the go.
00:19:17.120 I want to be healthy.
00:19:18.160 Can you give me a meal plan?
00:19:19.780 And, you know, I always tell people, check its homework, because it's like you're dealing with an eight-year-old when you get the response back.
00:19:26.140 It's like, here's the first couple of things I found.
00:19:27.840 What do you think?
00:19:28.480 And it's like, okay, well, why don't you check your work?
00:19:30.660 So you definitely always check the work, even on what it suggests to you has protein, because it does have the ability to get things wrong.
00:19:37.020 Many engineers build it so that it's helpful to you and gives you an answer as quickly as possible, which means it will sometimes get it wrong in its desire to respond to you quickly.
00:19:47.320 That's why I say, not to insult eight-year-olds, but it's kind of like an eight-year-old.
00:19:50.460 Like you give them an assignment and the first encyclopedia they come to, that's going to be your answer, and it might be wrong.
00:19:56.280 So they call that hallucination in these models.
00:19:58.740 But then think about the ability to create deepfakes, which actually can have some really cool uses.
00:20:04.680 So, for example, you could use it to update training videos.
00:20:07.220 So you could say, I already have a video where I've taught people end-user education and awareness for cybersecurity, or I taught people how to code in Python, or I taught people how to develop a meal plan, you know, a day in the life of Riley: here's how I eat, and here's how I work out, and here's how I do the things that I do.
00:20:25.280 And you may say, you know what, I just need to change three things.
00:20:27.640 And rather than going into like copy editing or hiring somebody or something, it's like really simple.
00:20:33.900 So you could use deepfake technology to update your training videos.
00:20:37.160 Like that's a cool use of the technology.
00:20:39.740 The problem with deepfake technology is what researchers are estimating right now.
00:20:44.700 One of the reports comes from DeepTrace Labs.
00:20:47.800 There's a couple of other reports that are very similar.
00:20:49.940 It says that out of all the deepfakes being generated today, about 98% of them are estimated to be pornographic, and 99% of those target women.
00:21:02.740 So this is a real present danger for my daughter, for me, for you, Riley, for all the women in our lives that we care about.
00:21:11.020 But we need to think more broadly about this because it's a real and present danger, not just to the victims, but children are getting exposed to this imagery.
00:21:23.180 Yeah.
00:21:23.880 That is very concerning.
00:21:26.660 And so the victim gets re-victimized over and over again because it propagates, they have no remedies, it's hard to get it taken down.
00:21:33.800 But children are getting exposed to these images.
00:21:36.540 And I think we all can agree that that is not freedom of speech for children to see these images.
00:21:43.380 It's wrong.
00:21:44.420 And we are not doing right by our children if we don't fix this issue.
00:21:49.080 Right.
00:21:49.740 Last week, we had John Rich on, and he's been amazing across the board.
00:21:55.540 He's like one of just my favorite people, but he's very passionate about human trafficking, about protecting the innocence of children.
00:22:01.880 And he had this incredible conversation a few weeks ago with DHS that he has pinned on his social media.
00:22:07.600 And he talks about this, right?
00:22:09.080 Like the social media stuff, how to protect your kids, what parents need to be doing, how they need to be involved, really incredible stuff.
00:22:17.000 But one of the other conversations that John and I were having was about really the lack of media coverage around human trafficking.
00:22:25.780 It was about maybe two or three weeks ago at this point when RFK told the world in this cabinet meeting that Biden's HHS was a collaborator in child trafficking, sex trafficking, and slavery that resulted, again, in hundreds of thousands of still missing children.
00:22:45.560 And somehow it just has been largely ignored by virtually every single outlet.
00:22:51.480 So I wanted to ask you, like, do you think the media coverage of what we're seeing, again, it's bipartisan support in Congress, but do you think the media coverage, certainly on both sides, has been fair and an accurate depiction of how big of an issue this really is?
00:23:09.080 It has not been, and I can only share my experience on this.
00:23:13.920 So we do a lot of work with the National Center for Missing and Exploited Children, which, you know, deals with runaway situations, kidnapping situations, but there's also an element of trafficking that comes with that.
00:23:27.120 And we've also worked on cases.
00:23:28.740 And, you know, as a parent of three kids, I'm very concerned about trafficking.
00:23:33.640 And I also look at sort of, you know, people moving to this country, whether they came here legally or whether they didn't come through, you know, sort of the right process and came here illegally.
00:23:43.460 It ends up putting people, the weakest among us, into a situation where they can be trafficked.
00:23:49.800 And I will tell you, you know, so I was on the reality TV show, Hunted, which was on CBS.
00:23:56.940 And so coming out of that, I met a lot of really great people who do documentaries and who do reality TV.
00:24:03.020 And I've pitched several times over the years.
00:24:06.760 We should do a story and help people understand what human trafficking looks like.
00:24:13.440 We should follow some cases.
00:24:15.740 We should see if we can solve some cases.
00:24:18.680 And let's take the best and the brightest in our country and do a collaborative effort with groups like the U.S. Marshals, like the group in DHS that works on this.
00:24:29.740 And let's do something to, one, highlight the work of the brave men and women who are doing this work today to end human trafficking and child trafficking.
00:24:37.640 But, two, bring about awareness because we really have to have a community watch project around this.
00:24:44.360 And the only way you can have a community watch project around this is you have to bring awareness.
00:24:48.680 And every time I talk to different producers about this, they're like, oh, it's such a dark topic.
00:24:54.940 I just don't think we can sell it.
00:24:57.480 So that's my experience.
00:24:59.160 And, you know, trying to get the word out on this and trying to, you know, basically convince somebody who does documentary work or somebody who does reality TV work.
00:25:09.260 Like, there's something here we could do.
00:25:11.280 And, like, lives matter.
00:25:12.840 And we could really make a difference.
00:25:14.640 And maybe it's just me.
00:25:15.680 I mean, I'm not a professional storyteller.
00:25:18.420 I'm not a, you know, professional TV person.
00:25:20.760 And so maybe I'm just not the right person to bring that message forward.
00:25:24.800 But I do feel like the media has to cover this more.
00:25:28.120 And I do understand.
00:25:29.300 It's a very delicate conversation.
00:25:31.780 It's not something people want to talk about.
00:25:33.980 But we have to get the word out.
00:25:37.960 Yeah.
00:25:38.060 Well, I think of people like you, of course, I think of people like Tim Tebow, who just was featured in an episode with Sean Ryan, where he showed I think it was 111,000 IP addresses that had downloaded child pornography of kids under the age of 12 in just the past 30 days.
00:25:56.980 Again, 111,000 IP addresses.
00:26:00.060 And he had this map and it showed the pretty dark red spots where it was the worst.
00:26:04.840 You know, the South was infiltrated.
00:26:06.380 Washington, D.C., crazily enough, was a very red dot.
00:26:10.060 New York was a big red dot.
00:26:11.580 Several places in California, big red dots.
00:26:14.040 It was horrific.
00:26:15.360 But it opened my eyes so much because, again, most people, because it's not talked about, because it is taboo, it's heartbreaking.
00:26:23.260 It's emotional.
00:26:23.860 People don't want to hear about this all the time.
00:26:26.600 And I understand that.
00:26:27.520 But it opened my eyes so much to the horrors that exist, how evil people, monsters, exist.
00:26:37.760 Really, really sick stuff.
00:26:39.020 So I think that would be incredibly beneficial.
00:26:40.860 And I hope this falls on the right ears.
00:26:43.020 Hopefully, there's someone listening that could help navigate or execute something like this, because it would be really powerful.
00:26:52.080 You're right.
00:26:52.580 And it's been great that Tim Tebow has put sort of his time and his celebrity and his foundation behind this to really uncover this.
00:27:01.460 And really, candidly, in order to have the public be more aware, to spot the challenges, to know when there's an issue.
00:27:12.400 So, at least in our diocese, all Catholic school parents have to go through Protecting God's Children if you even want to go on a field trip with your kid.
00:27:22.880 And one of the things we learned in the Protecting God's Children classes is that people who traffic in children, especially, don't actually believe they're doing something wrong.
00:27:36.480 And even though they've been arrested, they'll even say in these videos, which is very hard to watch as a parent, like, I've been told it's wrong, but I would most likely do it again.
00:27:46.300 And you're just absolutely horrified.
00:27:48.200 You know, these are people who have had jail time.
00:27:49.900 These are people who have been through counseling.
00:27:51.320 And I'm not saying everybody is like that, but at least in the training that I went through, I was trained to spot the signs.
00:27:57.480 I was trained to ask questions.
00:27:58.720 I was trained on how to talk to my children so that they would be comfortable telling me, even if somebody told them not to tell me.
00:28:05.740 But not every parent goes through that training.
00:28:08.000 And this is where, if we can bring a way to storytell about this and highlight, number one, the brave men and women in this country, both in the nonprofit organizations, but in our law enforcement organizations that are literally banging on doors and rescuing people from human trafficking, rescuing children from this, busting these horrible, evil people who are now creating the imagery.
00:28:32.520 We always had bad people around us, but the misuse of technology allows them to automate their evil ways at a speed and scale none of us really want to think about.
00:28:44.260 And so if we can shine a light on that and educate the public on what this looks like, you know, maybe, just maybe, somebody who's getting ready to do an evil, heinous thing because they think there's nothing wrong with it.
00:28:57.380 Maybe they would see some of this and say, gosh, there's something wrong with what I'm doing.
00:29:02.860 I need help.
00:29:04.360 We've got to find a way to get to zero victims.
00:29:08.460 We have to find a way to get to these evil people and either lock them up or rehabilitate them, if that's even possible.
00:29:17.400 Totally, totally agree.
00:29:19.360 As I certainly believe most Americans do.
00:29:23.520 Last thing for you, what's kind of the next thing?
00:29:27.020 You're working on the human trafficking stuff, exposing, again, the sextortion, I believe is what they call it.
00:29:34.220 Some of those, again, awful, awful things.
00:29:38.040 What else is on the docket for you?
00:29:42.040 Well, you know, obviously we love helping companies and individuals keep criminals out of their lives.
00:29:48.500 And one of the things we are seeing since we're on the topic of deepfake technology, especially with us sadly having the passing of the Pope, but now we have sort of the joy and the future of our new Pope.
00:30:02.340 One of the things that we are seeing is that many faithful around the world, they don't have to be Catholic.
00:30:09.320 They can be, you know, kind of all faiths and persuasions, you know, believing in a God and wanting to sort of connect to different people, even like a Tim Tebow, for example.
00:30:20.280 And we're starting to see fake social media accounts, imposter accounts.
00:30:26.240 But the thing that's different is they are able to create very convincing deepfake audio, deepfake video, even very convincing-looking passports and driver's licenses.
00:30:37.020 And they are targeting people online and defrauding them of money.
00:30:42.100 So they think they're helping the Pope or a Cardinal or a priest, or they think they're helping, you know, Mark Wahlberg, you know, you name any sort of notable faith-based person who's out there.
00:30:53.520 They probably have a charity tied to them.
00:30:55.980 And so they think they're part of a fan club and they think they're actually engaging with either the Christian or the faith-based celebrity themselves or the organization.
00:31:06.040 And they're sending their hard-earned money to criminal syndicates.
00:31:11.600 And so, you know, if you could get the word out to your followers of, you know, there's ways to validate that you are dealing with the legitimate person or not the legitimate person, but understand you are a target.
00:31:24.980 I find it amazing.
00:31:26.020 You know, obviously a lot of it's automated because of the accounts I follow.
00:31:30.140 I get these, I get approached all the time by these imposter accounts.
00:31:34.740 And I think, did you even look at my bio?
00:31:37.000 Like, I would not approach me.
00:31:40.460 And so a lot of times I just do enough investigative work and I turn it over to the social media platform and I turn it over to law enforcement.
00:31:48.280 And I'm like, here's another one for you.
00:31:49.440 Here's another one for you.
00:31:50.600 You're just being a good citizen.
00:31:52.440 And that would be the thing I would tell people: if you see these imposter accounts on social media platforms, the platforms cannot keep up on their own.
00:31:59.280 So the best thing you can do is flag it to the social media platform.
00:32:03.940 You can report it at FTC.gov
00:32:05.640 if you think it's really flagrant and they're violating a law and potentially defrauding people of money. Human trafficking is another sort of enticement they use to get people to meet them.
00:32:15.440 So we've had a couple of victims actually call our firm and say they were told to buy plane tickets to fly somewhere to meet a faith-based person.
00:32:25.220 And we're like, no, you cannot get on that plane.
00:32:28.980 That is not how this works.
00:32:31.180 So if you could just kind of warn everybody, you know, just like we teach children about stranger danger, and no, you can't get in a car to help somebody find a lost puppy.
00:32:42.320 And no, you can't take candy from people you don't know.
00:32:44.960 And no, you can't get in a car and show somebody directions to get to their mother who had a heart attack.
00:32:49.200 Like these are all ruses.
00:32:50.720 We have to teach people that, yes, I know you think you're looking at their driver's license, their passport.
00:32:57.200 This is their voice.
00:32:58.620 This is where they were on the set doing something.
00:33:01.120 It's not them.
00:33:01.940 It's deepfake technology.
00:33:03.240 And so that's kind of the new frontier we're dealing with.
00:33:07.420 Sometimes it's faith-based.
00:33:08.640 Sometimes the CFO at a business is getting deepfake audio and video of the CEO saying,
00:33:14.980 you need to wire transfer money to this business for a secret acquisition.
00:33:18.840 And it's happening.
00:33:19.820 So we have to really be extra, you know, diligent and aware and just warn others.
00:33:26.280 That's right.
00:33:27.360 Yeah.
00:33:27.700 I know people who have sent money to who they believe is Shania Twain or Taylor Swift.
00:33:33.920 It's really horrific.
00:33:35.840 And you would imagine people could spot it, but there are several, whether it's elderly people,
00:33:40.800 whether it's young people; no one is immune is what I will say.
00:33:47.300 I appreciate you so much.
00:33:48.900 I appreciate how you've been, number one, a trailblazer for women, and how you have led in
00:33:56.080 the field of cybersecurity and related fields.
00:33:59.340 It's just been really incredible to be able to witness, and it's inspiring to me.
00:34:05.020 So you're a role model and I appreciate you and all of the work you've done a whole, whole bunch.
00:34:09.720 I appreciate having time on the show with you, Riley. You're a great host, and you've built
00:34:14.860 a great platform helping others.
00:34:16.560 And I just appreciate what you're doing out there.
00:34:18.960 Keep fighting the good fight, and have me back
00:34:22.080 if I can be helpful to you and your listeners anytime.
00:34:25.580 Of course.
00:34:26.040 Of course.
00:34:26.420 Well, thank you.
00:34:28.580 Thank you guys for tuning in to the Gaines for Girls podcast.
00:34:31.240 Again, if you are a parent, if you have children, it doesn't matter what age, have the conversation
00:34:38.340 with them about the threats that exist, how people can be evil, can be horrific, how there are
00:34:46.240 manipulators out there who are capable of really, really awful things, and understand that no one
00:34:51.980 is immune.
00:34:52.720 Have that conversation.
00:34:54.000 Go back and listen to the episode with John Rich last week.
00:34:56.500 Listen to this episode this week.
00:34:58.600 Follow us on YouTube at youtube.com slash outkick.
00:35:02.060 That's where you can like and subscribe.
00:35:03.820 You can comment, you can engage in conversations with others in the comment section, which
00:35:09.300 we appreciate, and never miss an episode.
00:35:12.560 Your support means the world to me.
00:35:15.000 And I want you to go to Cozy Earth again, CozyEarth.com, and use my code GAINES for 40% off the
00:35:20.820 best sheets, loungewear, and pajamas.
00:35:22.640 You will not regret it.
00:35:23.920 Trust me.
00:35:25.400 Appreciate you guys.
00:35:26.780 And we will see you again next week.