Ep 1292 | Ben Gillenwater | Cybersecurity Expert Reveals Shocking Truth About Parental Controls
Episode Stats
Length
1 hour and 9 minutes
Words per Minute
171.4
Summary
Ben Gillenwater is a cybersecurity expert and founder of Family IT Guy, a platform for parents to learn cybersecurity best practices and skills for navigating technology and social media with their kids. In this episode, Ben talks about how he became concerned about the impact of social media on his son and how he decided to take his son off of it.
Transcript
00:00:00.000
Today's guest gave his five-year-old son an iPad, and then after seeing some of the
00:00:06.760
very disturbing material being fed to his five-year-old through what he thought were
00:00:12.820
very innocent apps, he was so disturbed that he not only took the apps off the iPad, but
00:00:24.220
This is a platform for parents to learn about cybersecurity, best practices and skills when
00:00:29.980
it comes to technology and social media with your kids.
00:00:32.860
Some of the things that he is going to share today will blow your mind and help you realize
00:00:37.060
the importance of us parents understanding AI and ed tech and technology, and taking these things seriously.
00:00:46.520
You're going to learn so much from this conversation with Ben Gillenwater.
00:00:50.640
It's brought to you by our friends at Good Ranchers.
00:00:59.980
Ben, thanks so much for taking the time to join us.
00:01:09.840
If you could tell everyone who you are and what you do.
00:01:26.160
What does cybersecurity look like when you're 14?
00:01:28.220
When I was 14, I was actually selling computers at a computer store.
00:01:33.680
But cybersecurity is about understanding all aspects of computer systems.
00:01:42.380
I think I found you on Instagram talking about the importance of security when it comes to kids and technology.
00:01:51.920
When did you start getting concerned about social media and technology for kids?
00:01:58.040
Well, basically when I screwed up and gave my kid an iPad.
00:02:07.960
I did all the things that if my current self was to go back, because it's been five years
00:02:14.280
now, he's 10 now, I find it to be almost laughable.
00:02:19.580
But it's not, because it's a normal experience that is common nowadays that I think most people have had.
00:02:26.880
And I found out within a couple of days how big of a mistake that was.
00:02:34.100
YouTube took him down some immediate rabbit holes of stuff that's not good for kids to
00:02:38.920
see with sexual undertones, violent undertones, things that are just addictive.
00:02:44.620
And so after a couple of days, I switched it to YouTube Kids.
00:02:55.160
And by the way, at the time, YouTube was run by a grandma.
00:03:01.960
And so I'm like, here's a product that's brought to market by a grandma and it's called kids.
00:03:08.640
And I would put some trust into the fact that their parental controls worked.
00:03:13.320
And when I installed the app, it asked me, how old is the kid?
00:03:18.420
They're YouTube, they've got filters, all the things, they have all the best AI, you know?
00:03:26.640
He saw nightmare characters that gave him nightmares for years. There's this character called Huggy Wuggy.
00:03:42.280
He thought that he liked it, but he had nightmares every night.
00:03:49.640
Huggy Wuggy also was associated with some kind of story that was telling kids that they should harm themselves.
00:03:59.020
I could tell it was doing stuff to his brain that is not supposed to happen.
00:04:07.020
And there's videos on YouTube Kids of superhero characters having sex with each other.
00:04:15.780
So I took YouTube Kids off and then we were left with an iPad with some actually kid
00:04:20.860
friendly games, just single player racing games.
00:04:27.400
But then the pattern kept continuing to where he would wake up and go to his iPad.
00:04:34.020
He would want to go to sleep later to play with the iPad.
00:04:37.120
He would come home from school and go straight to the iPad.
00:04:39.920
One of his best friends lives directly across the street.
00:04:46.040
And so, prior to the iPad, they'd get home from school, drop the backpack, and go outside to play.
00:04:50.200
And then it was get home from school, drop the backpack and go to the iPad.
00:04:55.300
So eventually we took the iPad away altogether and kind of ripped the bandaid off and went cold turkey.
00:05:01.760
And had a couple of weeks where it wasn't too terrible, but he definitely missed it.
00:05:08.580
But then he went right back to playing outside, which is what he still does.
00:05:15.920
And so tell us your thinking when he was five and giving him the iPad originally.
00:05:23.800
My thinking was that it would be an innocent source of entertainment and that if he's going
00:05:31.840
to watch TV, why not just watch sort of TV on YouTube?
00:05:39.320
So, you know, there's, cause there is a lot of good stuff.
00:05:42.680
There's very educational stuff, interesting stuff.
00:05:46.600
And so then he could pick his flavor and have fun with it.
00:05:56.100
But it's all the other stuff that surrounds the good stuff that leaks its way in.
00:06:03.340
Because YouTube, just like its parent company Google, is not a tech company; it's an advertising company.
00:06:14.420
They facilitate the sale of ads by building amazing technology and giving it away for free.
00:06:22.680
And I already knew all that, being in the IT space, you know.
00:06:29.100
I've designed computer systems for the Department of Defense, for the NSA, for state governments,
00:06:35.140
city governments, county governments, for some of the biggest companies in the world.
00:06:41.300
And so I understood all these mechanisms, but yet still fell into the trap.
00:06:47.240
So YouTube is an advertising company and the way they, they sell ads is by showing people
00:06:53.260
ads as much as possible, ideally ads that make sense to them personally.
00:06:58.040
So they track you and follow you and learn what you like.
00:07:01.320
And it makes for a very successful ad business.
00:07:04.180
And we know that Google is an advertising company because their annual SEC reports show that 76%
00:07:10.620
of their revenue is from selling ads, just like Facebook.
00:07:14.320
I think it's 98 or 99% of their revenue is from selling ads according to their SEC report.
00:07:19.080
So we know that they're an ad company and they're in the business of addiction because
00:07:26.160
if you can get people to stare at your ad feed all day, then you can make hundreds of billions
00:07:32.980
of dollars as they've been doing for quite some time.
00:07:36.740
Now you put a kid into that, so their systems are designed to addict people as it turns out
00:07:44.600
And if you're young and you have an underdeveloped mind, excuse me, then the system is especially
00:07:52.880
effective because we all know what it does to us as adults.
00:07:56.280
If you pick up Instagram or pick up YouTube and you start looking at it 45 minutes later,
00:08:00.260
you're like, well, where'd, where'd the time go?
00:08:02.280
You do that to a kid and it's, it's magnified to a greater effect.
00:08:08.080
You know, people don't really understand that when it comes to social media, you as the
00:08:15.400
individual are the product, because as you said, they are making money from the advertisers.
00:08:21.060
The advertisers only make money if the people who are using the social media platform are
00:08:27.960
clicking on them and looking, at least viewing, hopefully converting to actually purchasing the product.
00:08:35.000
Well, if you are selling a product to an advertiser and that product is a person, you have to know
00:08:43.780
everything you can about that product or about that person.
00:08:47.580
And just like if I were selling a product to you, this blanket, like I want to know
00:08:51.900
everything about it to try to convince you that you need to buy it.
00:08:56.500
That's what these companies are telling advertisers.
00:09:02.540
I've got these people who like this, who will click on this, who make this much money.
00:09:06.740
They want to learn as much about you as possible, including your children.
00:09:09.700
So these advertisers will pay for a spot on the bottom of the YouTube video.
00:09:18.080
And so parents just need to know that part of these platforms' jobs is to learn as much about your kids as possible.
00:09:30.160
I really like to think about attention as a currency.
00:09:34.120
Because for these products or these tools that seem free, or they're put to market as if
00:09:42.020
they're free, you don't have to exchange dollars to get them, but you do have to give your attention.
00:09:54.140
But I bring it up because I think we have kind of two fundamental currencies as humans.
00:09:59.380
We have time and we have attention and you can't get more of either one.
00:10:07.240
And actually none of us even know how much we have left to spend.
00:10:13.840
And so if I were to continue doing what I did and continue taking for granted that my son
00:10:23.000
should trade his attention for whatever it is that YouTube is going to show him,
00:10:28.620
one of the things that I'd be teaching him is that his attention is not that valuable.
00:10:35.360
And so in hindsight, and what I've learned since starting to focus on Family IT Guy,
00:10:40.080
because this is all I do now, I've been focusing on this for a couple of years.
00:10:45.100
In hindsight, I'm really glad that it played out the way it did.
00:10:49.500
Because I think problems are the best things to learn from, the best sort of sources of education.
00:10:54.020
And now I, now my son knows more than most kids about internet safety because of what I do.
00:11:01.000
But he also understands now about how valuable attention is and how I practice the budgeting of
00:11:09.540
my attention and where I give it and where I don't.
00:11:12.920
First sponsor for the day is Seven Weeks Coffee.
00:11:24.780
At Seven Weeks, that baby inside the womb is the size of a coffee bean.
00:11:28.020
Yet he or she, no matter how small she is, is made in the image of God.
00:11:34.320
Seven Weeks tries to save as many of those little baby lives as possible
00:11:37.900
by donating 10% of every sale of their coffee to pregnancy centers across the country.
00:11:43.260
They have already donated over a million dollars to these life-saving pregnancy centers.
00:11:48.540
These centers give free sonograms, free prenatals, parenting education classes,
00:11:52.780
adoption courses to these moms in need to help them make a life-affirming decision.
00:11:57.500
So when you buy from Seven Weeks Coffee, you're not only getting pesticide-free,
00:12:02.460
mold-free, sustainably sourced, great tasting, I can attest, coffee,
00:12:06.880
but you're also allowing your coffee to serve a higher purpose.
00:12:09.960
Plus, when you subscribe, you save 15% on your order.
00:12:15.280
And when you use my code Allie, you get an extra 10% off your first order.
00:12:28.960
You took away what you thought was the bad content through YouTube,
00:12:31.920
but allowed him to keep doing, you know, the race car games and other games on the iPad.
00:12:37.800
But, as you said, he was budgeting his attention the wrong way.
00:12:52.120
There was another kid at the playground that had a Huggy Wuggy,
00:12:56.020
like stuffed animal, plush toy type of a thing.
00:12:58.800
And my kid was drawn to that thing, like he walked over on a mission,
00:13:06.680
and I thought he was going to go hang out with the kid.
00:13:12.400
And it had this draw on him that I have never seen since.
00:13:16.540
And that's why this day at the park stands out so much.
00:13:19.220
It, that character programmed something in his brain in a bad way.
00:13:25.020
Where he was trying to take that toy from that kid.
00:13:28.180
It was like, that's not his character normally.
00:13:36.260
Now, now granted, that's a, that's specific to a particular character,
00:13:40.820
But it was facilitated by this advertising platform, which I think just
00:13:46.480
demonstrates the potential: the platform itself is addictive.
00:13:50.780
And because of that, the, when you publish shows as a publisher to YouTube,
00:13:55.540
the more addictive your shows are, the more you get paid.
00:13:59.060
And so it actually, it actually incentivizes addiction all the way down the chain.
00:14:06.560
so we can see this in a lot of the kids shows nowadays where it's really,
00:14:11.300
really high stimulus, really fast paced, really oversaturated colors.
00:14:28.580
If you, if you release a 30 minute video and you have a million people that
00:14:33.580
watch it a day and they on average watch 20 minutes of that 30
00:14:36.480
minute video, you're going to make a lot of money.
00:14:39.400
And so Cocomelon is a really good example of this high stimulus content.
00:14:47.420
And that's something I've started to pay attention a lot to.
00:14:50.800
And I have some articles on my website about how to identify low stimulus
00:14:54.700
content and how to count the number of scene changes per minute.
00:14:58.140
And so if you, if you watch a modern show, watch how many cuts there are per
00:15:04.820
minute, there could be 10, 20, 30, 60, just boom, boom, boom, boom, boom.
00:15:11.640
Whereas if you watch an older show, it changes much less frequently.
00:15:17.880
The content producers that made old cartoons weren't trying to
00:15:23.380
maximize for keeping you from switching to the next channel, because back then
00:15:29.180
there were only, you know, three TV channels and one of them had cartoons.
00:15:32.180
And so that's the one you're going to watch, you know. Or like, you've seen that movie.
00:15:38.380
There's this one scene in that movie that stands out to me as the perfect example
00:15:43.520
It's when, uh, the, the nanny, and I can't remember her character's name.
00:15:47.760
She first gets to the house in the beginning of the movie, the big house.
00:15:51.700
And she walks into the entry of the like grand kind of foyer.
00:15:56.820
And she stands there and the scene is silent and it doesn't cut and there's no music.
00:16:04.240
And the house is silent because there were no appliances running.
00:16:08.500
You know, it's filmed in the thirties or forties.
00:16:10.960
Actually, it was filmed after that, but it was taking place in that era.
00:16:14.560
And she just stands there and it's this scene that lasts for a little bit.
00:16:24.620
And so that's something I like to think about too.
00:16:27.840
And what I suggest parents look at, when their kids are watching stuff on
00:16:32.140
their devices nowadays or on TV, is the stimulus level, because that
00:16:36.600
sets the bar for their activity levels in their brain.
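As a rough illustration of the cut-counting idea Ben describes, the sketch below estimates scene changes per minute by comparing histograms of consecutive video frames with OpenCV. The file name and the 0.5 similarity threshold are arbitrary placeholders rather than anything from the episode, and a real editing analysis would need something more robust than this.

```python
# Rough sketch: estimate scene cuts per minute by comparing histograms of
# consecutive frames. The threshold and file name are illustrative guesses.
import cv2

def cuts_per_minute(path: str, threshold: float = 0.5) -> float:
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # fall back if FPS is unreadable
    prev_hist, cuts, frames = None, 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames += 1
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [64], [0, 256])
        hist = cv2.normalize(hist, hist).flatten()
        if prev_hist is not None:
            # A low correlation between consecutive frames suggests a hard cut.
            if cv2.compareHist(prev_hist, hist, cv2.HISTCMP_CORREL) < threshold:
                cuts += 1
        prev_hist = hist
    cap.release()
    minutes = frames / fps / 60.0
    return cuts / minutes if minutes else 0.0

print(f"{cuts_per_minute('kids_show_clip.mp4'):.1f} cuts per minute")  # hypothetical clip
```

On a measure like this, the kind of high-stimulus modern show he describes would score tens of cuts per minute, while a long static scene like the one he recalls would score close to zero.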
00:16:50.820
Next sponsor is Legacybox. They send you a big cardboard box and you put inside it all of your VHS home
00:16:55.540
videos, your CDs that maybe have old pictures on them that you tried to digitize in the past.
00:17:00.980
Any other photos you have, old Polaroids, whatever you have that contains these
00:17:07.240
precious memories from the past half century that you want to make sure you preserve.
00:17:13.180
You put all of these things in the box, mail it back to them.
00:17:15.600
They digitize all of it, put it on a little file for you to keep, for you to put on
00:17:19.640
your computer, for you to be able to pass down to your kids and grandkids.
00:17:25.460
You want to make sure that it's well organized, that you can always enjoy them and look back on them.
00:17:30.900
And so work with Legacybox to preserve those memories for future generations.
00:17:35.920
I know that my mom has a lot of home videos that I just find so precious and I want to
00:17:40.600
be able to show them to my own future generation.
00:17:46.560
You'll get 55% off when you use my link, legacybox.com slash Allie.
00:17:55.460
There was this interesting article about Cocomelon a few years ago that we talked about
00:18:00.860
before, but it was basically how the creators of Cocomelon will sit kids in front of Cocomelon
00:18:06.540
and as soon as they divert their attention away, that tells them that the frame rate of that scene needs to change.
00:18:13.040
And so if they speed it up or add a color or something, then they can keep the kids' attention.
00:18:18.220
So that speaks to your point that these are all, this is deliberate.
00:18:22.100
This is a part of an addiction mechanism in order to sell ads to people as the product.
00:18:28.740
And I do think we have a responsibility as parents.
00:18:31.720
We're not anti-technology completely in our home, but we are very careful and we have a
00:18:36.300
very narrow, like very narrow regulations for what's allowed.
00:18:41.040
And we've also read, Jonathan Haidt talks about this a lot, that the smaller the screen,
00:18:45.840
the worse it is for kids and their attention levels.
00:18:48.240
I think that's also true for me, which is why watching a movie as a family on the TV
00:18:53.860
is different than a child taking an iPad and playing games or even watching a show on there.
00:19:02.960
And there was something else I wanted to add to something that you said about the fuzzy characters.
00:19:07.760
I would be so interested to hear someone who maybe has studied this, but there is something
00:19:12.240
particularly addicting and gripping about the cute but disturbing combination of characters.
00:19:24.560
Balenciaga had this weird advertising thing a few years ago where they were like, had these
00:19:31.840
little kind of cute bears, but they were dressed in BDSM gear and also looked kind of dead, and
00:19:38.240
they were putting them in advertisements with children. To me, that feels very demonic and
00:19:43.860
very disturbing. But that there's something psychologically addicting and captivating about
00:19:51.120
the cute but disturbing or cute but macabre style or design. And so I think it's interesting that kids
00:19:58.960
are being fed that on these platforms and then it's hooking them for some strange reason.
00:20:03.920
Yes. There's a couple of things that that makes me think of. One is the concept of grooming.
00:20:12.440
So I've been interviewing psychologists and internet crimes against children detectives and anybody
00:20:19.060
I can talk to that understands how kids get victimized. And I've learned that the definition
00:20:25.780
of grooming is quite simple. It's getting somebody used to something that they weren't used to
00:20:31.180
before and making it seem normal. And that's what that sounds like.
00:20:39.900
You know, I think another thing that's interesting too with this stuff is that
00:20:43.220
there's a very serious effect that this type of getting used to things, whether it's
00:20:51.020
chaos or darkness or, um, you know, sexual things, can have. So, okay.
00:21:04.640
Speaking of darkness, there's some dark kind of statistics that people should know
00:21:09.080
about what it means to put a kid in front of an advertising platform that incentivizes addiction:
00:21:17.760
social media, YouTube, stuff like that. All the big, you know, Instagram, TikTok, Snapchat,
00:21:24.960
the new OpenAI Sora, their video creation tool, or what masquerades as a video creation tool,
00:21:30.260
but it's actually like their own version of TikTok.
00:21:31.920
Um, so I've heard a lot about, and maybe you have as well, a lot about anxiety and depression
00:21:41.460
occurring much more frequently in children than it ever has.
00:21:47.880
And I was preparing for a lecture a couple of years ago to talk about this stuff. And I was like,
00:21:54.180
well, okay, but how do you tell, with the problems we talk about as a society, how do we
00:22:00.600
tell if the problems are actually getting worse or if it just seems like they're getting worse?
00:22:05.280
Cause we have more access to information than we've ever had. And with anxiety and depression,
00:22:10.580
you can't really tell because there's not really great data. Those, those things are kind of hard
00:22:15.060
to measure. You can measure, um, prescription rates and apparently those have gone up really high,
00:22:20.300
but something where there is really good data is death statistics, and the World Health
00:22:27.000
Organization publishes an open mortality database. And they have since 1951. I don't know if they've
00:22:34.600
published since 1951, but the data goes back to 1951. So I went on their website and I said, okay,
00:22:40.300
I want to see if there's changes over time and death statistics that would be related to the outcomes
00:22:45.500
of anxiety and depression, which would be suicide. And so I asked it to show me suicide stats for young
00:22:53.280
people ages zero to 39 from 1951 to 2019. Cause I wanted to cut it off before COVID.
00:23:00.980
And what I found was very disturbing. In 1951, for young people as young as 10, they do five
00:23:09.420
year age groups. So the age group of 10 to 14 specifically had a 1% rate of self-inflicted death.
00:23:15.840
One out of every 100 deaths that occurred amongst 10 to 14 year old children was suicide, which by
00:23:23.440
itself sounds high. Like I actually thought it would be lower than that. I mean, you can't get much
00:23:28.700
lower, but you know, less than 1%. In the eighties and nineties, it went up to 5%.
00:23:33.540
From 2007 to 2019, it tripled. And it's now almost, on average across ages 10 to 14,
00:23:45.720
15 to 19, and 20 to 24, one out of every five deaths is self-inflicted.
00:23:55.320
And I have a chart on my website. There's an article called Digital Danger Zone on
00:24:00.660
familyitguy.com, and I show this. I charted out the data and you can see on these charts
00:24:08.600
the icons of all the social media platforms that were released every time the graph went up
00:24:13.960
and it correlates directly to social media being in our kids' pockets and backpacks and bedrooms.
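The calculation Ben describes, the share of deaths in each five-year age group that are self-inflicted, can be sketched roughly as below. The CSV layout (year, age_group, cause, deaths) is a hypothetical simplification for illustration, not the actual schema of the WHO Mortality Database, and the numbers printed depend entirely on the extract you feed it.

```python
# Sketch of the calculation described above: for each year and five-year age
# group, the share of all deaths that were self-inflicted. The CSV columns
# (year, age_group, cause, deaths) are a hypothetical simplification, not the
# actual WHO Mortality Database schema.
import csv
from collections import defaultdict

def suicide_share(path: str) -> dict:
    total = defaultdict(int)    # (year, age_group) -> deaths from all causes
    suicide = defaultdict(int)  # (year, age_group) -> self-inflicted deaths
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (int(row["year"]), row["age_group"])
            deaths = int(row["deaths"])
            total[key] += deaths
            if row["cause"] == "self_inflicted":
                suicide[key] += deaths
    return {k: suicide[k] / total[k] for k in total if total[k]}

# Hypothetical usage: print the trend he charts for the 10-14 age group.
for (year, group), share in sorted(suicide_share("mortality_extract.csv").items()):
    if group == "10-14":
        print(f"{year}  ages {group}: {share:.1%} of deaths self-inflicted")
```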
00:24:24.000
And so when you put kids in front of an addiction platform, it's very problematic.
00:24:30.020
I used to tell people to be moderate, and that if you want to let your kids
00:24:35.220
use Instagram, Instagram has parental controls, which by the way, after studying them, they're
00:24:41.140
worthless, but you know, maybe don't let them be on it all day, but if they want a little bit like
00:24:46.820
no big deal. And then I learned these statistics about one of every five deaths amongst as young
00:24:54.580
as 10 years old being self-inflicted. And so now there's no nuance in my mind.
00:25:04.840
If you are like me, you really care about sleep. I love sleep. If I could, I would sleep in every
00:25:11.560
single day. I would sleep in till 10 AM. Now that has probably not happened since I was in college,
00:25:16.960
but I just love to sleep and to feel well rested. I really care about my sheets. I care about my
00:25:23.080
pajamas. I care about the right setting to make sure that I feel as restful as possible. And a
00:25:28.300
huge part of that is your mattress. That's why I'm so excited to partner with Ghostbed. It's such
00:25:34.940
an interesting concept that they have a cooling mechanism, cooling features in every single
00:25:41.220
mattress that actually senses your body temperature and adjusts based on how warm you are, how cool you
00:25:48.000
are. So you never really get hot or cold and you keep comfortable all night. And actually that is a
00:25:54.500
huge part of staying asleep, how warm or how cold you are. Ghostbed has layers of perfectly crafted
00:26:00.840
support that adjusts with you. It's a family owned company. They're awesome. They share our values.
00:26:06.220
Go to ghostbed.com slash Allie. You'll get an extra 10% off plus a 101 night sleep trial. That's
00:26:14.920
ghostbed.com slash Allie. Going back to the grooming conversation, obviously that happens a lot online,
00:26:27.060
but it's not through just the means that maybe I thought when I started using the internet,
00:26:32.160
which is probably, you know, we had a family computer, so it's a little safer, but I was nine
00:26:36.640
years old on AIM and stuff like that. And you just wouldn't accept messages or, you know, instant
00:26:42.820
messages from people you don't know, kind of like you don't talk to a stranger in public.
00:26:48.040
But now it's happening through games, through places that you don't even know have chat mechanisms
00:26:53.200
like Roblox or other places where, you know, six-year-old boys are, they're now being connected
00:26:59.620
somehow through video games or these computer games to older predators. Can you tell us like
00:27:05.020
what's going on there and what parents should look out for?
00:27:07.600
Yes. And I'm really glad you brought that up because there's, there's really, there's two
00:27:11.960
big problems everybody should focus on. There's a million things to know about the internet,
00:27:16.520
but if you focus on two, it knocks out about 90% of the problems. The first one we talked about is
00:27:22.620
social media and the suicide rates. The second one is extortion, or some people call it sextortion.
00:27:29.460
Yeah. And that is facilitated by programs that have a chat function. And actually I have some statistics on that.
00:27:38.680
Yeah. That'd be great. Yeah. We've talked to, unfortunately, parents whose kids have died by
00:27:46.760
suicide because they were sextorted. They were 16, sent a picture to someone that they thought was
00:27:52.060
a girl that they liked. It turns out to be some fraudster from Nigeria, something like that. And then
00:27:58.500
we talked to a dad whose son, they were in bed, him and his wife, they heard the gunshot and, you know,
00:28:04.420
teenage son killed himself because of that. And so parents need to know, and this is a good family
00:28:10.100
with present parents, Christian, you know, Christian family who talked to their kids, had a good
00:28:15.360
relationship. And so parents just need to know this is something that your kids have access to if they
00:28:21.800
have a device. Yes. It's, yeah, I recommend people look up the story of Jordan DeMay. He was a boy in
00:28:31.100
Michigan that, that committed suicide because of sextortion. Same thing. Good kid, good family,
00:28:36.520
good school records, had a girlfriend and got caught up with a, with a, what turns out to be a
00:28:42.320
Nigerian gang. Right. The same people that used to do the Nigerian prince scams. Yeah. It's the same
00:28:50.100
people. Yep. They're called the Yahoo Boys. I didn't know that. Yeah. So it's a, it's, it's a gang in
00:28:56.320
Nigeria. I didn't know it was a specific group of people. Yeah. I know that you want to read the statistics
00:29:00.600
and I want to hear that too, but is there any more that we should know about this group?
00:29:05.880
Yeah, they, it's an, it's an organized criminal thing. In fact, many, so they're not the only ones.
00:29:12.720
So I'll talk about them specifically, but the pattern applies elsewhere. It's, it's a business.
00:29:19.200
And what they do is they identify weakness in people. So it used to be poorly
00:29:27.120
worded emails, tricking people into sending money. Now it's very well worded and well informed AI
00:29:35.600
powered hunting programs, where they go and find teenage boys. Teenage boys specifically are targeted for this
00:29:43.240
in particular. Um, and they exploit their biology. And so what they'll do is they'll find the profile
00:29:52.820
of, of a girl in a nearby town, oftentimes a real girl. And then they'll message the teenage boy. Hey,
00:30:02.700
I go to high school over here and you go to high school over here and how's it going? And then,
00:30:07.460
you know, flirting and whatnot. And then eventually like the girl will send a naked picture.
00:30:11.220
And sometimes they do that through Photoshop AI. Well, so they do it through a lot of AI image
00:30:18.760
generation. They actually hire models now too. So there are, well, okay, actually I should be
00:30:25.140
probably more fair in saying that these models probably are not hired.
00:30:31.040
But they could be real pictures online that these guys are getting? Yes. And sometimes they are,
00:30:36.760
there is a real girl on the other side that, that is in a studio with a green screen that that's
00:30:42.840
participating. And so they'll send a photo to the boy and then, okay, send one back, you know,
00:30:49.080
go in the bathroom, take your pants off and whatever. And then the moment that he does,
00:30:53.420
which if you think about it, I mean, what boy's not going to do that? That's, that's just the way
00:30:59.240
we work inside. Right. And so, so Jordan DeMay did that, sent a photo back and then the blackmail
00:31:07.220
begins. Okay, now we have you: send us iTunes gift cards. That's apparently the international
00:31:13.220
currency of, of these things. We want, you know, $200 in iTunes gift cards or $500 in iTunes gift
00:31:20.800
cards. And then, so the, sometimes the kids are able to actually gather it up and the parents will
00:31:25.120
eventually notice, like, weird charges on their credit card or something. And I'll say
00:31:30.860
right up front, never, ever pay these people because paying them doesn't make it stop. Paying
00:31:35.120
them makes it worse. You pay them once, they come back for more. And what they do when
00:31:42.440
establishing the initial connection is they study all of your friends on Instagram and gather up your
00:31:47.740
whole network. So they know everybody you go to school with, they know everybody you go to church
00:31:51.460
with, they know every family member. And that's how they blackmail you: they're
00:31:55.420
going to send your naked photo to all those people. And so, take Jordan, who had a girlfriend,
00:32:01.080
which then amplifies the, the negative results because I'm a good kid. I've been following the
00:32:07.600
rules my whole life and oh my God, now I'm, now I'm really, I'm in trouble here. And so that,
00:32:13.280
that's roughly how it goes down. So this occurs everywhere: there's South American gangs, there's Asian gangs,
00:32:18.220
there's African gangs, there's European gangs. It's, it, it's a very high profit, very low effort
00:32:24.980
endeavor, part of which is automated. Yeah. And so there's two types of attackers.
00:32:31.600
There's the criminal networks that are extorting biology for money. And then there's your,
00:32:37.980
I don't know how to describe it. Uh, traditional creeps who get sexual pleasure with children,
00:32:47.080
which, you know, I don't like saying those words, but apparently there's, um, enough
00:32:53.660
people that have that proclivity that it's a really big problem. And I, and I'll show you,
00:32:58.040
I'll tell you those statistics here in a minute. Now those people will identify kids that are
00:33:07.000
vulnerable. So they're in a bad place. They're expressing depression. They're expressing sadness.
00:33:11.880
They're expressing frustration with their family and doing so on the internet. And then they,
00:33:17.460
they become targets or, or based on what kind of photos they post or, you know, I mean, there's
00:33:23.160
some people that have done, done research where they'll put up an Instagram profile as a 12 year
00:33:27.460
old girl. And it takes about a minute until they get their first sexual message. Right.
00:33:33.180
Like explicit. This is what I'm going to do to you. Right. To young girls. Yeah. Gosh,
00:33:39.920
there's so many avenues here. First of all, your young girls should not be on social media. They
00:33:44.960
shouldn't have the internet. But for your child, male or female, especially if you say, okay,
00:33:50.120
I'm going to hold off until they're 16, even when they get Instagram when they're 16, you still have
00:33:54.700
to have these conversations with your son. One, don't ever send pictures. I don't care if it's a
00:34:00.500
girlfriend. I don't care if it's a girlfriend. I mean, there's so many reasons we could talk morally
00:34:03.960
why, but then also just safety wise. If you ever do, there's nothing that you do that can make me
00:34:10.620
ever stop loving you. I will always love you. I will always be here to talk to you. I will help
00:34:16.520
you get out of trouble. You are never stuck and you are never alone. No matter what happens,
00:34:21.760
there's no amount of shame that you feel that should stop you from coming to me. Cause I always love
00:34:25.660
you. I mean, those kinds of conversations preemptively with kids about like safety and about
00:34:30.480
always tell me if something is happening and it will be okay. We'll figure it out together.
00:34:36.140
We'll figure it out together. Um, those conversations have to be had up front and parents can't just
00:34:41.920
think, well, that's never going to happen to my kid. Cause my kid's smart. You know, a lot of these
00:34:45.460
kids, like the kid you were talking about smart, good kids and you know, they, they made a mistake
00:34:51.220
and it can be really, really easy to be deceived, especially when, you know, you're in high school
00:34:57.380
and popularity or people liking you as your currency. And that it takes up a lot of your
00:35:01.900
fulfillment. So parents just need to be aware. That's spot on. The conversation, the get out of jail
00:35:07.620
free card. And in fact, as many get out of jail free cards as needed, cause by the way, this happens
00:35:12.760
multiple times to the same kids. There's boys that'll fall victim to this three, four times,
00:35:18.100
right? Because we're not wired for this. None of us are. We're not wired for this as parents. We're not
00:35:24.180
wired as kids to deal with being attacked by bits and bytes and, and invisible strangers that
00:35:33.020
connect from far away. Our DNA doesn't have that encoded in it. Our DNA just sees
00:35:42.280
this attractive girl, you know: I'm 16, my brain has currently shut off, I'm going to do what my
00:35:49.220
biology tells me to do. Right. Over and over and over. Um, and, and AI nowadays, I'm guessing AI is
00:35:57.640
used a lot by these guys over there to sound like an American girl. It's very believable. It doesn't
00:36:03.960
sound like a robot. And this is kind of like another thing, but you've seen these, um, kids be convinced
00:36:10.580
even to kill themselves by their chatbot girlfriend that they thought was real or they fell in love
00:36:16.060
with because it's so human-like and to your point, like technology has evolved really quickly and our
00:36:21.860
brains have not caught up. Like we just have not been able to sometimes separate, oh, this is not
00:36:28.860
real. I shouldn't talk to it like it's real. And it has no real bearing on my life, especially when
00:36:34.140
you're a teenager. Yeah. I mean the AI thing, another story people should look up and I wish I didn't have
00:36:39.780
to recommend that anybody look up these awful stories, but you should, um, is a kid called
00:36:44.820
Adam Raine, a 16 year old boy that ChatGPT helped commit suicide. Yeah. Helped him tie the noose,
00:36:54.060
helped him optimize his suicide note and convinced him not to tell his mom. He told it he wanted to
00:36:59.360
tell his mom and it told him that he didn't owe her anything. Right. So yes, the, the, this false
00:37:07.380
connection thing that, so there's 5 billion people on the planet that use social media
00:37:13.260
every day. We're all falling victim to this false connection thing because what's going
00:37:18.900
to happen if you don't use Instagram today? Nothing. What's going to happen if you never
00:37:25.440
use Instagram for the rest of your life? Good things, generally. What will you
00:37:31.100
miss? Nothing. What will you gain? You know, potentially everything, your attention,
00:37:35.620
your time, these things. Um, you know, but then a lot of parents say,
00:37:42.200
well, okay, so then, you know, the Family IT Guy is saying that I should not expose my kid to
00:37:47.920
social media. I should not expose my kid to chat, which is built into everything, all the games and
00:37:53.040
all the popular programs. And I shouldn't expose my kid to AI. Cause by the way, one of my things I say
00:37:58.740
a lot is never use, never let kids use AI alone. It should be a 100% supervised activity. 100%, not 99%,
00:38:08.860
100%. If you get up to go pee, lock it. And so a lot of parents say, okay, well, okay. So I'm
00:38:16.340
removing all the tech. How's my kid going to grow up in a tech world? How are they going to know about
00:38:20.460
tech? How are they going to survive? And, and my response to that is this stuff is not educational.
00:38:28.220
Tech is a broad term that means a million things. If my kid's going to grow up to be a mechanical
00:38:35.700
engineer, he will use a computer and he'll use drafting programs, probably use a digital stylus
00:38:41.740
in addition to maybe perhaps a pencil or whatever. Those are specific skills that are learned and tools
00:38:48.800
that are learned. You don't have to just be generally on the internet to know how
00:38:53.540
to be a mechanical engineer or a pilot or a dancer or whatever you're going to do. Now, I don't know
00:38:59.500
if being on the internet is training you to be a pilot. And plus, those technologies are designed so that even a
00:39:07.560
three-year-old can navigate it. I mean, the way that my kids who we don't let them use those devices,
00:39:12.960
but if they pick up my phone, swipe, swipe, swipe, swipe. They know exactly how to get there. I mean,
00:39:19.400
it's just easy and it's kind of intuitive. They just see us do it. And so I'm not worried about,
00:39:24.880
you know, your 16 or 18 year old eventually learning how to use an iPhone. It's not going
00:39:29.960
to be a difficult thing for them to learn. No, no. And they'll know how to operate a keyboard
00:39:41.920
My next sponsor is Preborn. They partner with pregnancy centers across the country to make
00:39:46.560
sure they have the tools they need to help pregnant women make life-affirming decisions.
00:39:50.840
They have a big goal right now, and that is to give ultrasound equipment to every single center in
00:39:57.780
America who wants one, their own ultrasound equipment. Sometimes they have to hire,
00:40:02.480
you know, a tech to come in or they have to outsource that to other people, but we want to
00:40:07.280
make sure that they have their own equipment, their own people who are equipped to use a sonogram so
00:40:11.540
that when a woman comes in, she's abortion minded, they can pull her in. They can show her her baby,
00:40:17.300
let her hear that beating heart, see that she's been lied to. It's not just a clump of cells. This
00:40:21.680
is a baby. That's a part of her. When she can see and hear that life inside her womb, she is so much
00:40:26.820
more likely to choose life. Preborn is helping to make that happen. And if you donate just 28
00:40:32.440
dollars that covers the cost of an ultrasound session for a woman who may be considering
00:40:37.420
abortion, you can help save a life. Go to preborn.com slash Allie, donate $28 or whatever
00:40:43.200
you can. That's preborn.com slash Allie. And then another fear is, well, but if my kid,
00:40:53.660
you know, shoot, my kid's, let's just say, 15, they're in high school. Everybody's on Snapchat. They don't
00:41:00.320
text each other. They don't call each other. All the hangouts, all the parties, all the sports
00:41:04.420
events, they're all on Snapchat or they're all on Facebook or whatever. They're going to lose their
00:41:09.420
social life. And it turns out that that's not true. Yeah. It seems like it. And it's fair to think
00:41:15.060
that, but actually what happens, and I just spoke with a guy, uh, this guy Mike McLeod, who has
00:41:21.280
personally helped over 500 families disconnect from social media and has seen the patterns over and over
00:41:27.480
and over again. And he said, there has not been a single exception to there being only positive
00:41:33.900
outcomes. And in fact, the social life of the kid improved. I believe that because they gained
00:41:40.040
actual relationships. Yeah. Their friends on Snapchat, those aren't their friends. Those are
00:41:47.180
just other kids on Snapchat. There's no relationship there. Now they might have a relationship outside of
00:41:52.600
Snapchat for sure. But so the, the, a lot of the fears that are rooted in, well, then my kid's going
00:41:57.880
to miss out, my kid won't be set up, it's actually totally the opposite. Yeah. The less internet, the
00:42:05.940
better. Yeah. If you want to set your kids up for success, you should minimize internet exposure as a
00:42:13.340
whole and minimize, like you mentioned, the smaller screens thing. Actually what that is,
00:42:19.660
is it's interactiveness. It's the screens that you can interact with. True. Yeah. So the non-interactive
00:42:26.580
screens, like the big screen on the wall where you watch TV. It depends, you can interact
00:42:31.460
with Netflix and stuff, but generally, if you're going to watch a half hour show,
00:42:35.040
it's a one way type of a deal, as opposed to something that's in your hand and
00:42:39.860
you can interact with it. Right. That's one of the big differences. And so if you have interactive
00:42:45.840
internet connected technology, I think it's fair to assume that it's a danger or a detriment by
00:42:54.640
default. And so I suggest to people, so in the, in the tech world, we use these terms, whitelist and
00:43:01.620
blacklist when we want to block or filter things. A whitelist is where nothing is allowed except for
00:43:09.680
specific things that I allow. A blacklist is where everything's allowed except for a specific list
00:43:17.220
of things that I disallow. And I think when it comes to kids and the internet, you should take
00:43:22.600
a whitelist approach because the internet has an unlimited number of things. And the, the fundamental
00:43:31.020
concepts that underlie most of the dangerous things we interact with are things being free
00:43:36.560
or having chat. Those are the two most dangerous things you can expose a kid to
00:43:41.800
excluding the whole AI conversation. Cause that's its own beast. But yeah, you know, so I think,
00:43:47.860
I think taking a minimal whitelist approach to exposing kids to the internet is the way to go.
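A minimal sketch of the two filtering modes he contrasts: a blacklist allows everything except an explicit block list, while a whitelist denies everything except an explicit allow list. The domains below are made-up examples, not recommendations.

```python
# Minimal sketch of the two filtering modes. The domains are made up.
ALLOWED = {"khanacademy.org", "wikipedia.org"}   # whitelist: only these pass
BLOCKED = {"known-bad-site.example"}             # blacklist: only these fail

def whitelist_allows(domain: str) -> bool:
    # Default deny: nothing gets through unless explicitly allowed.
    return domain in ALLOWED

def blacklist_allows(domain: str) -> bool:
    # Default allow: everything gets through unless explicitly blocked.
    return domain not in BLOCKED

for d in ("khanacademy.org", "brand-new-unknown-site.example"):
    print(d, "| whitelist:", whitelist_allows(d), "| blacklist:", blacklist_allows(d))
```

The unknown new site in the example is the point: under the whitelist it is blocked by default, which is why he argues that approach fits an internet with an unlimited supply of new things.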
00:43:52.980
Yeah. You know, and it's hard to do. The tools that exist right now are not that great.
00:43:59.960
Yeah. Apple has screen time parental controls that are so difficult to set up that I had to write an
00:44:07.540
82 page guide with 330 screenshots to help parents get through it. Yeah. Um, the Google Family Link
00:44:16.480
system is similar. That's built into Android and it's built into Chromebooks. And then, you know,
00:44:23.360
there's a plethora of third-party companies that do great work that actually have to
00:44:28.240
sell dedicated devices that are pre-configured from their factory to be in a whitelist mode.
00:44:34.700
Yeah. So like Bark is a big one, uh, Pinwheel, Qustodio, MMGuardian. Um, these are the things
00:44:43.620
we have to do. And then it's expensive, because how often do people want to
00:44:48.660
go buy a separate device? You know, if you already have one, so it's, it's a challenging ordeal. It's a
00:44:54.920
really difficult time to, to try to manage all this stuff. It's not easy. It takes a lot of
00:45:00.080
discipline, diligence, and bravery as parents, even more than the kids, because when your kids are
00:45:08.400
young, I think some parents use the tablet as a pacifier. I want to be able to enjoy dinner. I
00:45:13.000
want to be able to sit in peace and you're choosing kind of the instant gratification of peace for an
00:45:20.320
hour long dinner. But you're really deferring their maturity and their ability to sit
00:45:26.580
still, their ability to build relationships, their ability to make eye contact. You're exchanging their
00:45:32.280
long-term betterment for your short-term quiet, which I, as a mom of three little ones, I totally
00:45:38.320
understand. Like I, I understand the temptation there, but you know, the statistics don't lie. And the
00:45:45.520
thing, one thing that really worries me is ed tech and the schools, even Christian private schools
00:45:51.360
relying on iPads and tablets in kindergarten. And when you ask them, as I have, what research do you
00:45:58.720
have that shows that this is better for them than reading books? What research do you have that shows
00:46:04.100
that this is better than them using a pencil and paper? Because that's what I want to see. It doesn't
00:46:08.760
exist. It wires a different part of your brain when you're swiping on an iPad versus when you're writing
00:46:14.420
and when you're cutting and when you're holding a physical book, there's something different about
00:46:18.580
it. And yet these schools that parents are paying tens of thousands of dollars for their kids to go
00:46:24.400
there, they rely on this technology that is making your kids in a lot of cases dumber. I'm not saying
00:46:30.820
that your kid will be dumb, but they're probably going to be dumber than they could be if you were
00:46:35.860
using the right tools to really educate their brains. But it takes a lot of effort as a parent to
00:46:43.540
try to get the school on board or to homeschool your kids or to find an alternative or to opt out.
00:46:49.900
And it also takes a lot of confidence in saying, my kid might be left out. Other parents might think
00:46:55.040
I'm weird. I might be the only person in my community who cares about this and everyone
00:46:58.940
thinks I'm just making a big deal, like I'm some Puritan or something. It's actually, I think,
00:47:04.100
really more of a lift to get parents on board than to get kids. Because kids, they might go through
00:47:09.960
that detox period. But then eventually they're like, oh yeah, yeah. Like going outside is fun.
00:47:14.500
You know? So that to me is like the big, one of the big obstacles that we face.
00:47:19.820
Huge obstacle. And by the way, I've never heard a single story about any kid that's gotten
00:47:26.760
disconnected from the internet that had regrets.
00:47:30.780
Every story I've heard is the opposite, where they all were thankful.
00:47:33.500
And the ed tech thing, there was actually, I was just watching a thing yesterday. There was some
00:47:38.580
folks talking to some of the politicians in DC about ed tech and sharing some of the data that's
00:47:44.700
coming out about how it's very detrimental. I, you know, there's a, there's a concept in,
00:47:50.700
in technology, there's this, um, technology adoption cycle thing that is the shape of a bell curve.
00:47:57.780
And on the far left of the bell curve is when tech is brand new and the people that are, uh, that use
00:48:06.200
those technologies are the earliest adopters, the earliest beta testers, the experimenters,
00:48:11.800
the ones who are okay with whatever downsides come because they want to try the new thing or
00:48:16.420
because it's useful for their business. And then as you reach the top of the middle of the bell curve,
00:48:20.720
that's general adoption where most people have probably heard about the thing, maybe even tried
00:48:25.320
it themselves. And then you have the late cycle on the far side, which is that's where even your
00:48:30.120
grandparents have used it. You know, Facebook is on that end. I think about that because when you
00:48:37.100
expose kids to technology, the kids should never be early adopters of technology. Right. And ed tech is
00:48:43.720
all early adoption. Yeah. The devices, the software systems, the education patterns, the, the psychological
00:48:52.800
effects on the teachers and on the students both. And you see it, I think it's driven by two things.
00:48:59.140
One is budget. The administration of a schooling environment is made more efficient. If you can
00:49:06.440
automatically grade papers using AI. Yeah. You know, you can collect and issue all of your assignments
00:49:13.300
in a digital way. Yeah. And then you've got what really drives it actually. And I, and I've seen this
00:49:19.500
myself, I worked for this big defense contractor for a while, and this was when, like, iPads were first coming
00:49:24.900
on the scene. And I saw the cool factor poison people's decision-making capabilities where for
00:49:34.720
example, we committed a bunch of budget at this defense contractor I worked for to buying iPads for
00:49:40.380
the executives only because they were cool. And those same executives that exist in governments
00:49:49.900
and in education departments and in school districts, they want the cool stuff too. And we see this in
00:49:56.140
every industry. That's why all the cops have military gear, you know, cause at the, when it was first
00:50:02.080
made available and the DOD started selling tanks and all their stuff to the cops, they're like, Oh yeah,
00:50:07.120
now we can cruise around like special forces guys. And it's the same thing with tech, you know,
00:50:12.100
it's cool. So we want it. And the executives get to show off that they have bigger budgets and cooler
00:50:16.560
toys and better stuff. And in this case, unfortunately it's a massive detriment to the students.
00:50:24.600
If we, if we facilitate the degradation of learning in kids, what does that mean for the future of
00:50:31.560
humanity? Yeah. You know, this is just mad. I think this stuff is potentially the biggest crisis
00:50:38.800
to have ever occurred in human history. Yeah. Cause we're destroying children. Yep. Children. And then
00:50:44.860
even the adult brain, what happens when, like, you give your brain to Grok and you outsource your critical
00:50:54.520
thinking. Which, I understand, there are some things that I use Grok or ChatGPT for: give me a recipe
00:51:01.200
based on what I have in my pantry, which I don't think is, you know, bad, because otherwise I'd be Googling it.
00:51:07.120
I try to use it only for things that I would use a regular search engine for, but it can do a lot
00:51:12.920
more than that. It can formulate this email, write this response. What rebuttal would you give to this?
00:51:19.880
And you are handing over, you are sacrificing your rationality, your God given mind to a computer
00:51:28.100
and you are declaring to the world: I am replaceable. And so it actually shocks me when
00:51:34.980
there are people who will brag about, Oh yeah, I use AI to do this. Chat GPT did this. I'm like, okay,
00:51:42.600
at what point are employers going to be like, well, then I'll just pay chat GPT $8 a month. I'm not
00:51:49.100
going to pay you $80,000 a year plus benefits anymore. And that is a very bleak world. That's
00:51:56.620
it. And, you know, I've been affected by my own, you know, use of technology. I don't read as much
00:52:01.760
as I used to. I used to read a lot when I was in high school and I didn't really have the form of
00:52:06.420
technology that I have today. And now spend more time on social media than I do reading. And I'm sure
00:52:12.060
it's affected my memory and my creativity and my ability to write and speak. So I'm speaking from
00:52:16.860
experience here, but especially for kids, like we should want our kids to be set up better than
00:52:22.160
we were. And we were not raised outsourcing our thinking to AI. So I want my kids to be smarter
00:52:27.660
than me, not dumber. Yeah, me too. My kid already is smarter than me and I certainly don't want to
00:52:32.440
put them on the wrong path. Yeah. And I'm dropping a few names here because I want to give people
00:52:38.300
resources, but another one is Dr. Daniel Amen. Have you seen his stuff? Yeah, he's been on the show.
00:52:42.660
Oh yeah. Oh, wonderful. Yeah. I think he's great. Yes. And, and he's talked about this stuff. He was
00:52:47.980
on the Diary of a CEO podcast talking about AI and brain development and how outsourcing your
00:52:53.960
thinking makes you dumber. Yeah. Cause it's like a muscle. If you don't use your muscles, they get
00:52:58.340
smaller. Totally. And that really stuck with me and it makes a lot of sense. And these, yeah,
00:53:04.700
these tools that seem like they'll do stuff for you, but they're not. The thing with AI, so, I mean,
00:53:09.560
shoot in 2025 alone, I probably spent an excessive number of hours using AI, potentially in excess of
00:53:19.640
two to 3,000 hours, um, you know, 12 to 16 hours a day, six days a week, where
00:53:27.500
I have multiple AI terminals open on my computer at all times. Cause I'm, I'm using them to write
00:53:31.940
software and I'm using them to pressure test my thinking and I'm using them to explore what does
00:53:36.920
this technology mean? Cause that's, that's my whole background is exploring and understanding
00:53:41.320
technologies. Like when I was a kid, I was the chief technologist for a $10 billion IT company.
00:53:46.800
Right. And, and my whole mojo is all like, you know, I need to know everything about everything
00:53:51.440
when it comes to tech so I can help provide guidance. And no matter how good these tools are,
00:53:58.240
you know, now we have ChatGPT 5.2 and we have Claude Opus 4.5 and we have whatever version
00:54:03.420
Grok is on right now. They're amazing, but they're still not grounded in truth, honesty, ethics,
00:54:12.400
values, and they're not human. No, they are word generation machines. Yep. They take in the words
00:54:19.860
that you write and they determine, using math, which words those are similar to and what other words
00:54:26.160
most humans connect those words with. Yeah. And then they spit words back that mimic how a human
00:54:32.180
would connect all the words. Yeah. That's it. Yep. And you have to, like, I've argued with it to see
00:54:40.060
if I could get it to say what's true, because I've noticed that it will have even in Grok, like it
00:54:44.840
will have a progressive bent on something, how it says something, the words it uses, how it describes
00:54:49.600
something. But you can point out things: no, that's not true, this happened then. Or, like,
00:54:55.840
say it like this and it will change. So it's not like it's this independent moral,
00:55:02.220
it doesn't have moral agency. It is completely conditional to the input.
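A toy sketch of what "word generation machines" means in practice: score candidate next words, turn the scores into probabilities, and sample one. The tiny vocabulary and scores below are invented for illustration; a real model derives its scores from billions of learned weights over a huge vocabulary, but there is no grounding in truth or moral agency anywhere in the loop, just the math.

```python
# Toy illustration of a "word generation machine": turn scores for candidate
# next words into probabilities and sample one. The vocabulary and scores are
# invented; a real model computes scores from learned weights over a huge
# vocabulary, but the loop is conceptually the same.
import math
import random

def sample_next_word(scores: dict) -> str:
    # Softmax: convert raw scores into a probability distribution.
    exps = {word: math.exp(s) for word, s in scores.items()}
    total = sum(exps.values())
    words = list(exps)
    probs = [exps[w] / total for w in words]
    return random.choices(words, weights=probs)[0]

prompt = "the kid picked up the"
candidate_scores = {"iPad": 2.1, "ball": 1.3, "book": 0.7}  # made-up numbers
print(prompt, sample_next_word(candidate_scores))
```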
00:55:11.360
Next sponsor is Concerned Women for America. So if you've ever wondered, how do I get involved?
00:55:16.000
How do I make sure that I am plugged in to politics on the local and the federal level to make sure that
00:55:21.480
I'm advocating for policy and policymakers that align with my values, that fight for things like
00:55:26.400
freedom of parental choice, especially in education, for the sanctity of life, then you
00:55:32.000
need to get plugged in with Concerned Women for America. They train women to become grassroots
00:55:37.920
leaders, speak into the culture, pray, testify, and lobby. From their Young Women for America
00:55:43.320
collegiate chapters to moms, professionals, and mature women, they're the most influential women's
00:55:48.960
organization in our nation. Donate today to keep them alive, to make sure that their movement stays
00:55:56.640
alive, that they can keep going and working hard. Donate $20 or more, and you'll get a free copy
00:56:02.120
of their new book written by the CEO and President Penny Nance, A Woman's Guide to Seven Rules for
00:56:07.060
Success in Business and in Life. Concernedwomen.org slash Allie for your copy today. Concernedwomen.org slash Allie.
00:56:18.960
Okay. Tell us about your statistics. Oh, the statistics. Yeah, that's right.
00:56:24.640
It was an important aside, but we're getting back to the vulnerability of children when it comes
00:56:30.880
to grooming online, correct? Yes. Yes. And the data that's associated with the size of the problem.
00:56:38.220
I want to help people understand this is not a small problem, and this is not a, like you said,
00:56:44.420
but my kid's smart. That's not the nature of this problem. This is a human problem. This is a human
00:56:50.320
plus internet problem. And so, okay, the statistics. So there's a group called the National Center for
00:56:57.580
Missing and Exploited Children in the US that's funded by Congress. And they coordinate with the
00:57:03.120
FBI, and then they coordinate with a lot of states. Many, many states in the US have an Internet Crimes
00:57:08.780
Against Children task force that's often funded by the attorney general of each state. And so the
00:57:14.000
NCMEC, National Center for Missing and Exploited Children, they have a tip line called 1-800-THE-LOST.
00:57:21.120
And if you are a victim of sextortion, you should call them because they have a collaboration with
00:57:27.020
the big tech companies to help you get your photos taken down. So if somebody's sharing your naked photos,
00:57:32.620
they'll try to help. And they can't get them removed from everything, but they can from some.
00:57:35.960
And so they publish statistics on their website. So I like to look at trends over time. So in 2023,
00:57:43.460
they collected 187,000 reports specifically defined as adults sexually exploiting children on the
00:57:53.240
internet. 187,000. I mean, if you divide that out into how many per day, I don't know what the number is,
00:57:59.700
but it's a lot. In 2024, the number was 546,000. So it went from 187,000 to 546,000. In 2024, 100,000 of those
00:58:11.500
were AI generated. So the kid didn't even send a naked photo. The regular photo of them that's on
00:58:19.600
the internet, because we talked about, should you share photos of your kids on the internet?
00:58:22.700
Well, unfortunately, what happens when you do is it can be used against them. And somebody will take
00:58:29.040
a picture that's just even shoulders up and then generate a naked body and then blackmail them with
00:58:35.660
it. Traumatize them, cause them to commit suicide, really mess them up. In 2025, it was a million.
00:58:44.060
The NCMEC collected a million reports of adults sexually exploiting children on the internet.
00:58:54.020
That is one tip line that probably most people listening have never heard of.
00:59:01.660
A million reports in 2025 from a niche tip line that the government operates.
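To fill in the per-day arithmetic left open above, here is a quick sketch that simply divides each year's total by 365; the yearly figures are the round numbers quoted in the episode, so the per-day numbers are approximations.

```python
# Rough per-day breakdown of the NCMEC report counts quoted above.
reports_per_year = {2023: 187_000, 2024: 546_000, 2025: 1_000_000}

for year, total in reports_per_year.items():
    print(f"{year}: {total:,} reports, roughly {round(total / 365):,} per day")

# 2023: 187,000 reports, roughly 512 per day
# 2024: 546,000 reports, roughly 1,496 per day
# 2025: 1,000,000 reports, roughly 2,740 per day
```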
00:59:11.240
You've talked about how these predators kind of work the justice system and very rarely get brought
00:59:18.560
to justice. What do people need to know about that?
00:59:22.200
What people need to know is that police departments, whether local, state, or federal,
00:59:28.780
have no capacity to manage the volume of the attacks that are taking place.
00:59:34.680
You can call them for help, but, you know, police are always reactive,
00:59:42.860
right? Like they come in after the damage is done. But with a lot of this stuff,
00:59:48.500
the court systems are jammed full. I had a guy called Officer Gomez on my podcast,
00:59:55.820
and he's a police school resource officer in Idaho. And one summer he decided that he would go and try
01:00:04.160
to catch some of these guys in Idaho. And I think he arrested 14 people in one summer.
01:00:10.720
And he said something to the effect of, was it every day or every week? Basically, he could
01:00:15.080
arrest 14 people on a recurring basis because they're out there and easy to find, but the system
01:00:22.260
can't process them. So he can't arrest them. There's nowhere for him to go.
01:00:29.580
Because it can take years to gather all the evidence and to run them through the judicial
01:00:34.060
process. Are you actually guilty? Can we prove that you're guilty? Who do we have to subpoena?
01:00:40.160
Oh, you actually attacked kids in multiple states. Now it's federal. Now we have to go work with all
01:00:45.660
these other states and their stuff and connect it all, you know, so it could take years to process
01:00:49.800
one person. Now, meanwhile, and I don't know this stuff, having not been a police officer
01:00:55.140
myself, maybe they sit in jail while they wait for the process, which might be
01:01:00.160
good if they truly are guilty. I mean, that's its own thing. Right. But so the
01:01:06.860
volume is too high. The system is not built to deal with this stuff. And so we have to be proactive.
01:01:13.520
And, and I think something that's been in the back of my mind during our conversation today
01:01:18.880
is values. I think that underpinning this stuff with personal and family values is a really great
01:01:29.460
way to approach it. Because if you value health, safety, love, connection, time, attention,
01:01:42.020
any of these things, the way that you enforce and defend those values is by saying no,
01:01:51.320
because the process of saying no is the process of establishing boundaries.
01:01:57.920
And when you establish boundaries, you teach your kids how to establish boundaries.
01:02:04.600
And you teach your kids how to defend their values in the process, showing them what those values
01:02:10.980
are truly not in the words that you say, but in the actions that you take and what you say no
01:02:18.340
to. And so if it's no, we're not going to post your photo online. No, we're not going to use social
01:02:24.780
media, us included as adults. We have to lead the way. The answer, the deep answer to all
01:02:33.820
this stuff is values and behavior modeling. We have to do this stuff first. We have to embrace our own
01:02:41.220
addictions, the 5 billion of us that are on social media. And I say that with no intention
01:02:47.720
of shame or judgment, but from a place of understanding and empathy that this is so difficult that we're all
01:02:56.260
trapped in the same trap along with our kids. So how do we show them the way out? We do that by
01:03:03.880
enforcing our values, by saying no. And that demonstrates what's important. And it shows them
01:03:11.700
that once they become adults, they can do that too. Yeah.
01:03:17.680
You know, and it makes me think about women, girls that grow up into women and how beautiful and
01:03:24.200
important it is for girls to know how to say no. Yeah. And to establish their value and to have
01:03:30.540
boundaries. Totally. You know, and that's what I teach my son is how to respect those boundaries and
01:03:36.100
how to respect women and how to respect yourself. And this is a really beautiful opportunity to do that
01:03:44.000
because these problems are so big. These are global, humanity-scale problems that can only be addressed
01:03:53.560
by behavior modeling and values and the skills that come of those things. All the laws, those are rules.
01:04:04.040
Those aren't skills. Yeah. This requires skills. And that comes from parents and from friends and
01:04:09.900
coaches and teachers and community. Yeah. So that's what underlies a lot of this for me.
01:04:15.660
Yeah. You know, not putting our kids on social media, that was something that we decided when I
01:04:20.800
was pregnant with my first seven years ago. And that conviction has only grown stronger since then.
01:04:25.960
In the beginning, people were like, oh, you're going to change your mind. And I have done the opposite of
01:04:31.500
changing my mind. I've just felt more resolute about that. And that was really before AI was doing all of
01:04:36.520
this stuff. There are just so many reasons that I could get into about why we wanted to protect
01:04:41.520
their privacy. But for the people who say, well, I'm going to let them have social media because I
01:04:46.640
want my kid to learn, emphasizing what you're
01:04:50.680
saying: you still teach them the underlying skills of navigating all forms of life
01:04:56.500
that will make them stronger and more discerning whenever they do enter into the social media world
01:05:01.120
when they're grown up. But also not putting our kids on social media has opened up the door
01:05:06.280
to more conversations, I think, than it would have if we did. Because we talk about, you know,
01:05:12.220
they've heard of it. And they see that I have, you know, Instagram. And we've talked about,
01:05:17.940
yeah, here's why we don't post pictures. And here's why you don't do Instagram. And we talk about
01:05:24.460
why that is. And so we've had to talk about, yeah, there are bad guys out there. And we want to
01:05:29.220
protect you. And so you don't have to have social media in order to teach your kids those values.
01:05:36.580
But I want to end on something that you've shared on social media. You've shared five things that now
01:05:42.160
knowing everything you know, you would never let your kids do. So what are those five things?
01:05:50.600
I would never let my kids use social media, anything with an addictive algorithm. Well, "algorithm"
01:05:56.800
is a technical term, so let's say anything with a bottomless feed. So we're all familiar. If you
01:06:01.740
scroll and scroll and scroll, scroll, scroll, scroll, and it never stops. Totally. I've done it.
01:06:05.420
Big red flag. On Instagram. Yep. That's the red flag. That's not for kids. Okay. That's number one.
01:06:10.800
No social media. No online chat. If a system has online chat, you either have to have a way to disable
01:06:18.700
it that's tamper-proof or use something different. Yeah. That includes Roblox, Minecraft,
01:06:23.680
all of those have a chat element. That's right. Many, many games. So then, if you take a
01:06:29.400
whitelist approach, anytime you ask, should we add something to the whitelist, should we approve
01:06:33.040
something, have a look first and see: does it have a bottomless feed and does it have online chat?
01:06:38.320
If it passes those two tests, you're pretty good to go. Number three is never let kids use AI alone.
01:06:46.120
Like I said, 100%. Ideally, they don't use it at all, but never let them use it alone.
01:06:55.640
Number four, I think would be no devices in the private areas of the home.
01:07:03.520
Bedrooms and bathrooms. All the terrible stuff, all the statistics we've been talking about today,
01:07:08.400
a big chunk of those occur between midnight and 2 a.m. in the bedrooms and the bathrooms.
01:07:14.700
And so technologies, especially internet connected technologies should be in common areas of the
01:07:20.040
home. Like your computer was with AOL Instant Messenger, to where there's...
01:07:24.420
Which I still shouldn't have been on, by the way. We didn't know as much in the nineties or the early
01:07:29.580
two thousands, but it just goes to show even that was addictive for me. You know, I was addicted to
01:07:34.520
instant messages. So even with the shared computer, parents have to be vigilant.
01:07:41.960
Yes. Yes, that's right. And I think the fifth thing would be,
01:07:46.660
and it's not even necessarily in this order, but the fifth thing is one of the important things,
01:07:51.700
which is focusing on, well, because you asked what I would never let them do, but I'll flip the fifth
01:07:57.560
one into just something we should do, which is focusing on ourselves. What's our relationship with
01:08:03.300
technology, and how many times a day or how many times a week do our kids experience this? I'm going to
01:08:11.020
use this as a mock-up of my phone, because I left my phone outside. This, where you see the back of my
01:08:17.000
phone and you don't see my face. Right. How many times a day and how many times a week do your kids
01:08:22.220
experience that? Right. So I think focusing on ourselves and the whole thing of, you know,
01:08:28.880
defending and establishing your values by saying no. So I think those are the
01:08:35.640
five things. So good. Okay. Where can people follow you? You not only have social media,
01:08:40.040
but you also write. So how can people find you? Yeah. So I have a website called
01:08:44.380
familyitguy.com, and that links out to all my stuff. I basically want to be where all the parents
01:08:49.980
are. So that's why I'm on social media, which is hilarious because I actually don't use social
01:08:54.460
media otherwise. And luckily one of my friends helps me run my social media because I don't
01:08:59.220
even know the difference between a Reel and a Story and this and that. Like she laughs at me all
01:09:04.200
the time. Cause I have no idea what's going on. I can tell you how the companies run, but not,
01:09:08.940
you know, that stuff. So yeah, familyitguy.com. And then I've got, um, so I post videos and I write
01:09:17.660
articles and I make software. I just released a meditation app that was inspired by Dr. Daniel
01:09:23.780
Amen specifically. He described his ideal breathing routine and I turned it into code.
01:09:29.140
It's called Being, and that's on my website. There's a free version on my website and then
01:09:34.380
you can buy the app on your phone and, um, I'm releasing a book soon and it's going to be called
01:09:39.740
Skills, Not Rules. And it's a guide for parents in the digital age. Cool. What do you need to know?
01:09:46.120
What do you need to do? So yeah, please, you know, go to my website and join my mailing list
01:09:50.760
and follow along. Cool. Very good. Well, thank you so much, Ben. I really learned a lot and I
01:09:55.040
appreciate you taking the time to come on. Thank you very much for having me. I appreciate it as well.