Order of Man - July 15, 2025


DR. ERIC COLE | How Cyberthreats Endanger Men and Their Families


Episode Stats

Length

1 hour and 1 minute

Words per Minute

184.9

Word Count

11,443

Sentence Count

786

Misogynist Sentences

4

Hate Speech Sentences

1


Summary

Dr. Eric Cole has spent decades learning, understanding and teaching people how to protect themselves and their loved ones from cyber threats. Today we talk about the apps that are more dangerous than others, how best to keep your children safe from online predators, why passwords are dead, and why social media will never fully protect our children.


Transcript

00:00:00.000 Considering how often we utilize technology, guys, we just don't spend nearly as much time as we should protecting ourselves and our loved ones against the inherent vulnerabilities and bad actors who would exploit the very tools designed to improve our lives.
00:00:14.720 The fact of the matter is, cybersecurity has become a huge issue and will only become more relevant as technology advances.
00:00:23.720 My guest today, Dr. Eric Cole, has spent decades learning, understanding and teaching people how to protect themselves and their loved ones from cyber threats.
00:00:33.080 Today, we talk about the apps that are more dangerous than others, how best to keep your children safe from online predators, what he calls cyber hygiene, why passwords are dead and what to do about it, and why social media will never fully protect our children.
00:00:48.560 You're a man of action. You live life to the fullest. Embrace your fears and boldly chart your own path.
00:00:54.680 When life knocks you down, you get back up one more time, every time.
00:00:58.740 You are not easily deterred or defeated, rugged, resilient, strong.
00:01:04.200 This is your life. This is who you are. This is who you will become.
00:01:08.400 At the end of the day, and after all is said and done, you can call yourself a man.
00:01:15.980 Gentlemen, welcome to the Order of Man podcast. I am Ryan Michler.
00:01:19.720 We've been doing this for 10 years.
00:01:21.760 Very proud to say that and want to first and foremost thank you for banding with us
00:01:25.560 and thank you for believing in the motto of protect, provide, preside.
00:01:29.780 That is what a man ought to do. It is what a man should be striving towards.
00:01:34.560 There's lots of ways to do it, but ultimately, we are to protect, provide, and preside over ourselves
00:01:39.760 and for our families and the people that we love.
00:01:42.620 So you guys are out there doing it.
00:01:44.080 Make sure you're banded with us on the socials, at Instagram, on Twitter, and also over on Facebook and YouTube.
00:01:50.080 If you'd rather watch this than listen to it.
00:01:53.760 Guys, we've got a good one on cybersecurity.
00:01:56.040 Before we do, I want to talk about another resource that helps make this show possible,
00:02:00.040 and that is my friends over at Montana Knife Company.
00:02:03.580 They're making knives that are 100% made and sourced in America.
00:02:07.700 I've got a few of their tactical knives.
00:02:09.300 I've got their hunting knives.
00:02:11.160 I've got their culinary set, and they've got some new knives in the works as well.
00:02:15.740 So they are blowing up.
00:02:17.420 They're doing big things.
00:02:18.340 They're having a lot of success, and there's a reason.
00:02:20.600 They make great knives, quality knives, and they're all made in America.
00:02:23.940 Check it out over at montanaknifecompany.com.
00:02:27.340 And when you do, use the code ORDEROFMAN.
00:02:30.220 All one word, ORDEROFMAN at checkout.
00:02:32.840 You'll save some money at montanaknifecompany.com.
00:02:36.520 Guys, let me introduce you to Dr. Eric Cole.
00:02:39.180 He is a cybersecurity expert.
00:02:41.200 He's also a former CIA hacker.
00:02:42.960 We talk a little bit about that in the conversation.
00:02:45.360 He's a best-selling author, and he's got over three decades of experience in the field.
00:02:50.200 He's advised Fortune 500 companies, different government agencies, the U.S. military on how to protect infrastructure and all of the digital assets these organizations have.
00:03:00.900 He's also the founder of Secure Anchor Consulting, but he is on a mission to help organizations and individuals stay ahead of these cyber threats in an increasingly connected world.
00:03:13.280 His insight and his energy and his real-world experience make him a leading voice, and you're going to hear that in the conversation today.
00:03:20.180 He's also the author of Cyber Crisis: Protecting Your Business from Real Threats in the Virtual World.
00:03:25.560 Guys, enjoy this one.
00:03:26.880 Take notes, and make sure that you implement what he shares.
00:03:31.340 Eric, thanks for joining me on the podcast today.
00:03:33.240 Glad to have you.
00:03:34.200 Thanks for having me.
00:03:36.200 You bet.
00:03:37.240 It's interesting.
00:03:38.060 I don't know that I've ever had this conversation about specifically cybersecurity, but our motto, specifically as it relates to men, is to protect, provide, and preside.
00:03:47.440 And I think a huge part of that is protecting ourselves and our families from the threats that come in the digital realm, which I believe will probably become more prevalent as we move forward than it's ever been in the history of mankind.
00:04:03.680 You were spot on.
00:04:05.120 It's crazy because what we do to protect our kids in the physical world, we are doing the complete opposite in cyberspace.
00:04:12.520 I mean, I'll give you a quick example.
00:04:14.200 If you're walking down the street with your kids, and some person is taking pictures of your kids or following you or, like, getting real close, you would, like, get in their face and be like, that's not cool.
00:04:25.980 I mean, you'd even get aggressive.
00:04:27.400 But what do we do instead?
00:04:28.860 We take pictures of our kids, and we share them online with everyone.
00:04:33.740 I mean, it's a horrific world on the internet that we live in.
00:04:38.500 But these child predators, they're on Facebook, they're on Instagram, they're watching when you're on vacation, and you're posting pictures of your family and your kids, and they're using that to target and figure out what kids to go after and what kids to abduct.
00:04:52.420 And the crazy thing is we would never, ever tolerate this in the real world.
00:04:56.780 But for some reason in cyberspace, we're allowing ourselves and our family and our kids to be vulnerable.
00:05:02.140 And I agree with you.
00:05:02.840 As men, it's time we've got to step up and protect our families online just like we do in the physical world.
00:05:08.500 Well, it's interesting that we've placed so much trust in the internet, random people, random institutions that are quite literally, I think, preying upon our own personal information.
00:05:19.980 In this case, digital assets like pictures of our family members.
00:05:23.380 And even with social media, I've stopped posting pictures of my children for the past, it's been about almost three years now.
00:05:34.080 And it's pretty scary how many people will put pictures of their young children, their young sons, their young daughters, just to be completely exposed.
00:05:42.460 Is that information in those pictures, are those being exploited?
00:05:46.640 You talked about them being targeted.
00:05:49.520 It seems so out there that I think most men are probably like, oh, it's not that big a deal.
00:05:54.020 Nobody's coming after my kids.
00:05:56.860 The unfortunate reality is the people that don't think they're a target actually are.
00:06:03.140 Because these attackers are smart and they're organized.
00:06:05.900 The reality is, if you're an attacker and you want to abduct, exploit, or cause harm to your kids, your spouse, or your significant other, why would you go after the rich and famous who actually have protection and security measures in place?
00:06:25.380 They're going to go after the most vulnerable.
00:06:26.760 They're going to go after the weakest folks, which unfortunately, and I hate to say this to a lot of men that are listening, but if we're not taking cybersecurity seriously and we're not actually monitoring and tracking what our kids are doing, limiting and controlling the photos, we are, unfortunately, weak online.
00:06:42.760 We may be strong men in the real world, but online we're very, very weak because we're letting our kids do these crazy things.
00:06:49.600 And just another thing is, you would not let some random stranger raise your kids, but go to a restaurant.
00:06:57.520 Most parents, most dads are giving their two- or three-year-old an iPad loaded with all these games for kids, and it's horrific. I'm working to try to get more laws in place to protect against this, but these games have ads.
00:07:12.100 Every 30 to 40 seconds, there's ads playing.
00:07:15.140 Guess what?
00:07:15.880 Those ads are raising your kids.
00:07:17.740 You're not raising your kids anymore.
00:07:18.900 When my kids were growing up, I would spend 30, 40, 50 minutes a day at the dinner table talking to them, engaging with them, raising them, and now what do we do at the dinner table?
00:07:28.900 We give them an iPad and let some crazy random company raise our kids while we have adult conversations.
00:07:35.260 How do you – obviously, I think that's a little bit more – it's not that it's not dangerous.
00:07:45.760 I think there obviously is a danger in that, and I think most men are probably aware that they want to be engaged, but it seems like there's two different conversations happening.
00:07:55.600 So there's the immediate danger of somebody either sexually exploiting our children or abducting our children, and then there's the long-term danger of grooming our children to believe, behave, and engage in ways that are ultimately mentally and emotionally destructive to them in the long term.
00:08:13.200 Am I understanding that correctly?
00:08:15.140 Correct.
00:08:15.740 And just going back to the first point, because I know you were asking details on that.
00:08:20.380 When you're talking about exploiting your kids, it is horrific.
00:08:25.120 The United States of America, we are the strongest military superpower.
00:08:31.440 Do you realize over 60% of all child abductions and kidnappings occur in the U.S.?
00:08:39.780 It's not actually –
00:08:40.580 60%.
00:08:41.660 That's crazy.
00:08:42.560 Well, I've even read statistics of child sex trafficking and how much more prevalent it is currently today than it's ever been in the history of humankind.
00:08:52.740 I don't even know how that's possible.
00:08:54.040 It's because, unfortunately, with the internet and social media and how we operate, where we make all of our accounts public, we're putting our kids and our information out there, and we're raising our kids to believe you can trust anybody online, and unfortunately, that makes these crimes so easy and simple to commit.
00:09:12.560 And here's the unfortunate reality, and this breaks my heart to say it, but after it happens, the probability of you getting your child back is almost zero.
00:09:24.560 But the good news is prevention is very easy.
00:09:28.960 You have to basically focus on two things.
00:09:31.300 One, stop making your social media public.
00:09:34.880 The world should not see your personal life.
00:09:39.040 Have friends, right?
00:09:40.500 Make it private.
00:09:41.240 If you want to share your pictures with 30 friends, go for it.
00:09:44.060 But why are you letting the entire world – why are you letting your family be searchable by all these evil criminals that are out there that are targeting your family?
00:09:52.760 And then the second thing is you have to pay much more attention to who your kids are following online.
00:09:58.680 When I'm growing up, my parents met all my friends.
00:10:02.920 If I was going to go in the car with somebody, they had to come in the house.
00:10:07.020 My parents had to meet them and approve them.
00:10:09.160 My parents knew who my friends were.
00:10:11.080 There were no secrets.
00:10:12.420 My question to you, to the people listening: do you know everybody your kids are following and interacting with on social media, on Snapchat, on all these different apps?
00:10:22.460 Probably not.
00:10:23.720 I sat down – my kids are older, but I sat down with my daughter, and I just asked her, I'm like, how many people do you communicate with on a weekly basis on Snapchat?
00:10:34.320 She said about 85.
00:10:36.580 And I'm a cyber expert, and I sat there, and I could name seven of them.
00:10:42.700 That means there's probably 75 people that I don't know who they are.
00:10:46.880 And here's the crazy part.
00:10:47.920 With AI and all these things out there, you don't know who your kids are following.
00:10:52.380 They think it's another 19-year-old that has similar interests, but it could be some creep that's 40 or 50 years old who is targeting, monitoring, and tracking them.
00:11:02.640 And they are very suspicious and sneaky on how they do this.
00:11:08.640 They basically go in.
00:11:09.800 They'll spend about three, four months to build up this rapport, and then the thing you have to watch out for as a man is if they're trying to meet up with them.
00:11:19.940 So if your kids ever say, oh, I'm meeting somebody I met online, or I'm going to go have coffee, or, oh, they said they're going to be at the game, I want to say hi to them, every red flag should go up.
00:11:30.420 You should start seeing red, and you need to stop and shut that down, because the reality is, if it's a real kid on there, they're rarely going to push to meet up in person.
00:11:40.380 It's these child predators that want to meet up and exploit them.
00:11:44.400 So you need to really track and monitor.
00:11:46.380 If your kids are ever building a close relationship and giving out addresses or these people want to meet up with them, you've got to be super engaged and shut that down.
00:11:54.580 Yeah, I saw a video several months ago of a grown man who showed up at a woman's house, and she had her daughter, and I think the daughter was playing on the trampoline or something, if I remember correctly, and he just showed up at the house as if everything was just okay and comfortable.
00:12:11.260 Fortunately, the mom saw him pull up and questioned him, and he drove off.
00:12:15.840 It seems strange, though.
00:12:18.660 I mean, social media is wild.
00:12:20.340 It's got this crazy algorithm that can pinpoint and dial in your exact preferences.
00:12:27.940 I don't know if phones are listening to you.
00:12:29.520 I'd actually want to ask you that question.
00:12:31.080 It seems like they are.
00:12:33.280 There's a lot of hearsay.
00:12:34.420 There's a lot of questions about that.
00:12:36.040 But I'll tell you what.
00:12:36.800 I could have a conversation about some random thought about going to Disneyland in the summer, and all of a sudden, I get a bunch of ads for Disneyland.
00:12:46.700 So it's hard to believe that it's not listening.
00:12:49.000 But it's so sophisticated, and it can create a very customized feed for you.
00:12:58.900 But why can't social media apps seem to tamp some of this down and stomp a lot of this stuff out with fake accounts and predators and these sorts of things?
00:13:08.180 So the short answer is they could.
00:13:11.940 But the problem right now is the unfortunate reality that these are commercial organizations.
00:13:20.000 They want to make money.
00:13:21.680 And if they shut down those features, it would impact their finances.
00:13:25.780 And because the public is ill-informed and we're not putting pressure on them or our lawmakers, we're allowing them to do that.
00:13:35.240 So they could do it, but they're not because of the money and the billions and billions of dollars they're getting from sales.
00:13:42.980 So this is where we need to sort of start stepping up, take matters into our own hands, and sort of start shutting it down.
00:13:49.260 The good news is you can go into most of these social media apps and turn on high levels of security.
00:13:56.360 The good news is the security measures are there.
00:13:58.660 You can go into your kid's Snapchat or your kid's Instagram and turn on security so that unless it's a verified individual, unless it's somebody who's had an account for more than two years, your kids won't be able to interact with or follow them.
00:14:12.460 So it will restrict who they can interact with.
00:14:15.080 So those features are out there, but we have to turn them on by default.
00:14:19.940 They're not turned on automatically.
00:14:21.220 And that's one of the things I'm pushing: there should be laws that any social media platform that allows kids to operate, which is every one of them, needs to have security turned on and needs to have alerts and mechanisms to protect parents.
00:14:35.140 I mean, it wouldn't be hard to do, but how good would it be if, on social media, your kid is under 18 and all of a sudden a suspicious person starts following them,
00:14:48.500 and you get an alert as a parent on your account and get notified so you can then go in and take action to help them?
00:14:55.380 I mean, that's not hard stuff to do, but the problem is because the internet came on so quick in social media, we forgot about security, we forgot about protection,
00:15:04.600 and it's just setting us up for real danger over the next three to five years if we don't start correcting it.
00:15:10.540 What would make – let's say that feature was available for parents, the suspicious person feature.
00:15:16.380 What would social media be looking for that would make that individual – or that account, I should say, a suspicious account?
00:15:25.660 So a couple of things.
00:15:27.420 One is if it's a new account that doesn't have a lot of followers, because these stalkers and people that are abducting our kids
00:15:38.880 are essentially bouncing around accounts as we find out who they are and try to shut them down.
00:15:45.800 So usually if an account is fairly new, they don't have a lot of followers, and you can go in and look at their pictures.
00:15:55.440 I mean, AI, it's a little harder, but you can sort of tell –
00:15:59.280 And it will be continually more difficult.
00:16:00.900 It's more difficult now, but you can sort of tell what's auto-generated and what's not.
00:16:05.220 You can see a picture that sort of has been fake or doctored or things along those lines.
00:16:10.700 So it's really just going in and just observing, analyzing it.
00:16:15.720 And the issue is it just takes time, and we're all so busy, but there's ways you can do that.
00:16:21.340 I mean, we can go in and look at a picture and run it through an algorithm and tell if it's AI generated.
00:16:27.180 So the social media platforms, they can go in and just do simple checks.
00:16:31.500 Any account that has fewer than 50 followers, any account that's less than six months old, any account where the images look AI-generated,
00:16:40.040 they could also very quickly go in and look at an image and see if it's posted elsewhere
00:16:44.640 because what a lot of these evil criminals do is they're going to go and find photos of other kids,
00:16:52.400 and they're going to pose as them online.
00:16:55.540 So, I mean, these social media engines, they have everything indexed.
00:16:59.700 So them searching and saying, is this an original picture or one that existed before?
00:17:03.860 I mean, there's just easy things they can put in place.
00:17:06.660 But, like, once again, the awareness is not there, and we don't put enough pressure on them to do it.
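The account-level checks Dr. Cole lists, account age, follower count, an AI-looking profile photo, or a photo that already exists elsewhere, could be sketched as a simple flagging rule. This is only an illustrative sketch: the `Account` fields, the thresholds, and the idea that reverse-image and AI-detection results arrive as ready-made booleans are all assumptions, not any platform's real API.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Account:
    created: date               # when the account was registered
    followers: int              # follower count
    image_seen_elsewhere: bool  # hypothetical reverse-image-search hit
    image_looks_generated: bool # hypothetical AI-image-detector verdict

def is_suspicious(account: Account, today: date) -> bool:
    """Flag an account if it trips any of the heuristics described:
    newer than roughly six months, fewer than 50 followers, an
    AI-looking photo, or a photo that already exists elsewhere."""
    too_new = (today - account.created) < timedelta(days=182)
    too_few = account.followers < 50
    return (too_new or too_few
            or account.image_seen_elsewhere
            or account.image_looks_generated)
```

A real platform would combine many more signals with classifiers rather than fixed cutoffs; the point here is just that each check he names is cheap to express.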
00:17:12.140 Yeah, I mean, I can't tell you.
00:17:13.260 Every single week, I get a message from a bunch of people who tell me,
00:17:16.340 hey, here's a cloned account of your actual account.
00:17:19.960 So I know this is happening, and it's not a concern for me.
00:17:26.400 It bothers me, of course, but it's not a concern for my own safety.
00:17:29.920 But when it comes to our kids, it definitely would.
00:17:31.880 I do think you hit on something: it takes time.
00:17:35.660 And unfortunately, the reality is that we are pressed for time.
00:17:41.100 We outsource raising our children to the TV.
00:17:44.100 And in my generation, our generation, to social media.
00:17:49.860 I mean, I've got four kids.
00:17:51.020 My oldest is the only one who has access to social media.
00:17:53.840 My three youngest kids from 14 down do not have access to social media.
00:17:58.020 It's not available on their phones.
00:17:58.960 Good on you, good on you, yep.
00:18:00.480 Yeah, it's crucial.
00:18:01.220 They don't have smartphones.
00:18:03.060 There's no iPhones.
00:18:03.960 That's not how it goes here.
00:18:05.600 And even my oldest has access to social media.
00:18:08.440 But it's on my phone, over there on the counter.
00:18:10.400 It's on my phone and on his mom's phone.
00:18:12.900 So if he wants to get onto Instagram, for example, he can do it.
00:18:16.600 But he has to access it through my phone or through her phone.
00:18:20.120 And that's the only way we allow that to happen as a 17-year-old kid.
00:18:25.400 Ryan, I wish everyone followed your model.
00:18:29.080 I mean, what you're doing is very common sense, very straightforward.
00:18:32.600 But I will tell you, I mean, with friends and family members and going around, just look at restaurants.
00:18:38.020 How many 7-, 8-, 9-year-olds have smartphones, have iPads?
00:18:43.460 The parents aren't following it.
00:18:44.760 So, yeah, I mean, it's a simplicity model.
00:18:47.840 The kids don't need this tech.
00:18:50.100 We've sort of got into this where we treat smartphones and social media like we do oxygen.
00:18:56.120 You need it to survive.
00:18:57.680 And the reality is you don't, and our kids don't.
00:19:00.920 And if you don't believe me, and I give this challenge to everyone listening, go in and take your child, however old they are, take their smartphone.
00:19:11.140 If they have a smartphone, take it away from them for three days.
00:19:16.300 And it will be like you committed the worst crime of the century.
00:19:20.340 Oh, man.
00:19:20.920 We're worried about drugs.
00:19:22.740 These kids are addicted to these smartphones.
00:19:25.100 These smartphones are worse than drugs, and we can't live without them.
00:19:29.520 So what if we actually just went in and started limiting the amount of time our kids had their phone?
00:19:35.240 What if at 6 o'clock in the evening their phone goes in a basket and they can't use it for the rest of the night,
00:19:41.440 and we actually make them be social, do what we used to do?
00:19:44.320 Actually, go outside and throw a football, play freeze tag, play badminton.
00:19:50.660 I mean, a lot of these simple skills of just outside nature interacting, all of a sudden we're living our lives so much online,
00:19:57.440 and the unfortunate reality is it's not secure and protected.
00:20:01.360 And the biggest message is to accept that it could happen.
00:20:05.180 Like you said in the beginning, most men I talk to are like, oh, it can't happen to me.
00:20:10.260 It's not a reality.
00:20:11.420 It's something you watch on TV shows.
00:20:13.260 But I can't tell you how many calls I get on a daily basis now.
00:20:18.280 It used to be monthly.
00:20:19.220 It used to be weekly.
00:20:20.040 Daily basis where kids are either bullied, abducted, or kicked out of college.
00:20:27.440 I mean, just anything that could ruin their life because we have not taken proactive action of limiting and controlling who they interact with.
00:20:36.040 I would agree with everything you said.
00:20:38.000 The only issue I'd take is badminton.
00:20:39.700 I don't know how many kids were playing badminton when they were younger.
00:20:42.720 I don't know.
00:20:43.300 That's maybe a cultural difference between you and me.
00:20:46.480 But whatever.
00:20:47.440 We'll overlook that.
00:20:50.400 I'm a little geeky, Ryan.
00:20:51.780 So like tennis was too intense.
00:20:53.520 I had to go with badminton.
00:20:54.520 But you're probably – how about ping pong?
00:20:56.160 Does that work?
00:20:56.980 Or maybe lacrosse.
00:20:58.420 We'll throw lacrosse in there.
00:20:59.320 Yeah, lacrosse.
00:21:00.540 Yeah, lacrosse for sure.
00:21:01.780 That one resonates a little bit more deeply with me.
00:21:03.660 Look, I hear what you're saying and I agree with what you're saying.
00:21:11.640 It seems like technology is so infused into our lives and I don't want to be a person who is out of touch or unreasonable.
00:21:19.280 Of course, I want to make sure my children and myself and my family are protected.
00:21:23.540 But there are some things to consider that we didn't have access to 25 years ago.
00:21:29.760 So 25 years ago, it wasn't a matter of if we had a phone.
00:21:35.240 People say, well, what would I do if I didn't have a phone?
00:21:37.320 What did you do 25 years ago?
00:21:38.580 I mean, I remember running around the neighborhood on my bike looking for where all my buddy's bikes were on the front yard and that's where they were hanging out.
00:21:45.780 I'd knock on the door and Mrs. Smith would answer the door and invite me in to play with Timmy and Tommy and Billy.
00:21:51.760 It's a little different now.
00:21:53.100 I wish it were more like that.
00:21:54.400 I just don't want to get locked into the nostalgia over confronting and facing the reality that this is a new way of operating because of the technologies that are introduced.
00:22:05.480 So I guess my question is, are there certain apps and programs or functions that are more dangerous than others that we ought to really pay attention to?
00:22:19.340 Men, I'm going to step away from the conversation briefly.
00:22:21.880 As many of you know, years ago, I went through a divorce.
00:22:25.580 And as I talked about the struggles I had and the issues that existed in my marriage, I've talked with more and more men who are going through similar and even worse experiences.
00:22:33.840 Some of these men are struggling so much that they found themselves overly anxious, obviously, but depressed and unfortunately, some of them even suicidal.
00:22:44.500 And that's why I'm launching a program called Divorce, Not Death.
00:22:47.700 Now, divorce is hard, but it doesn't have to be the end.
00:22:50.840 So in this program, we're going to teach men how to handle their finances, how to deal with co-parenting, how to learn to emotionally regulate themselves.
00:22:58.960 And even when the time comes, how to get back into the dating space when they're ready.
00:23:04.240 Now, this program hasn't launched just yet, but it will be soon.
00:23:07.600 So if you go to divorcenotdeath.com, you're going to be able to drop your email and get notified the second that we open this course up.
00:23:16.140 You do not have to handle all of the challenge and the struggle and the BS of divorce alone.
00:23:21.440 It's important for you to have the tools and resources you need.
00:23:24.780 It's also very important for your children to have all the tools and resources that they need in order to navigate through this very difficult and challenging time.
00:23:34.180 And also, banding with men who are in your shoes and those who are on the other side of it all is always a huge help as well.
00:23:41.740 So check it out, divorcenotdeath.com.
00:23:44.020 Unfortunately, this does not, or I should say, hopefully, this program does not apply to you.
00:23:50.700 I don't wish that on anyone.
00:23:52.000 But if you find yourself in the midst of a separation or a divorce, then this might be some information that will help you out.
00:23:58.660 Again, divorcenotdeath.com.
00:24:01.400 You can do that right after the show.
00:24:02.920 For now, let's get back to it with Dr. Cole.
00:24:08.000 Yeah.
00:24:08.780 Yeah.
00:24:09.440 And before I get to that, just jumping back to your point.
00:24:12.080 Now, before I get to one of my big words, understand: I'm a tech guy.
00:24:16.700 I mean, I love tech.
00:24:17.860 I'm an early adopter.
00:24:18.780 I mean, I'm testing and have robots now and sort of going through it.
00:24:22.120 I mean, I'm a big fan of it.
00:24:23.340 But one of my big words is balance.
00:24:25.520 And I think we've just gotten out of balance.
00:24:27.780 So I'm not saying that your kids don't have a cell phone for two months, right, and that's all they do.
00:24:33.420 But just put a better balance in there and in place.
00:24:36.940 Because in terms of security, the thing is, the social media apps are here.
00:24:44.780 Like, you're not going to go in and take TikTok or Snapchat or Instagram away from your kids.
00:24:51.580 I mean, depending on their age, they're going to live with that.
00:24:53.860 But what you can do, as we said, is go in and better monitor, control, and watch what your kids post.
00:25:00.500 Like, I give a lot of presentations at churches and PTA meetings.
00:25:04.520 And I always go in and say, you should be following your kids' social media.
00:25:09.400 And I can't tell you how many parents come up and go, Eric, my kids won't let me follow them.
00:25:15.200 Or I don't want to follow them because, you know, some of those pictures really, like, bother me.
00:25:19.800 And I'm like, did you hear what you just said?
00:25:21.820 Like, I mean, a 12-year-old should not be able to tell you that you can't follow them online.
00:25:29.000 Guess what?
00:25:29.360 You don't get a phone then, right?
00:25:30.840 We've sort of lost control.
00:25:32.740 Our kids are raising us instead of us raising our kids.
00:25:35.640 And if you're watching your kids' photos on social media and you're horrified, that's a problem, right?
00:25:43.640 That's an issue you need to go in and do.
00:25:45.780 So a lot of it is accepting it.
00:25:47.840 The other thing, though, is be careful of a lot of these random free apps because most people don't realize free is not free.
00:25:56.460 When you're downloading a free app, most of them, in order to be free, you have to allow location tracking, access to your cameras, access to your microphones.
00:26:06.300 And when was the last time, and this is an exercise I urge everyone to do, when was the last time you went into your smartphone or your kid's smartphone, went under security settings, and looked at location tracking, at all the apps that are tracking your location?
00:26:20.780 My bet is you're not only going to be shocked and horrified, but I would say probably at least half the apps that are tracking your location, you don't need.
00:26:32.180 You're not using.
00:26:33.420 We've sort of got into this where apps are like candy on Halloween.
00:26:36.720 The more, the better, right?
00:26:38.400 When I was growing up on Halloween, you wanted to get as much candy as possible so you could sort through it.
00:26:42.320 And most people have hundreds of free apps on their smartphone that they haven't used in 45, 60 days.
00:26:49.840 So another thing is, any app you haven't used in 30 days, delete off your device.
00:26:56.200 Just get rid of these.
00:26:57.500 I mean, one of my favorite phrases is deleting an app a day keeps evil away, right?
00:27:02.460 And just sort of start cleaning up and getting rid of a lot of the extra stuff.
00:27:06.800 So it's a lot of what we call sort of cyber hygiene.
00:27:10.000 We have a lot of good physical hygiene, but when it comes to online and cyber, we don't.
00:27:14.620 Just start having hygiene where don't let apps track your location.
00:27:18.120 Don't let apps track your kids' location.
00:27:19.960 Delete apps you haven't used in 45 days and then monitor and follow your kids online and see what they're posting.
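The hygiene routine he describes, review location permissions and delete apps you haven't used in 45 days, amounts to two filters over your installed apps. There's no standard cross-platform API for this (you do it by hand under your phone's privacy settings), so the inventory below is purely illustrative data, and the function just shows the two checks:

```python
from datetime import date, timedelta

# Hypothetical app inventory: name -> (last_used, tracks_location).
# On a real device you'd read this off the privacy settings screen.
apps = {
    "maps":       (date(2025, 7, 14), True),
    "flashlight": (date(2025, 4, 1),  True),
    "free_game":  (date(2025, 5, 2),  False),
}

def hygiene_report(apps, today, stale_days=45):
    """Return (apps unused past the cutoff, apps with location access)."""
    stale = [name for name, (last_used, _) in apps.items()
             if (today - last_used) > timedelta(days=stale_days)]
    location = [name for name, (_, tracks) in apps.items() if tracks]
    return stale, location
```

Run against the sample data, this would tell you to delete the flashlight app and the free game, and to review why both the maps app and the flashlight have location access.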
00:27:26.400 And here's the most important thing.
00:27:28.080 Not only see what they're posting, see the comments that they're getting back.
00:27:32.680 Because if your child posts a picture on Instagram of them on vacation or swimming, and there's like 50, 60 comments.
00:27:43.880 And this is, let's say, a 14-year-old.
00:27:48.300 And one of the comments is, oh, looking good, looking sexy.
00:27:51.000 You should look at who that person is and track that.
00:27:54.640 I mean, that's concerning.
00:27:56.420 The good part for us as men is these attackers, stalkers, leave clues.
00:28:02.520 They're not stealthy.
00:28:03.960 If you actually look, they're going to leave clues.
00:28:06.200 They're going to post creepy things.
00:28:07.900 They're going to post things that would be suspicious to you.
00:28:10.620 But we have to look and recognize and take action.
00:28:13.420 Yeah.
00:28:15.480 You know, when you were talking about these free apps, I've heard the phrase, if the app is free, then you are the product.
00:28:22.940 Boom.
00:28:23.460 And that seems to have held pretty true.
00:28:26.320 They're looking for your location.
00:28:27.700 They're looking to prey on you.
00:28:28.860 They're looking to capitalize.
00:28:30.240 Some of it I don't even think is malicious.
00:28:31.900 I want to be honest about that.
00:28:33.340 But if it is free, then you are the product, not the offering or the game or the thing that you're engaged in online.
00:28:38.940 Yeah.
00:28:39.160 And like you said, you're right.
00:28:43.900 I mean, a lot of these apps, they want to track you so they can give you promotions, so they can give you deals.
00:28:49.880 Like you said, if Starbucks or whatever coffee company you like is tracking your location, I mean, it's nice.
00:28:57.060 If you go to buy a Starbucks, you get a notification.
00:28:58.820 Hey, there's a Starbucks nearby and you have a $3 discount.
00:29:02.060 That's all good.
00:29:03.180 But what we have to remember is all those places where your data is stored, what are they doing to protect or secure your information?
00:29:10.780 What are they doing to lock it down?
00:29:12.360 And the more areas your data is located, the more vulnerable you are.
00:29:15.660 So you always have to go in and weigh it, and people always come at me with cyber work.
00:29:19.540 You're the guy that's going to tell us no.
00:29:21.260 You're going to tell us to shut off our phones, go Amish, and live in Pennsylvania.
00:29:25.140 But my rule is it's simple.
00:29:26.760 I'm never going to tell you no.
00:29:28.660 I'm just going to say ask additional questions.
00:29:31.360 Everyone goes, what's the value and benefit?
00:29:33.680 Oh, the value and benefit is I can get discounts on coffee.
00:29:36.460 Great.
00:29:37.020 But ask yourself another question.
00:29:38.680 What's the risk or exposure?
00:29:40.420 And is the benefit worth the risk?
00:29:42.680 And to me, the benefit's not worth the risk.
00:29:45.480 So I allow very few apps to track my location.
00:29:48.220 I essentially live my life on eight apps.
00:29:51.220 That's it.
00:29:51.940 The apps are locked down.
00:29:53.000 They're secure.
00:29:53.460 They're protected.
00:29:53.980 And I minimize that risk.
00:29:55.180 But you just have to ask yourself, there's always value and benefit.
00:29:58.540 But is the value and benefit worth the risk or exposure to you or your family?
00:30:04.160 Yeah.
00:30:04.700 Well, and there's also things that I think can be problems not just in the digital realm but in the real world.
00:30:10.320 You know, I learned this years ago where – because I do travel quite a bit for work.
00:30:14.520 And I've been a little less deliberate about this recently.
00:30:20.820 I should go back to it.
00:30:22.060 But when I go on a trip or go on vacation, I don't post about that vacation or that trip until I get back.
00:30:32.180 But it seems like not only are you opening and subjecting yourself to dangers in the digital realm, but now we're talking about subjecting yourself in the physical realm.
00:30:40.720 Because if I'm on vacation in Hawaii, for example, and I've got all of my family there and I'm posting in real time on my phone, then every bad actor knows Ryan's not at home.
00:30:51.680 We'll just go ahead and break into his place, take whatever we want, do what we need to do, and be done with it without having to worry about him being around.
00:30:58.880 Exactly.
00:31:00.700 What a lot of people don't realize, and you could check this, is most of us, like you said, before you go on vacation, have posted a picture at your house, right?
00:31:10.060 It might have been of your dog or the kids playing out back or something.
00:31:13.760 Well, all those photos that you take with your smartphones, by default, they have your location in there.
00:31:21.220 So if you go to the metadata, you can find the address.
00:31:23.740 So these attackers, you're right, are going in and they're looking for anybody that's saying, hey, just arrived at Dulles Airport, you know, and heading to Hawaii for two weeks.
00:31:33.500 I'm going to go into your account.
00:31:35.400 I'm going to find a picture that you took a month, two or three months earlier, and I'm going to look up what your home address is, and I'm now going to target you.
00:31:43.180 I'm now going to go after you.
00:31:44.440 There used to be a site, and it was taken down just because of so much public negativity, but there was technically nothing wrong with it.
00:31:53.620 It was actually called pleaserobme.com, and what they would do is scour social media.
00:31:59.960 They'd find houses of people on vacation, and if you put in a zip code, it would give you a map of that area and show you not only every house whose owners were on vacation,
00:32:10.440 but it would pull up the tax records, because that's public information, and tell you the value of the house.
00:32:16.880 So it would show one dollar sign if the house was worth $200,000, two dollar signs if it was $500,000, and so basically anyone could go in and see the value of the house,
00:32:26.560 basically the income of that individual, and whether they're in town or on vacation.
00:32:31.000 And the scary, scary part is, under current law, the United States is one of the few countries that doesn't have unified federal laws on data privacy and cybersecurity,
00:32:44.020 so technically that's not illegal.
00:32:46.160 For me to go in and identify people that are on vacation and how much their houses are worth, believe it or not, in this country, it's not illegal.
00:32:54.060 In Europe and in Australia, it's illegal, but in the United States, because our laws are behind, there's nothing illegal about doing that.
00:33:03.940 Well, I mean, let's be honest.
00:33:05.320 I don't know that our laws are behind.
00:33:07.120 I mean, you can take something like the Patriot Act, for example, and the government's welcome to have whatever information they want from you,
00:33:13.420 and that's not an issue, and everybody else can then have access to it as well.
00:33:19.780 That's one of the frustrating things to me.
00:33:22.580 It seems like, though, when you take a photo, let's say I just mowed the lawn in front of my house and I take a photo of it,
00:33:30.680 that when I transfer it over to Instagram or Facebook, for example, or Twitter or X, some of that metadata would get lost in the transfer.
00:33:42.180 But you're saying it doesn't, is what I'm hearing you say.
00:33:45.500 Right, depending on the settings on your phone and how you take it, in a lot of cases, it's not lost.
00:33:50.280 It could be, and the good news is there's a setting you can change.
00:33:54.560 It's a universal setting.
00:33:55.800 I have it for all my photos where it doesn't track metadata.
00:33:58.960 It doesn't track location.
00:34:00.460 But once again, these features are there, but they're turned off by default.
00:34:06.660 Why aren't these social media vendors turning this on automatically?
00:34:09.620 So the good news is you can take 20 seconds, go into your kids' phones and your phone, and turn off auto-location metadata on all photos, and then you're covered.
00:34:20.420 But you have to take the action.
00:34:21.980 It's not done automatically.
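To make the metadata point concrete: a photo's EXIF block stores GPS coordinates as degree/minute/second values, and converting them to a map-ready decimal coordinate takes only a few lines. A minimal sketch in Python (the coordinate values below are made up for illustration):

```python
from fractions import Fraction

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degree/minute/second values to decimal degrees.

    ref is 'N'/'S' for latitude or 'E'/'W' for longitude; southern and
    western hemispheres are negative by convention.
    """
    value = Fraction(degrees) + Fraction(minutes) / 60 + Fraction(seconds) / 3600
    sign = -1 if ref in ("S", "W") else 1
    return sign * float(value)

# Hypothetical GPS tags pulled from a photo's metadata:
lat = dms_to_decimal(38, 53, 23, "N")   # roughly 38.8897
lon = dms_to_decimal(77, 0, 32, "W")    # negative, western hemisphere
```

On most phones those values are embedded silently in every shot, which is why a single posted photo can be enough to resolve a home address once it leaves the device.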
00:34:24.380 What are some of the other threats that you feel like are real prevalent that we need to address?
00:34:29.580 Obviously, protecting our children is huge.
00:34:31.380 We've spent a bulk of our conversation up to this point talking about that.
00:34:34.600 I imagine access to financial institutions, financial assets is probably something you'd want to be a little bit aware of.
00:34:42.420 Are there other threats that we need to pay attention to?
00:34:45.380 Because my fear is that we could talk about all this stuff until we're blue in the face, and this is what you do for work.
00:34:51.420 And not to diminish or downplay or downgrade what you do for work,
00:34:55.560 but a lot of people just can't focus on all of the threats all the time in any given minute of every single day.
00:35:03.260 Yep.
00:35:05.460 So two things that are pretty simple and would actually stop a large percent of the threats that are harming you.
00:35:14.900 First is passwords are dead.
00:35:18.640 You should not use passwords today.
00:35:20.520 I jokingly say, if you're still using passwords, you might as well wear bell-bottom pants and listen to the Bee Gees, right?
00:35:26.660 Nothing against the Bee Gees.
00:35:28.100 I mean, I'm more of an AC/DC fan.
00:35:30.200 Just saw them in concert.
00:35:31.540 I appreciate that.
00:35:32.600 I can appreciate that about you.
00:35:34.480 Exactly.
00:35:35.460 You're stuck in the 80s.
00:35:36.880 You got to use what they call multi-factor authentication.
00:35:40.880 That's where every time you log in, your phone gets a code and you type it in.
00:35:45.900 And I know what everyone says.
00:35:48.160 It's a pain in the ass.
00:35:49.260 Yeah, it's a pain in the ass.
00:35:50.360 It's an inconvenience.
00:35:51.560 You know what my response is?
00:35:52.940 You know what's a real inconvenience?
00:35:55.140 Getting your bank account wiped out.
00:35:57.120 You know what's a real inconvenience?
00:35:58.600 Getting credit cards written in your name and your credit score destroyed.
00:36:01.780 You know what's a real inconvenience?
00:36:03.160 Getting your identity stolen.
00:36:04.900 So the reality is we have to choose our inconvenience.
00:36:08.660 Do you want a short-term or a long-term inconvenience?
00:36:11.600 You have to move to two-factor.
00:36:13.960 Passwords alone are behind almost every one of these attacks where we're seeing cryptocurrency being wiped out.
00:36:17.940 And I just had this this morning, where a colleague of mine told me about a friend in Dubai who had $4 million in their crypto wallet, protected by just a password, wiped out overnight.
00:36:30.820 Again, can't get it back.
00:36:33.780 The whole idea of crypto is it's –
00:36:36.200 Because you can't track it that way.
00:36:38.240 You can't track it.
00:36:39.280 So you've got to go multi-factor.
00:36:41.240 You've got to accept a little bit of an inconvenience.
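For context on what that code on your phone actually is: most authenticator apps implement TOTP (RFC 6238), which derives a short-lived code from a shared secret and the current time. A minimal sketch using only the Python standard library:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time=None, step=30, digits=6) -> str:
    """Generate a time-based one-time password (RFC 6238, SHA-1 variant).

    The code changes every `step` seconds, so a stolen password alone
    is no longer enough to log in.
    """
    if for_time is None:
        for_time = int(time.time())
    counter = struct.pack(">Q", for_time // step)   # 8-byte big-endian counter
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret; at T=59 seconds the 6-digit code is 287082.
print(totp(b"12345678901234567890", for_time=59))  # prints "287082"
```

The server computes the same value from its own copy of the secret, which is why the code works without any message traveling between your phone and the site.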
00:36:43.600 And then the second one is never, ever, ever, ever, ever, under any circumstances, click on a link.
00:36:51.680 I mean, I'm sure you get them.
00:36:53.540 But if you've gotten the toll booth scams on your phone saying, I mean, we're updating our systems in Florida.
00:37:00.400 And over the last five years, we've noticed that you've been in Florida and had a speeding ticket.
00:37:04.740 And if you don't pay $8.99 within the next 48 hours, we're going to send law enforcement.
00:37:11.100 We're going to go after – I mean, and people panic and they click on the link.
00:37:14.880 And it's all malicious.
00:37:16.680 It's all scams to steal your credit card, steal your information, put malware on your device.
00:37:21.380 So don't click on a link.
00:37:23.540 But, Eric, what if I get a message from my bank?
00:37:25.920 Go to the app.
00:37:27.620 You need to have the app.
00:37:28.100 Yes, go directly to the source.
00:37:30.420 Directly to the source.
00:37:31.400 Go to the app.
00:37:32.460 Don't use links.
00:37:33.260 Don't use websites.
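One reason the no-link rule matters: the text a message displays and the place a link actually goes are two different things, and scam links often bury a trusted-looking name inside a hostname the attacker controls. A small sketch of checking where a link really points (both domains below are invented for illustration):

```python
from urllib.parse import urlsplit

def real_host(url: str) -> str:
    """Return the hostname a URL actually resolves against,
    regardless of what text the message showed."""
    return urlsplit(url).hostname or ""

# Looks bank-ish at a glance, but the controlling domain is evil.example:
host = real_host("https://yourbank.com.evil.example/login")
print(host)  # prints "yourbank.com.evil.example"
```

Going straight to the official app sidesteps the question entirely, which is the point of the advice above.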
00:37:34.260 And then also recognize that the government does not communicate with you via texting and emails.
00:37:44.700 Most people don't realize this.
00:37:45.980 There's a huge scam out there from the IRS.
00:37:49.280 Do you realize the IRS will never communicate with the taxpayer via email?
00:37:54.540 They will send you a letter.
00:37:55.740 Yeah, I mean, I get mail all the time about how much I owe them and how much I forgot and what I owe in interest.
00:38:00.480 But it's never through text or email.
00:38:01.980 It's always paper.
00:38:03.460 Right.
00:38:03.820 So if you get a text or you get an email saying it's from the IRS and you need to go in, it's a scam 100%.
00:38:10.680 So just awareness of how they work and how they operate.
00:38:13.940 One of the other scams we just saw, it started early June, so it's been going for about a month now, is you actually get a phone call.
00:38:23.560 And they spoofed the number of FBI headquarters, so you think you're getting a call from the FBI, and they basically say, listen, you're being criminally investigated for these different crimes, but we need to go in and verify and validate your information.
00:38:45.800 And we can clear your name right now if you do a five-minute interview with us.
00:38:50.660 And what do they say?
00:38:51.760 We just need to verify who you are, so can you give me your date of birth and your last four digits of your social security number?
00:38:58.100 People panic, people freak out, they think they're being arrested by the FBI, and they don't use common sense and they give the info away.
00:39:05.500 And then here's the crazy part, most of the people that fall for that, when they hang up the phone, within five minutes they go, oh, crap, I shouldn't have done that.
00:39:16.620 And then they call me or somebody else and go, what do I do?
00:39:18.900 And the problem is it's too little too late.
00:39:20.560 So just recognize, the FBI is not going to call you.
00:39:24.120 If they're going to arrest you, they're going to come to your house.
00:39:26.280 So just be aware the government and these entities are not going to operate with texting, email, and phone calls.
00:39:34.660 So just be really cautious.
00:39:36.300 And like I said, never, ever, ever click on a link.
00:39:38.760 If you do multi-factor, never click on a link, and never accept phone calls from numbers you don't recognize.
00:39:44.080 That takes away most of the vectors so quick and so fast.
00:39:48.580 What about some of these companies that offer online protection, insurance for your financial assets, the ability to protect and look up credit scores?
00:40:02.000 I think one of them I've heard of is called LifeLock.
00:40:05.600 Are these worth their weight?
00:40:07.320 Are they worth considering?
00:40:08.840 Would something like that be valuable to have in place?
00:40:12.420 So what I always say is they're like airbags.
00:40:17.600 Airbags make the probability of surviving an accident much, much higher.
00:40:23.160 But here's the reality.
00:40:24.240 People die in automobiles with airbags all the time.
00:40:28.220 Airbags are only going to keep you safe if you're a safe driver.
00:40:30.800 So you could have the safest car on the planet.
00:40:34.440 And if you drive 100 miles an hour into a brick wall, you're still going to get injured and hurt.
00:40:39.300 So these services are good as an augmentation, but they can't replace bad behavior.
00:40:46.260 So the thing I always get concerned about with some of these services out there is people go, oh, if I use a service that monitors and protects me, I don't have to worry about security.
00:40:57.360 I can click on links.
00:40:58.280 I can download apps.
00:40:59.800 I can do crazy stuff.
00:41:01.080 And the reality is no.
00:41:02.000 If you don't click on links, you remove apps, you're careful of what you do, then these can provide an additional layer of protection like an airbag, but they can't replace good user behavior.
00:41:14.900 That makes sense.
00:41:16.040 I mean, that makes sense.
00:41:16.680 I think a lot of times we're just looking to outsource any sort of responsibility that we can.
00:41:21.520 And what you're saying is, you know, implement the two-factor authentication.
00:41:26.380 You know, don't click on links.
00:41:28.340 Be smart outside of that.
00:41:30.400 Like even with companies, like sometimes a bank will call and they'll say, hey, what's your – like they'll call me and ask for my information.
00:41:38.340 And my response is, well, you called me.
00:41:39.920 You've got my information.
00:41:41.380 Yep.
00:41:41.700 So if you want to verify it, we can do that a different way or I can call you back.
00:41:46.060 But you're the one that called me.
00:41:47.480 I did not call you.
00:41:49.240 But I think, yeah, people for whatever reason – I guess it's not for whatever reason.
00:41:54.200 We've just developed over the past quarter century now this unreasonable amount of trust in these little devices because we use them every day.
00:42:04.960 So we think they're safe, but they're actually – they're not.
00:42:07.260 They're the antithesis of it, I think.
00:42:10.300 Exactly.
00:42:11.080 And it's one of those things where I joke, but like if we go back 15 years ago or 20 years ago and I told you, hey, Ryan, in the year 2025, we're going to have implanted devices in you that track your location.
00:42:27.460 And we're going to have sort of a camera in your eye that records everything you do and monitors and tracks and listens in on your family and listens in on everything you say.
00:42:36.140 You'd be like, no way.
00:42:37.740 No effing way am I going to allow that.
00:42:39.480 But guess what?
00:42:39.940 That's a cell phone.
00:42:41.100 Like you said earlier, cell phones listen.
00:42:43.880 If you don't believe me –
00:42:44.780 I was going to ask you that.
00:42:45.600 They do?
00:42:46.820 You 100% are saying those cell phones are listening to you.
00:42:50.480 Yeah.
00:42:50.680 So two points.
00:42:52.640 One is when you say, Alexa, what's the weather?
00:42:58.700 Or you go in and say, Alexa, add an alarm to go off at 2 p.m.
00:43:03.160 How can it respond if it's not listening?
00:43:06.180 Yes.
00:43:06.940 Good point.
00:43:07.700 It has to be listening, right?
00:43:09.440 Otherwise it can't respond.
00:43:10.520 And then here's the thing like you said.
00:43:12.780 Do this.
00:43:13.700 I guarantee the results: put your phone on the counter and just talk about a topic.
00:43:21.540 I think you brought up Disneyland.
00:43:23.420 So talk about, hey, Disney, I really want to take the kids to Disney.
00:43:26.720 I'd really love to go to Disney and just make it interesting.
00:43:29.500 It would be so cool to take a Disney cruise where we actually take a cruise from Orlando
00:43:35.060 and then we go to Disney World afterwards.
00:43:36.700 Now, you typically have to do it for about 60 to 90 seconds.
00:43:41.360 But I guarantee if you then go in and you start surfing the web, you'll get ads for Disney.
00:43:47.940 You'll get ads for cruises.
00:43:49.500 You'll get ads for airline rates.
00:43:51.100 And then here's the crazy part.
00:43:52.720 Now, go into Google and type the letter W.
00:43:58.120 You type the letter W, and you know how it starts to autofill for you?
00:44:02.360 Pre-population.
00:44:03.800 Yeah.
00:44:04.120 What is the cost of Disney?
00:44:05.620 What is the cost of a Disney cruise?
00:44:07.160 It automatically goes there.
00:44:08.780 My point is, that's no coincidence.
00:44:10.880 The only possible way that's happening is if it's listening.
00:44:14.600 So, yes, I can tell you firsthand that these devices are listening.
00:44:18.800 I could also tell you because I do expert witness work in these high-profile cases.
00:44:23.900 If you give me access to your phone, like during an investigation, I can go in and pull out all those recordings, all that information.
00:44:32.360 I was actually in a trial last year where, craziest thing, but we convinced a judge to let Alexa testify.
00:44:39.400 I was actually able to pull the records from their phone of Alexa that had all the recordings and all the tracking of what happened, and basically showed what occurred and what crime was committed.
00:44:50.540 And the judge actually allowed us to play that in court and use that as evidence.
00:44:54.620 So, it's a crazy world where your phone could be your greatest ally or your greatest enemy, and it's recording and tracking everything you're doing.
00:45:02.840 So, if you want to have private conversations, technology should not be around you is what you're saying.
00:45:07.480 I will tell you right now, when you go to government facilities and you're going in and having classified discussions, your cell phones stay on the outside.
00:45:20.000 I will also tell you I work with big companies on acquisitions and mergers, and they essentially have rules that when you're going into the corporate conference room,
00:45:30.040 the cell phones go outside in what we call a Faraday box, which basically blocks any electronic communication.
00:45:37.480 So, executives and the government understand this, and they don't allow phones in those locations, but the rest of us, for some reason, do.
00:45:45.760 So, if you want to have sensitive conversations or anything that's concerning, don't have your cell phone nearby.
00:45:51.720 Yeah, I mean, I had the opportunity and privilege to be able to go to Andrew's Air Force Base years ago and actually go tour and see Air Force One.
00:46:02.500 And that was, it was exactly what you're saying.
00:46:05.360 Not only did they say, hand us your phones, so we handed our phones over, they checked to make sure that our phones and our devices were not near us.
00:46:14.620 And part of that was obviously about taking pictures, but I imagine there are a lot of conversations taking place that nobody else, and no tech company, should otherwise have access to.
00:46:26.900 Exactly.
00:46:28.620 Exactly.
00:46:30.660 That is wild.
00:46:33.220 Yeah, it's pretty crazy to think about that.
00:46:36.060 I think we've all kind of inherently maybe understood or thought that was the case, but to actually know that's the case.
00:46:42.000 You were doing some work for the CIA, and I don't know if you still do work for the CIA, but it seems like, based on what I read and understood,
00:46:49.120 part of your role was to act as one of these bad actors and basically hack your way into some of these databases and really try to break things digitally, if maybe there's a better way to say that, but that's what it seems like to me.
00:47:05.780 Yeah.
00:47:06.200 Yeah, we called it professional hacking.
00:47:08.580 So I was a professional hacker for the CIA.
00:47:12.640 And the interesting thing about it is, it was in the 90s.
00:47:16.580 So it was from 1991 to 1996.
00:47:21.220 And to put this in perspective, this is early.
00:47:26.220 The worldwide web was not invented until 92.
00:47:30.560 E-commerce didn't come out until 97.
00:47:33.740 Google wasn't until 97.
00:47:35.440 Amazon wasn't until 1998.
00:47:37.400 And here's the shocking part.
00:47:39.440 Apple phone, the iPhone, the first smartphone, didn't come out until 2008.
00:47:44.100 So like, we're talking early on, but here's the crazy part.
00:47:47.640 2008, really?
00:47:50.060 The vulnerabilities and exposures we're seeing back in the 90s are the same things today.
00:47:55.880 So once again, instead of doom and gloom, the one thing I can tell you is, if it's a system with functionality, I can get in.
00:48:05.180 Because any system that has functionality is not secure.
00:48:08.960 So there's always vulnerabilities and issues.
00:48:10.640 The question is, what do you get access to?
00:48:13.200 So, so a couple simple things is any of your electronic devices, whether it's laptops, cell phones, or iPhones, you got to update.
00:48:21.040 Because when vulnerabilities are found on these devices, the vendors are really good at releasing patches and updates.
00:48:28.360 But if you don't apply it, the attackers are going to break in.
00:48:31.620 Most attackers break into your device by using a known exploit, which means if you just updated your software and updated your system, that would go a long way to protect and secure it.
00:48:43.020 The next thing is, go in, and I know it's a little bit of an inconvenience, but at least once a year, you should rebuild your system.
00:48:53.720 Like back your data up to the cloud, rebuild your system, because the probability that your device has malware on it over the course of 12 months is pretty high.
00:49:02.700 And most of the time when we see information stolen or an account compromised, they've been compromised for one to two years and didn't realize it.
00:49:10.600 So if we just rebuild our devices periodically, that'll also go a long way.
00:49:15.600 And then back to what you said, not only monitoring services, but these endpoint security products, whether it's, I mean, there's so many out there, whether it's Symantec, whether it's Sophos, whether it's CrowdStrike, but these endpoint security products, they're going to run you about $79 for all your devices.
00:49:33.240 You need to install them on every device.
00:49:34.800 So these endpoint security products are not perfect, but they will catch and stop a lot of the malware and a lot of the attacks out there.
00:49:41.940 So any device, whether it's iPad, smartphone, laptop, go in and spend a small amount of money.
00:49:47.540 It's the price of a couple cups of coffee or a couple of beers.
00:49:51.100 It's worth it.
00:49:51.800 But make sure all your devices and all your family devices have endpoint security.
00:49:55.520 What about VPNs?
00:49:58.200 Is that something that you recommend as well?
00:50:00.160 Because the way I understand it is that these internet service providers essentially are collecting your data, your search history, just like we were talking about with these smartphones and tech recording conversations.
00:50:12.780 But VPNs seem to encrypt that data or not even store it, if I understand correctly.
00:50:18.020 Is that something you would suggest?
00:50:20.200 Yeah, huge fan of VPNs.
00:50:21.880 Many of the newer apps that we have today do encrypt at some level.
00:50:30.200 So it's better than it was five years ago.
00:50:33.140 But VPNs are either free or like $4.99.
00:50:37.360 They're super cheap for a year.
00:50:39.620 And they basically run in the background.
00:50:41.960 They don't use up a lot of resources.
00:50:43.500 And not only for home use, it's okay.
00:50:47.820 But the bigger issue is people don't realize airports, coffee shops, hotels, that public Wi-Fi is wide open.
00:50:56.440 They're monitoring, tracking.
00:50:57.920 They're seeing what you do.
00:50:59.220 So if you travel at all and use any public Wi-Fi, you've got to download a VPN and just run it on your device.
00:51:06.080 Do you recommend upgrading technology?
00:51:10.420 So, for example, I think we're on, I don't even know, iPhone, maybe 16 or 17 at this point, for example.
00:51:16.840 My iPhone is, I think it's maybe a 13 or 14.
00:51:21.640 I don't even know what it is.
00:51:22.980 But I've had mine for two or three years.
00:51:24.720 So I'm sure it's outdated by two or three iterations at this point.
00:51:28.640 Is that something that consumers should look at as well?
00:51:31.020 I mean, obviously, look, I tend to believe that I don't need a new phone because it's just extra money, extra cost.
00:51:37.780 There aren't many new features that come with it.
00:51:40.020 But is that something you'd recommend?
00:51:42.700 I do with one caveat.
00:51:44.860 And it's the rule that I follow.
00:51:47.120 It's a 12-month rule.
00:51:49.080 I never use new tech.
00:51:51.320 Now, don't get me wrong.
00:51:52.980 I test out the tech.
00:51:54.600 So a lot of these companies, I'm doing beta testing.
00:51:57.340 So for some of the new smartphones and new iPads, I actually get them before they're publicly released and do security checking, pen testing, and help these companies protect and secure it.
00:52:07.280 But I will never use it in my personal life until it's been out for 12 months.
00:52:11.260 And the reason is tech is rolling out so quick.
00:52:15.660 The way these companies are doing it is they used to do testing in-house.
00:52:19.620 And then what they released to the public was very well tested.
00:52:23.120 Now they do partial testing and they use the public to test it.
00:52:27.360 So most of the vulnerabilities and most of the exposures in this new tech are found within the first 12 months.
00:52:34.160 So if you're using brand new tech, the probability of having exposure points and compromise is super high.
00:52:40.520 I typically wait about a year, year and a half, and then I upgrade.
00:52:44.860 I recommend you don't go more than two years, because the problem is tech that's older than two or three years is a low priority to the vendor.
00:52:52.360 They're not keeping it up to date.
00:52:53.460 They're only keeping up their latest products, which are typically the products that have been out for less than about 24 to 36 months.
00:53:01.240 So you sort of have this window where after about 12, 13 months, you need to upgrade, but don't wait more than two years.
00:53:07.580 And then that's going to put you in a sweet spot of having pretty secure, pretty locked down, and pretty protected tech.
00:53:14.360 That makes sense.
00:53:15.220 I think I've always intuitively looked at it that way with operating systems.
00:53:18.440 When Apple comes out with a new operating system from my Mac, I'm like, I'll wait a few months.
00:53:22.520 I don't wait a year, but I usually wait a few months.
00:53:25.360 But if I'm being honest, the reason is because I don't want any bugs that would hamper my ability to do work.
00:53:30.140 It's not really usually security-minded, if I'm being honest about it.
00:53:35.060 Yeah, wild stuff, wild stuff.
00:53:38.200 So as we move forward into the future, AI is obviously going to be a big part of this.
00:53:44.560 I think technology is going to be integrated into our lives drastically more than it is today,
00:53:51.220 almost to the degree that we can't even separate reality from the digital world.
00:53:56.380 I've heard crazy things like having monitors in your home that you could just put a prompt in
00:54:04.000 and AI will automatically build a movie featuring the actors that you want to have with the theme
00:54:11.180 and the script that you want written.
00:54:13.840 What are some threats that we need to be aware of moving into the future?
00:54:19.040 So the biggest one is recognizing that AI and tech is a tool.
00:54:24.180 It's not a replacement for humans.
00:54:26.380 If we go in and let AI replace us, we're downgrading our intelligence because the reality is
00:54:32.320 you're creating things in your likeness and image, but the creator is always more powerful than the creation
00:54:40.180 unless we give that power away.
00:54:42.480 So when I hear these executives go in and say, oh, AI is going to take your job, right?
00:54:47.520 AI is only going to replace you as a human if we allow it, if we downgrade our intelligence and accept that it will.
00:54:53.860 But here's my reality.
00:54:55.420 AI shouldn't take your job.
00:54:56.920 It should enhance your job.
00:54:58.260 It should make your job better and more efficient.
00:55:00.420 But here's the reality.
00:55:01.900 AI can't have emotions.
00:55:03.940 It can't have feelings.
00:55:05.740 It can't have original thought.
00:55:07.260 It can't have this conversation where we're connecting at a human level, because it's not human.
00:55:13.020 So if we allow ourselves to talk to AI in place of a human, it could actually take over our lives and make us irrelevant.
00:55:22.920 But if we don't accept that and say, hey, I'm going to use AI as a tool but not as a replacement, then we're actually going to be okay.
00:55:29.060 And you're right, the stuff that AI can do, I mean, it shocked me because I sort of knew about it, but I don't play that much in that space as these self-driving cars.
00:55:40.260 My son recently got a Tesla.
00:55:43.920 And I will tell you, Ryan, it was terrifying.
00:55:46.700 He picked me up in Virginia.
00:55:49.480 We drove to Philadelphia.
00:55:51.300 And it auto-drove us the whole way.
00:55:55.360 It changed lanes and navigated on its own.
00:55:57.560 So it's one of those where that's an example where me and my son could have meaningful conversation because he was watching the road, but it took care of most of the normal safety features.
00:56:08.160 So to me, that's a tool that helps us.
00:56:09.960 But now if I go in and I use that self-driving car to drive me without my son, because I don't need him anymore to drive me, then that all of a sudden replaces humans and puts us in a dangerous spot.
00:56:21.680 So we just need to recognize it's a tool.
00:56:23.780 We can't allow it to replace human interaction.
00:56:27.060 Yeah.
00:56:27.680 That is interesting because I thought on the road, when I'm on the road, I might need to make a phone call or send a text or a message.
00:56:34.200 I'm like, man, I just wish this car could drive itself so I could sit in the back seat and do my work.
00:56:38.020 But, yeah, I think it's going to be a hard sell as these things come more and more online to keep ourselves from further isolation.
00:56:51.260 I mean, we already have it.
00:56:52.080 I've heard things like we're the most connected that we've ever been across the globe, but we can't connect with somebody across the dinner table.
00:57:00.900 And that's a real worry that I have with new technology, even though I use it every day and it's wonderful.
00:57:06.100 And it's part of the reason that you and I can have a conversation like this that we would not have been able to have 25, 30 years ago.
00:57:12.860 Yep.
00:57:14.500 Interesting.
00:57:15.400 Well, Eric, this has been fascinating.
00:57:16.900 I know there's a lot more to delve into and you've got a lot of information.
00:57:19.980 You've been in the game for decades at this point.
00:57:21.740 Where do the guys go to connect with you, to learn more about what you're doing?
00:57:25.540 My biggest concern is that there's so much information and so much to be aware of that people, when confronted with too much information, tend to not do anything.
00:57:36.960 And that's one of my biggest concerns for myself and for other people, too.
00:57:39.960 And that's my whole philosophy is, I mean, I'm a tech guy and it's one of those, I mean, I could sit there and give thousands of things, but it's all about small bite-sized things.
00:57:52.100 Like, focus on for the next two weeks, switching all your accounts from passwords to MFA.
00:57:57.200 Okay, then, just as a daily practice, delete one app.
00:58:01.040 Then go in and don't click on any links.
00:58:03.620 So, it's small steps.
00:58:05.460 And what I tell everyone is, you'll spend three hours a night, and I have nothing against it, watching Netflix and binge-watching, but you won't spend 10 minutes a night focusing on protecting you and your kids.
00:58:15.960 So, I'm not saying you need to go in and get a master's degree or spend 20 hours, but what if you just spent 15 minutes a day?
00:58:22.620 Watch one less episode on Netflix and spend 15, 20 minutes on just checking with your kids, checking their phones, checking your devices?
00:58:30.200 It's small little steps that make the big difference.
00:58:33.380 And I know a lot of folks, when they come on different podcasts, they love selling or pushing things.
00:58:38.780 The thing I ask is, I'm on a mission to secure cyberspace.
00:58:42.140 I just want you to follow me and share the message with other people.
00:58:44.760 So, D-R-E-R-I-C-C-O-L-E, Dr. Eric Cole.
00:58:48.500 I'm on Instagram, I'm on YouTube.
00:58:49.920 I have podcasts, I give a lot of free tools, free videos.
00:58:54.760 I'm basically here to help secure cyberspace.
00:58:57.780 So, just follow me and start implementing these practices.
00:59:01.060 Share it with other folks.
00:59:02.240 If you want to read more, I have it behind me.
00:59:04.440 My book, Cyber Crisis, How to Protect Yourself in a Digital World, has a lot of these tips and tricks.
00:59:10.440 It's an easy read, about two hours, so you can do that.
00:59:13.880 And then if for some reason I can help you on a business front,
00:59:17.400 if you go to secure-anchor.com, that's my company website.
00:59:21.440 I'd love to hear from you and engage with you.
00:59:23.960 Awesome.
00:59:24.460 Well, we'll sync it all up.
00:59:25.700 I think this is an important topic, and I think it's one that's going to become more and more relevant
00:59:29.980 as we delve deeper and deeper into the world of technology, which, again, is wonderful.
00:59:34.680 And it sounds like you geek out on this stuff, and I like it too.
00:59:37.280 Probably not to the degree that you do, but we need guys like you who are on the forefront
00:59:42.080 of understanding what is there, what is possible, and what the dangers are.
00:59:46.500 So, thanks for joining me today.
00:59:48.720 Thanks for having me, my friend.
00:59:49.820 It was truly a pleasure.
00:59:52.120 All right, man.
00:59:52.860 There you go.
00:59:53.360 My conversation with Dr. Cole.
00:59:55.140 I know this one is probably not a topic that feels very exciting or thrilling,
01:00:01.720 but as I said earlier, it's our job to protect, provide, and preside, and part of protection
01:00:06.280 is making sure that our families and ourselves and our businesses are protected from every
01:00:11.080 threat that exists, and there are some very real-world threats as it relates to technology
01:00:18.300 and cybersecurity that we need to be aware of and stay out ahead of.
01:00:21.880 So, make sure you connect with Dr. Cole on the Gram, over on X, on Facebook.
01:00:27.020 Take a screenshot.
01:00:28.760 Maybe there's people you know who need to hear about this.
01:00:30.860 Maybe you've been a victim of some of these cyber threats or even family members being
01:00:35.780 victims of cyber bullying.
01:00:37.660 This stuff happens.
01:00:38.420 It's very common, and we need to get out ahead of it.
01:00:41.460 So, check it out.
01:00:42.500 And then also, remember, we've got our Divorce Not Death program coming out very soon.
01:00:47.360 So, make sure you go to divorcenotdeath.com if that applies to you, or if you know a man
01:00:51.760 who is going through a divorce, could be your brother or your father or a cousin or a friend
01:00:56.420 or a colleague or a coworker.
01:00:58.180 We want to get them back on their feet.
01:01:00.400 Divorce doesn't have to be the end.
01:01:01.900 It's hard, but it doesn't have to be the end.
01:01:03.460 So, check that out.
01:01:05.120 DivorceNotDeath.com.
01:01:06.460 All right, guys.
01:01:07.200 We'll be back tomorrow for our Ask Me Anything.
01:01:09.760 Until then, go out there, take action, and become the man you are meant to be.
01:01:13.380 Thank you for listening to the Order of Man podcast.
01:01:19.500 If you're ready to take charge of your life and be more of the man you were meant to be,
01:01:23.480 we invite you to join the order at OrderOfMan.com.