00:02:23.160The technology, but also the victims, are plentiful, apparently. And it's very lucrative. You know, you can make a few phone calls in a day; you make 100 calls, maybe you get three people to bite. And in those three people, there's thousands of dollars available, apparently.
00:02:45.860In my world, money is very valuable, but for some people, the pressure becomes too much and they can part ways with their beloved cash.
00:02:57.560So it's interesting. Most recently, we saw news about CRA scams going on, and apparently the CRA has had to block about $2.9 billion in payments people are making directly to the CRA, which means the scammers are probably getting an equal or greater share of the money being
00:03:15.860requested out there as scams. And so now the police, as we saw just before we started this,
00:03:22.580are out there telling people: you've got to be more aware, you have to double-check.
00:03:26.740And add to that the power of AI kind of driving the forces of evil even further into our
00:03:35.940pocketbooks. And that's interesting, because it makes perfect sense. Yeah, they can clone your
00:03:42.900voice now. So, you know, take my voice, for example: now use my voice to
00:03:50.340call my mother and father and make my voice say, hey, I'm in jail and I need to be
00:03:55.940bailed out. Now, Jeremy, the rest is easy. For the record, if you ever get a call from me and you hear
00:04:02.800me say, Jeremy, I'm in jail and I need to be bailed out, I want you to take it seriously. I will. Between you
00:04:07.660and me, I know that if I get the call about you, I'm going to have to question it. But you're
00:04:13.080right. Yeah. Just taking a quick sample of somebody's voice, accessing their call and
00:04:19.320contact list, and placing calls to mom and dad, sister, you know, the most frequent calls on your list,
00:04:28.240and harassing them for money with your voice. Well, that's a frightening era. Yeah.
00:04:35.560If you have a gullible family member or a naive family member,
00:04:39.720you target everyone, maybe you'll get lucky.
00:04:43.320It seems like that's the method, and now it's so quick.
00:04:45.660It used to be, okay, we're on the phone.
00:05:18.040There's always going to be an objection as to why you can't send the money.
00:05:21.160If you can overcome that and get the person to keep talking, then trust is built.
00:05:25.580And the minute trust is built, you get a problem on your hands.
00:05:28.520Yeah, it's funny, because with AI, they're saying scam bots are now programmed to handle the psychology of the statements you're making.
00:05:39.220In other words, what you're typing in falls into psychological profiles that have responses built for them, which means that trust gets built faster, with more people, simultaneously.
00:08:29.480You know, requiring so much information from you to take part in this job offer that it's easy to access everything in your life and, you know, scam you.
00:08:39.720I think that's one of the major problems that we're seeing out there online as well is that that process of drawing you in is so convincing now.
00:12:59.280I mean, you write about crime every day and, you know, I hear most recently the stories that you have to cover are often violence related in crime, which is a shame.
00:13:09.520And that's an ever-growing problem here in Canada and in many parts of the world.
00:13:13.300But I have to ask you, how much of the crime writing or crime that you're seeing in your purview now is automated AI and digital?
00:13:25.360I'm not seeing a lot of it, I think in part because of what we just talked about: there's a
00:13:33.380stigma involved with admitting that you've been duped. Yeah. And so finding the people who are
00:13:40.220willing to speak up is not that easy. That's the first thing. The second thing is that
00:13:47.440we're all well aware of these scams, and it's all over the media, so people aren't that
00:13:53.780interested in reading about it. So it just doesn't get covered in the same way. But it's a
00:14:00.060weird dichotomy: we're more aware than ever of these scams, but we're also falling
00:14:06.680victim to them more and more often. So you kind of wonder what the solution is. I don't really
00:14:14.300know. Yeah, we can put all kinds of anti-scam laws in place, but they're still able to penetrate
00:14:18.620your email and your text messages, which is another one: the rise of online text message scams.
00:14:28.780You really must be careful out there now, because it's so easy to click and give so much access to
00:14:35.600your phone. Phones are insecure. You need to know your phone: the password on your phone means
00:14:40.220nothing, and access to your phone should be assumed. So what's on there, how you communicate
00:14:47.320on there, what you do financially on there, really, you need to understand you're carrying an
00:14:53.220open wallet and an open book to your finances, to your contacts, and even your innermost thoughts
00:15:00.360in emails and text messages. I think that's a foregone conclusion now. Yeah, you know, I'm just
00:15:07.880thinking, as we chat about it, you know, when someone asks for your credit card details
00:15:14.920in Canada over the phone, I whip out my card and I give them all the information.
00:15:20.860I think that needs to change. There needs to be some secure device or setup where you can input
00:15:28.600these details, so that it changes people's mindset: don't just whip out your card, you need some
00:15:34.860sort of safeguard before you do that. Really, maybe the credit card companies need to be at the
00:15:39.260forefront of this, saying, okay, credit card issuance is done this way now: every transaction is a
00:15:45.960different code under your account. Yeah, there is no one ongoing card. There must be some future
00:15:51.700technology that combats what AI is making so impossible to protect against. Well, there already may be,
00:15:58.260and that's another part of the discussion that doesn't get talked about enough: where are the
00:16:03.260banks in all this? Yeah, why are our elderly people allowed to send fifteen thousand dollars
00:16:10.460to someone in India? Why is that possible? There should be limits on their accounts.
00:16:16.440The children of these people need to set up, say, a thousand-dollar limit so they
00:16:22.300cannot send that money. Yeah, and there need to be alarm systems in place. There are for some
00:16:27.700scams, but if you're sending $1,500 a month, it may not get detected. But if you're sending $1,500
00:16:35.460a month to India consistently, yeah, there might need to be an alarm set off where the bank has
00:16:40.540to step in and ask some questions. I don't know. It's not something I see a lot of, that's for
00:16:47.760certain. I think that the only scam system in place... I'll give you an example. I keep getting
00:16:55.040a charge from, I'm going to name them, Lyca Mobile. I don't know who you are, I don't know
00:17:00.400why you're charging me 36 a month out of the UK, but my credit card company is not interested
00:17:07.020in stopping that in any way, and I have no means of following up with who this is. Already I'm at
00:17:14.420a stalemate, thinking, okay, I now live a life where I just pay 38 with tax a month to some company
00:17:21.340in the UK. I need protection, I need assistance at that moment. Yeah, you know what I
00:17:29.380mean? And really, they're saying, okay, we'll deal with the... I love this: deal with the company
00:17:35.540that is selling you the product. I don't think they exist. Could you do something?
00:17:42.400No, you've given them authorization to bill your credit card. I didn't, I'm sorry. So maybe you just
00:17:50.420cancel your credit card. I've had to do that. Oh, okay. And I will do that again. Lyca Mobile, whoever
00:17:55.720you are, it's unlikely that I know you. I shouldn't be laughing, but it is a serious
00:18:02.240issue. It is laughable. No, it's laughable because it's me; if it was you, it'd be serious. I understand
00:18:06.660how humanity works. But that's the reality: I'm accustomed to, okay, once in a
00:18:12.740while I'm going to have a nefarious bill. No, I've adapted to theft being part of my existence.