True Patriot Love - March 25, 2026


AI Scams: How Criminals Are Using Your Voice Against You


Episode Stats

Length: 21 minutes

Words per Minute: 168.66

Word Count: 3,566

Sentence Count: 115

Misogynist Sentences: 1

Hate Speech Sentences: 1
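The words-per-minute figure above is just the word count divided by the runtime in minutes. A minimal sketch of that arithmetic (the runtime in seconds is an assumption back-derived from the listed figures, since the page only states a rounded length of 21 minutes):

```python
# Words per minute = word count / runtime in minutes.
# word_count comes from the stats above; runtime_seconds (~21 min 9 s)
# is an assumed value back-derived from the listed WPM, since the page
# rounds the episode length to 21 minutes.
word_count = 3566
runtime_seconds = 1268.6

wpm = word_count / (runtime_seconds / 60)
print(f"{wpm:.2f} words per minute")  # roughly 168.66
```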


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Well, Michelle, we heard from detectives today who talked about how sophisticated this is all
00:00:08.560 becoming because of AI. We'll show you some video. Of course, a lot of people are posting information
00:00:14.760 online on all kinds of different sites, and what criminals are doing is using that information
00:00:20.340 against us. In some cases, they're using AI to replicate a voice, maybe of a family member or of a
00:00:28.080 banking representative, to try to scam you out of thousands of dollars. So what
00:00:33.260 police are saying is people need to be careful, to pause before they hear from
00:00:38.100 anyone who calls claiming to be someone in the family or a bank or a
00:00:43.200 financial institution. And this sophistication is just getting worse, and
00:00:49.320 it's costing millions and millions of dollars in losses to many people. Hi,
00:00:56.400 thanks for joining us. This is TPL Media. I'm Mike, and I read your comments. We all do, and they are
00:01:02.240 inspiration for shows, and sometimes for what fashion we'll choose for the next episode. So keep
00:01:07.760 the comments coming, and go to tplmedia.ca on a daily basis for new content on all kinds of topics.
00:01:14.560 We follow the news, we try to get the experts, and we have a discussion about what's actually
00:01:19.280 going on in the news. That's our mission. Today is no exception to that, as we are at war.
00:01:25.440 This triggers something that often happens during wartime.
00:01:29.840 It causes pressure on economies.
00:01:32.340 That pressure on economies puts pressure on people's earnings,
00:01:35.560 and the scammers begin to arrive.
00:01:38.460 And that's what's going on right now.
00:01:40.380 To talk to me more about that
00:01:41.960 and to discuss some of the modern-day scams going on,
00:01:46.680 Jeremy Grimaldi, how are you?
00:01:48.640 I'm well. How are you?
00:01:49.580 This is the two bald guys with a beard.
00:01:52.380 Let me ask you, do you put color in that?
00:01:54.440 Is that natural?
00:01:55.160 I can't seem to get it to happen.
00:01:57.000 Mine's natural, but it's fading.
00:01:59.340 The dark color is fading quick.
00:02:01.380 Don't get too close.
00:02:02.140 I think it might be contagious.
00:02:03.660 I don't want you to catch it from me.
00:02:05.220 Thanks for joining me about this.
00:02:07.640 Really, I could give a million reasons why we're seeing more scams out there,
00:02:12.680 but I think the biggest reason that we're seeing these scams more prevalent than ever is ease of scam.
00:02:21.420 The technology is there to help.
00:02:23.160 The technology, but also the victims are plentiful, apparently. And it's very lucrative. You know, you can make a few phone calls at the end of the day, you know, scam. You make 100 calls, maybe you could get three people to bite. And in those three people, there's thousands of dollars available, apparently.
00:02:45.860 In my world, money is very valuable, but for some people, the pressure becomes too much and they can part ways with their beloved cash.
00:02:57.560 So it's interesting. Most recently, we saw news about CRA scams going on and apparently CRA has had to block about $2.9 billion in payments people are making directly to CRA, which means that the scammers are probably getting an equal or greater proportion of the money being paid.
00:03:15.860 requested out there as a scam. And so now the police, as we saw just before we started this,
00:03:22.580 are out there telling people: you've got to be more aware, you have to double-check.
00:03:26.740 And add to that the power of AI, kind of driving the forces of evil even further into our
00:03:35.940 pocketbooks. And that's interesting, because it makes perfect sense. Yeah, yeah, they can clone your
00:03:42.900 voice now. So, you know, if you take my voice, for example, now use my voice to
00:03:50.340 call my mother and father and make my voice say that, hey, I'm in jail and I need to be
00:03:55.940 bailed out. Now, Jeremy, the rest is easy. For the record, if you ever get a call from me and you hear
00:04:02.800 me say, Jeremy, I'm in jail and I need to be bailed out, I want you to take it seriously. I will. Between you
00:04:07.660 and I, I know that if I get the call about you, I'm going to have to question it. But you're
00:04:13.080 right. Yeah. Just taking a quick sample of somebody's voice, accessing their call and
00:04:19.320 contact list, and placing calls to the mom and dad, sister, you know, the most frequent calls on your list,
00:04:28.240 and harassing them for money with your voice. Well, that's a frightening era. Yeah. Yeah.
00:04:35.560 If you have a gullible family member or a naive family member,
00:04:39.720 you target everyone, maybe you'll get lucky.
00:04:43.320 It seems like that's the method, and now it's so quick.
00:04:45.660 It used to be, okay, we're on the phone.
00:04:48.760 I've got to get on the phone.
00:04:49.880 I've got to convince this person in the phone call
00:04:51.760 with my personality against their personality.
00:04:55.880 Automating that probably with AI,
00:04:58.440 I don't know how you even combat it.
00:05:02.420 We were just talking off air, you and I, about sales.
00:05:07.180 I used to be in sales.
00:05:08.380 And a lot of high-pressure sales involves dealing with objections.
00:05:16.120 And so it's the same here.
00:05:18.040 There's always going to be an objection as to why you can't send the money.
00:05:21.160 If you can overcome that and get the person to keep talking, then trust is built.
00:05:25.580 And the minute trust is built, you get a problem on your hands.
00:05:28.520 Yeah, it's funny because AI, they're saying now scambots are programmed to handle the input of psychology via the statements that you're making.
00:05:39.220 In other words, what you're typing in falls into psychological profiles that have responses built for those psychological profiles, which means that trust gets built faster with more people simultaneously.
00:05:54.560 Yeah.
00:05:55.200 And then the target groups remain the same, often seniors, people that are not tech savvy, and they take advantage of that.
00:06:05.080 Yeah, it's interesting just thinking about, I mean, this has touched everyone.
00:06:09.460 I'm sure you know people.
00:06:10.780 I know people who have fallen victim to these scams.
00:06:15.240 And I feel, though, that someone, you know, in some of the countries where it's happening,
00:06:23.380 or where it's coming from, should start taking it seriously, because
00:06:29.240 it affects everything. So, you know, my mom was on the phone the other day trying to sell some
00:06:34.660 traveler's checks, trying to recoup money from traveler's checks from ages ago. She was sent to
00:06:40.280 a call person in India, and, you know, immediately the warnings went up, and she
00:06:47.720 started saying, you know, I can't do it, because I just can't trust, you know, that my
00:06:53.060 information... They started asking for a credit card number, right? You know, I can't trust that
00:06:57.600 my information is going to be safe over there. I think your mom made the right choice, frankly. You
00:07:01.040 know, and in fact, that's what law enforcement is saying to people now: don't give
00:07:07.240 your credit card out to anybody. Do it only in person, do it through secure, verified online means.
00:07:14.260 And the other one that the police are warning about is all of the tap-and-pay scams:
00:07:19.840 there's so much RF technology available now to retrieve information off your phone,
00:07:26.620 that tap technology, which everybody seems... Do you use a wallet on your phone? Yeah, I mean,
00:07:32.360 it's always enabled. Yeah. If there's any way around that... Making charges within proximity to people
00:07:39.900 is a real problem, and it's growing. Yeah, it's one of those things: the
00:07:46.640 more and more technologically savvy we all become, the more our lives are totally intertwined
00:07:53.900 with this technology, the more you have to be careful, because there's going to be a time
00:07:59.480 where guys from halfway around the world
00:08:02.320 are going to be able to take control of your vehicle,
00:08:05.960 hack into your vehicle.
00:08:07.240 They're going to hack into your phone.
00:08:08.480 They can mask phone numbers.
00:08:10.340 They can mask voices.
00:08:12.320 The further and further we get down this road,
00:08:15.240 the more and more this is going to be a problem.
00:08:17.420 AI is being taken advantage of, I think,
00:08:19.380 by this element of criminal society very quickly.
00:08:24.340 The voice adaptation is one thing.
00:08:26.340 Another one is false job offers.
00:08:29.480 You know, requiring so much information from you to take part in this job offer that it's easy to access everything in your life and, you know, scam you.
00:08:39.720 I think that's one of the major problems that we're seeing out there online as well is that that process of drawing you in is so convincing now.
00:08:50.420 Yeah.
00:08:50.780 An investment offer can look so convincing.
00:08:53.160 You know, showing charts from NASDAQ about how your stock is doing is so easily emulated now.
00:09:01.260 You know, so for scammers, demonstrating the net result, or the fantasy that they're selling,
00:09:08.960 is so much easier now. It's easy. And, you know, I'm as much to blame as anyone. I keep
00:09:17.300 talking about seniors and elderly people, but it's not just seniors. I have a good friend who's
00:09:25.000 younger than me, so he's probably 42, very intelligent, very with it. He got caught up
00:09:31.760 in a stock scam and spent $5,000. The way it worked was, he spent $1,500 first. They got back in
00:09:40.940 touch with him, trying to get him to invest in this fake stock. And because he was already in
00:09:45.880 for a dollar, he kept re-upping, just because he needed to trust these guys, that they weren't
00:09:52.260 going to take his money. So he got caught in this loop, and he was so
00:09:57.780 embarrassed he didn't want to tell me. He told me over some beers, and he admitted it, and he
00:10:04.380 almost broke down, because, you know, his wife was like, what kind of idiot are you?
00:10:10.400 The negative effects of this aren't what's stolen; it's the psychological damage that it does to
00:10:18.680 trusting yourself. You start to think, don't I have an intellect that would have prevented this? But
00:10:26.400 no, that's not the case. It's the technology, and the practice, and the means by which
00:10:31.640 manipulation is carried out now. So much different when you start hearing...
00:10:38.200 When I hear your voice in my ear saying, hey man, you know, I just wanted you to know that I got
00:10:43.380 a really good tip, Bitcoin's gonna go up in the next couple of days, I bought a bunch of it, all
00:10:48.320 right, give me a call if you get a moment. I might just buy the Bitcoin before I ever take a minute
00:10:55.360 and say, dude, I didn't know that you were into Bitcoin. I'm not. Why? So, you know what I mean? When
00:11:00.580 it comes to anything financial now, you have to be suspicious. I think what your mom did
00:11:05.820 was exactly what needs to be advised to everybody: take that moment when you're asked for the
00:11:11.540 information and really think it through. Yeah, and luckily I was around at the time, so I heard
00:11:18.520 her saying, oh, my credit card information, and, I mean, you know, I stepped in and we sorted
00:11:24.440 it out. But, you know, what else doesn't really get discussed is what it does to trust,
00:11:31.920 which you just mentioned: to people's trust of other people, to people's trust of society, to
00:11:37.560 people's trust in the police, that, you know, should they be scammed, something's going to be done.
00:11:41.780 In, I would guess, 99% of these cases, nothing's done, and nothing can be done. I think that's the
00:11:48.400 problem. How do you react to something that is happening on a server in one country, executed
00:11:54.280 by somebody in another country, and get the authorities in between to operate with you in
00:12:01.140 tandem? I think there have been some scam operations taken down, but those are mainly
00:12:04.960 call centers. There have been some data centers taken down in parts of India and Pakistan, as I
00:12:11.960 read, but not nearly what needs to be done to stop this. When a criminal in another country
00:12:20.340 can represent CRA in Canada effectively enough to move billions of dollars around,
00:12:27.280 then we really do need to have another look at how the law operates on this.
00:12:31.140 Yeah, I would love to, I'd love there to be a partnership.
00:12:35.680 I mean, but even when they're taking down these call centers,
00:12:38.680 what's happening to these people?
00:12:40.080 It's very lucrative.
00:12:41.720 Right.
00:12:42.380 And the sentences, the conviction rates are in favor of the criminals.
So even if they are taken down, all of these people trained
in this sort of industry,
what are they doing next?
00:12:56.900 They're probably going back to it.
00:12:58.540 I've got to ask you.
00:12:59.280 I mean, you write about crime every day and, you know, I hear most recently the stories that you have to cover are often violence related in crime, which is a shame.
00:13:09.520 And that's an ever growing problem here in Canada and many parts of the world.
00:13:13.300 But I have to ask you, how much of the crime writing or crime that you're seeing in your purview now is automated AI and digital?
I'm not seeing a lot of it, I think in part because of what we just talked about: there's a
00:13:33.380 stigma involved with admitting that you've been duped. Yeah. And so finding the people who are
00:13:40.220 willing to speak up is not that easy. That's the first thing. The second thing is that
00:13:47.440 we're all well aware of these scams, and it's all over the media, so people aren't that
00:13:53.780 interested to read about it. So it just doesn't get covered in the same way. But it's a
00:14:00.060 weird dichotomy that we're more aware than ever about these scams, but we're also falling
00:14:06.680 victim to them more and more often. So you kind of wonder what the solution is. I don't really
00:14:14.300 know. Yeah, we can put all kinds of anti-scam laws in place, but if they're still able to penetrate
00:14:18.620 your email, your text messages... which is another one: the rise of online text message scams.
00:14:28.780 You really must be careful out there now, because it's so easy to click and give so much access to
00:14:35.600 your phone. They're unsecure. You need to know your phone. The password on your phone means
00:14:40.220 nothing. The access to your phone should be assumed. So what's on there, how you communicate
00:14:47.320 on there, what you do financially on there, really, you need to understand: you're just carrying an
00:14:53.220 open wallet and an open book to your finances, to your contacts, and even your innermost thoughts,
00:15:00.360 in emails and text messages. I think that's a foregone conclusion now. Yeah, you know, I'm just
thinking, as we chat about it, you know, when someone asks for your credit card details
00:15:14.920 in Canada over the phone, I whip out my card and I give it to them. I give them all the information.
00:15:20.860 I think that that needs to change. There needs to be some secure device or setup that you can input
00:15:28.600 these details into, so that it changes people's mindset: don't just whip out your card, you need some
00:15:34.860 sort of safeguard before you do that. Really, maybe the credit card companies need to be at the
00:15:39.260 forefront of this, saying, okay, credit card issuance is done this way now: every transaction is a
00:15:45.960 different code under your account. Yeah. There is no one ongoing card. There must be some future
00:15:51.700 technology that combats what AI is making so impossible to protect. Well, there already may be,
00:15:58.260 and that's another part of the discussion that doesn't get talked about enough: where are the
00:16:03.260 banks in all this? Yeah, why are our elderly people allowed to send fifteen thousand dollars
00:16:10.460 to someone in India? Why is that possible? There should be limits on their accounts.
00:16:16.440 The children of these people need to set up thousand-dollar limits so they
00:16:22.300 cannot send that money. Yeah, and there need to be alarm systems in place. There are for some
00:16:27.700 scams, but if you're sending $1,500 a month, it may not get detected. But if you're doing $1,500
00:16:35.460 a month to India consistently, yeah, there might need to be an alarm set off where the bank has
00:16:40.540 to step in and ask some questions. I don't know. It's not something I see a lot of, that's for
certain. I think that the only anti-scam system in place... I'll give you an example. I keep getting
00:16:55.040 a charge from, I'm going to name them, Lycamobile. I don't know who you are, I don't know
00:17:00.400 why you're charging me 36 a month out of the UK, but my credit card company is not interested
00:17:07.020 in stopping that in any way, and I have no means of following up with who this is. Already I'm at
00:17:14.420 a stalemate, thinking, okay, I now live a life where I just pay 38 with tax a month to some company
00:17:21.340 in the UK. I need protection, I need assistance at that moment. Yeah, you know what I
00:17:29.380 mean? And really, they're saying, okay, we'll deal with the... I love this: deal with the company
00:17:35.540 that is selling you the product. I don't think they exist. Could you do something?
00:17:42.400 No, you've given them authorization to bill your credit card. I didn't, I'm sorry. So maybe you just
00:17:50.420 cancel your credit card. I've had to do that. Oh, okay. And I will do that again. Lycamobile, whoever
00:17:55.720 you are, it's unlikely that I know you. I shouldn't be laughing, but it is a serious
00:18:02.240 issue. It is laughable. No, it's laughable because it's me. If it was you, it'd be serious. I understand
00:18:06.660 how humanity works. But that's the reality: I'm even accustomed to, like, okay, once in a
00:18:12.740 while I'm gonna have a nefarious bill. No, I've adapted to theft being part of my existence.
00:18:20.000 It is a weird place to be.
00:18:22.520 Yeah, it reminds me of my membership, my gym membership from university.
00:18:26.880 It just kept rolling and rolling until I finally mustered the courage to cancel it.
00:18:33.160 Really, guys?
00:18:34.260 It takes years.
00:18:34.560 Yeah, this has been many years.
00:18:35.980 Can you give me something back?
00:18:37.780 I obviously don't need to lose the weight anymore.
00:18:40.320 Can I?
00:18:42.240 Listen, before we go, let's maybe offer up a little bit of advice again.
00:18:46.020 The things we're seeing, AI emulation is making the scam look more real than ever.
00:18:53.000 You need to take a second look.
00:18:55.100 You need to have, maybe you need to consult a family member or if need be, even your bank
00:19:02.140 to see if what you're about to do is a scam or not.
00:19:05.480 If something happens to you, please report it because yes, you got scammed.
00:19:12.260 It's embarrassing, but it's not just you.
It's millions and millions of people, and unless we all open our mouths and say, I got scammed too,
00:19:20.080 and here's how it's happening, I don't think that much will change. Yeah, and I guess my advice,
00:19:27.540 not that I'm handing out advice all the time on this issue, but my advice is always: if possible,
00:19:34.160 take a tick, like, take a moment. Hang up the phone, call back, give me 10 minutes, you know, anything
00:19:42.400 like that, to sort of get your composure and start thinking about it, and stop the manipulation.
00:19:49.360 Even if it's a legit call, or a caller, or a call center, just take the moment, and maybe,
00:19:57.500 maybe, you can beat the scammers. Solid advice. Coming up, one of our next discussions, I think,
00:20:03.560 is how violence and extortion are on the rise across Ontario and in different parts of Canada. You're
00:20:11.120 seeing it on your beat, and we'll talk more about that next time. Great, I look forward to it. Thanks,
00:20:16.300 Jeremy. And thank you. Don't forget: subscribe, tell a friend. Many thanks to Nick and the gang and
00:20:20.980 everybody who makes this possible every day. If you subscribe, you're helping us keep the wheels
00:20:25.720 on the cart and supporting the network. Thanks, we'll see you next time.
00:20:30.040 Patriotic means looking out for each other and fixing things together.
00:20:41.460 True patriotism is being in a country you love, surrounded by people you love, and great weather.
00:20:47.540 Being a patriot is being a part of your community and caring for it.
00:20:50.480 It doesn't matter who you are or where you're from, patriotism is the one thing we all share.
00:20:55.680 It's okay to be critical of government and still be a patriot.
00:20:59.600 It's gratitude to your country.
00:21:01.600 Of course I'm a patriot. I'm Canadian. It's my home.
00:21:04.600 Well, actually, true patriot love is the mission.