Western Standard - January 21, 2026


HANNAFORD: The silent thief


Episode Stats

Length: 26 minutes
Words per Minute: 163.3
Word Count: 4,372
Sentence Count: 114
Misogynist Sentences: 1
Hate Speech Sentences: 3


Summary


Transcript

00:00:00.000 Good evening, Western Standard viewers, and welcome to Hannaford, a weekly politics show
00:00:21.640 of the Western Standard. It is Thursday, January the 15th. It's a bit of a cliché, I know,
00:00:28.260 but everybody has got their own story about their phone picking up the conversation and
00:00:34.220 then getting unsolicited advertising. Somebody is listening. But that's really surface stuff
00:00:39.840 these days. There's the whole issue now of artificial intelligence and the 6G, never mind
00:00:47.780 the 5G, the 6G surveillance state. There's Elon Musk's deepfake programs. The list goes
00:00:56.020 on and on. If you're the worrying kind, there's a lot on the internet to worry about. With
00:01:03.020 me today is Tim Jordan, an internet security expert with Shield Networks Inc. He will make
00:01:10.460 you worry even more, but he may have some suggestions as to how you can diminish your risk. Tim, welcome
00:01:17.260 to the show. Tim Jordan Thank you very much for having me.
00:01:19.260 Nigel Hannaford Well, it's a great pleasure. Tim, let's just talk a bit about where we're at
00:01:28.260 with all of this. Your business card says you're a cybersecurity apostle. You're obviously having
00:01:40.660 a little bit of fun with us there, but apostles tend to be popularizers of something. What
00:01:47.020 are you popularizing? Tim Jordan Yeah, well, I use that term because I feel
00:01:52.340 sort of like I'm one sent forth, as the term apostle means, to keep people protected and
00:01:59.060 business protected. And it feels like a divine mission, honestly.
00:02:02.660 Nigel Hannaford Really? Well, protected against what?
00:02:05.660 Tim Jordan The bad guys, they're constantly improving, and you've just got to always
00:02:12.180 be on the defense. Nigel Hannaford When we were talking earlier, you mentioned
00:02:17.660 6G. Now, I'm not into this stuff the way you are. This is the first time I've heard of 6G. What does
00:02:26.660 it do that 5G is not going to do? Tim Jordan Yeah, so it's not just about faster internet.
00:02:31.740 It's still maybe a few years away, maybe about five years away from actually being
00:02:36.740 a standard that's finalized, but different nations are jockeying for what it's going to
00:02:42.100 be. But they're talking about, it's more of a sensing network. It means the very fabric
00:02:46.620 of the network is the computer. And that means AI will be the fabric, like it'll be the radio
00:02:52.140 waves coming through being sensed. Nigel Hannaford Well, am I going to think it and it's going to go
00:02:55.740 on the screen? Tim Jordan Yes. Nigel Hannaford Is that what this is?
00:02:58.740 Tim Jordan That's what this is. Nigel Hannaford This could be socially
00:03:01.620 disadvantageous to some people, I think. Tim Jordan Indeed. Nigel Hannaford All right. It's tempting
00:03:08.620 to be frivolous about this kind of thing, but the way you put it, you think it'll go on the screen.
00:03:13.620 Does that mean your wife is going to know what you're thinking? Tim Jordan Yeah. I mean,
00:03:18.620 I hope there's some safeguards there to be able to stop this from being projected. I'm sure there will be.
00:03:24.620 I mean, no one will want to use the system that is uncensored thought from your brain. That would be
00:03:31.620 a disaster, obviously. Nigel Hannaford Okay. So, okay. To be totally serious now, this is something that's in
00:03:39.620 the works. It's going to do things that we don't anticipate. And if you speculate, you sound
00:03:46.620 like either a genius before your time or a total crackpot. But in all seriousness, what does 6G
00:03:57.620 mean from the angle of maintaining the security of your personal information?
00:04:03.620 Tim Jordan Yeah. I guess that's a tough question.
00:04:07.620 Nigel Hannaford And when is it coming in?
00:04:09.620 Tim Jordan Yeah. The standards won't be finalized for a few years. And again, like I said, each nation
00:04:14.620 is jockeying for, like they're doing this in their labs and they're trying to figure out the extent of
00:04:19.620 what it's going to be. Like, it's a sensing network. So the photons and the radiation coming off
00:04:25.620 the antennas will be inside your house, reading your movements and tracking. And so, yeah, from a
00:04:32.620 personal standpoint, we're probably going to need laws and regulations to really stop the overreach.
00:04:39.620 Nigel Hannaford Okay. What could the overreach be? And I appreciate, as you said, that the standards
00:04:45.620 are still being set, but the fact is that they upped the number. You know, it used to be three,
00:04:51.620 and then it was four, and now it's five. Now they're talking about this quantum jump.
00:04:55.620 Tim Jordan Yes.
00:04:57.620 Nigel Hannaford What kind of things could we run into on this? And I'm talking about from the point
00:05:04.620 of preserving your personal information. We were already joking about the fact that
00:05:09.620 the wretched phone listens to your conversations and then starts sending you ads. Well, all right,
00:05:15.620 that's just picking up your voice. What's this going to do?
00:05:20.620 Tim Jordan Yeah. Like you said, it's hard to speculate, and it can do a lot more than
00:05:26.620 they would allow it to do. So they're going to be taming it down. Of course,
00:05:30.620 they'll probably be using it in environments to spy on people. Like, I just wonder about this
00:05:34.620 Venezuela thing there, and how they knew the environment before they went in.
00:05:41.620 They could have been using elements of 6G already for that operation.
00:05:45.620 Nigel Hannaford Really? Do you know that, or are you just speculating?
00:05:48.620 Tim Jordan Yeah, speculating, but I'm just kind of hearing the way they talk, and
00:05:51.620 what I've heard about how 6G is going. It's very interesting.
00:05:55.620 Nigel Hannaford Well, okay. So let's go a little further along this line. You have spoken
00:06:02.620 about the phrase "AI native" to describe the future of the internet. What does that mean?
00:06:14.620 Tim Jordan So everything AI right now, we sort of use it like an app. You pull up your Grok or your ChatGPT app and you ask it questions.
00:06:26.620 But in the future, the very near future, it's not going to be an app anymore.
00:06:31.620 It'll be more like the operating system, or the very fabric of the network. The network will be the AI.
00:06:36.620 So you won't need an app to get this.
00:06:39.620 The AI will be generating it, procedurally generating it for you as you need it. That's where we're going.
00:06:46.620 Nigel Hannaford Okay. Can you give me an example of what that would look like in what today would be a normal office transaction?
00:06:54.620 Okay, you've got to read a memo. You have to make a phone call, talk about it, make a decision, hang up.
00:07:03.620 Now, when it's AI native, what does that sort of transaction look like?
00:07:08.620 Tim Jordan It could look a lot like it doing it for you before: knowing what you need to have done, doing it for you,
00:07:14.620 and just reporting to you later in the day on what it did for you.
00:07:19.620 Nigel Hannaford So: I called Premier Smith and told her that,
00:07:24.620 and she said... and there you go, you're fired. What's the, uh...
00:07:31.620 Tim Jordan Well, yeah, a lot of what's coming now is, you could spawn an AI persona of yourself,
00:07:37.620 you know, knowing everything, and say the premier also does that,
00:07:40.620 and your AIs negotiate, come up with a solution, and inform you of it later.
00:07:47.620 But this is already happening. Like, certain governments are doing this.
00:07:51.620 Nigel Hannaford Really?
00:07:52.620 Tim Jordan Yeah. They're using a lot of AI. I think it was the government of
00:07:57.620 Sweden: the leader of Sweden was criticized for using ChatGPT to make some policy decisions.
00:08:05.620 Nigel Hannaford Yeah, but I mean, it's one thing to call up ChatGPT and say, give me a
00:08:10.620 summary of all the such-and-such, and then use that as your basic research.
00:08:16.620 I mean, I think a lot of people do that now.
00:08:18.620 Tim Jordan Yeah.
00:08:51.620 Nigel Hannaford ...head around this. We've already got, thanks to one of the AI programs, people who have
00:09:00.820 AI companions. You know, this guy's got an always-available girlfriend. She's completely fictitious,
00:09:08.020 but she comes on, there's a picture of her on the screen, and they talk. I guess with a large
00:09:12.960 language model it probably isn't going to be too deep, but at any rate, there she is, and he
00:09:18.460 takes some consolation from that. And presumably there are males for females now.
00:09:24.940 That's not what you're talking about when you're talking about an AI persona, is it? Tim Jordan Not really, yeah.
00:09:32.300 Nigel Hannaford How does the AI persona go beyond what we already know how to do? Tim Jordan Well, I guess it would be like really
00:09:41.020 digging deep into you, digitally cloning you in a sense: putting down what you would say,
00:09:50.240 saying what you would say, doing what you would do, and deciding how you decide. Nigel Hannaford So my AI persona
00:09:55.000 thinks it's time that I talk to somebody else's AI persona,
00:10:01.500 and somehow the two connect. Well, okay, I actually can picture that. But isn't there
00:10:13.420 a problem with the whole idea of the AI language itself? That is, essentially, it doesn't think.
00:10:23.180 Everybody thinks it thinks. What it does, if the large language model, if I'm correctly informed, and you
00:10:29.020 would know better than I if I am, but as I understand it, it has an enormous reservoir of knowledge,
00:10:38.780 but it only knows what's in that reservoir. It's not actually going to be able to think
00:10:46.700 and go outside of that gateway. So the idea that it would think, you know, it's really
00:10:56.700 time that Nigel had a talk with so-and-so about such-and-such a thing, it's not going to do that.
00:11:03.020 It's only going to wait for me to say, I really should talk to so-and-so about such-and-such, and
00:11:09.740 this is what I would want to get out of the conversation. Oh, all right, and then it goes off and
00:11:14.140 it contacts the other persona. Are you suggesting this thing can really, actually think and make
00:11:22.460 forward-looking decisions? Tim Jordan Yeah. So what you're describing there is what all the companies
00:11:26.540 are going after. They call it artificial superintelligence, or AGI, artificial general intelligence. Nigel Hannaford Really?
00:11:32.060 Tim Jordan And they're talking about exactly that: for an AI to be able to think outside of the information
00:11:37.980 given it, and to create. And they're all racing towards it. Nigel Hannaford I mean, are they close? Tim Jordan I would say so. Nigel Hannaford Really?
00:11:45.020 So, all right. Let's talk about Derek Fildebrandt, who's the publisher of the
00:11:54.380 Western Standard. He walks into his office one day. The computer on his desk just knows that he's
00:12:03.980 coming in, because it's been tracking his pickup truck. And it turns itself on and says, good
00:12:11.820 morning, Derek. I had a little chat with so-and-so for you, and the stuff's on the way; we don't
00:12:19.660 want to order anything, you know. Are we looking at those situations? Tim Jordan Absolutely. A hundred percent. Nigel Hannaford Why would we do that?
00:12:28.700 Tim Jordan I guess it just makes your life easier. More,
00:12:33.180 less friction. You know, it knows you're going to want that Egg McMuffin this
00:12:37.820 morning, or something like that. It's studying your patterns. Nigel Hannaford And so, you know, a 50-pound bag of dog
00:12:44.300 food arrives, but your dog died last night. I can see problems with this. Let's get to the
00:12:52.140 serious stuff here. I mean, there are two things I think that people are legitimately concerned about
00:12:58.060 as artificial intelligence invades our lives. First, it is an invasion: we feel like we're losing control.
00:13:06.220 But the most important thing is, we feel that our personal information is at risk. So I would love
00:13:14.940 you to tell us how it's at risk. I'm thinking particularly of banking information, but also
00:13:20.780 all the information you would need to secure personal identity, everything you need when you
00:13:25.260 put in for a driver's licence or anything else like that, or a credit card.
00:13:29.900 What risks are we facing right now from artificial intelligence's ability
00:13:38.940 to scan? Tim Jordan Yeah, I think the biggest threat is impersonation.
00:13:45.820 People can act as you, talk as you, synthesize your voice. Like, we could overlay something over
00:13:51.660 me right now and change me into a different person, a different voice, that I could be talking in real time,
00:13:55.340 moving my hands. I could be acting as you, and I could talk to your wife or whoever and say, please
00:14:01.500 wire me this information, or send us a cheque, or send this information over here. So that's the biggest:
00:14:07.180 it's impersonation. That's what we really have to think about. It's going to be ramping up
00:14:14.460 a lot in the next year or two, and you'll probably be getting calls from people that you think you
00:14:19.420 know, but it's not them. Or video, even video calls. And to verify it, you think, oh, that's them. Well, no,
00:14:25.900 it probably isn't. And they're going to be studying your movements. They'll know that you might rub your
00:14:30.700 eye a lot, or whatever you do. So, impersonation. Nigel Hannaford Well, okay. We already have, I believe, the ability for
00:14:41.340 somebody to fake a voice and claim, you know, grandma, I'm your grandson and I'm in prison in
00:14:47.900 Venezuela, you know, whatever; send money immediately for legal fees. They can do that. Tim Jordan Yeah.
00:14:55.980 Nigel Hannaford And people do fall for it. In what way is this going to take it a step further and make you more vulnerable?
00:15:04.060 Tim Jordan It's just a lot harder to decipher if it's a fake thing trying to get you.
00:15:11.020 You know, since people have fallen for it, it's kind of getting
00:15:16.060 public, so those attacks are easier to protect against. But now you're
00:15:22.620 going to be falling for, well, how do you know if someone's real and if someone's not? Nigel Hannaford Well, I mean,
00:15:30.060 how do you know? Do you have any tips? Tim Jordan I do. For example, I had a client that called me from Banff. He
00:15:37.900 said, oh, I forgot the password. So I just called him back. I said, all right, and I happen to know this
00:15:45.820 person from when we were young, so I said, just to verify, where did you live when you were young? And then
00:15:51.900 he verified some other things that we did when we were young. Something an AI wouldn't know if
00:15:56.780 it's not on the internet. So they didn't know then, but I bet it's listening in, so the next time... Nigel Hannaford That was
00:16:01.580 next time. So you... Tim Jordan Oh yes, exactly. You've got to always change the key. You know, a lot of times they
00:16:07.660 talk about a safe word, where you come up with it beforehand, but then you always have to change the key.
00:16:11.900 And also, another thing: if you do get a call, you just end the call. If they're
00:16:16.540 asking you to do something, you end it and call them back through a different channel. You know, it's
00:16:22.860 rare that the AI or the attackers will be attacking you on multiple fronts. Now, it can happen,
00:16:28.060 but one of the good golden rules is: if you call me asking me to do something, okay, I'll
00:16:33.340 call you back, and make sure that it's you, and then do your other things.
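[Editor's note: Jordan's "always change the key" can be made concrete. Below is a minimal sketch, in Python, of a rotating verification code: two people agree on a secret phrase offline, and each can compute the current six-digit code, which changes every five minutes, so a code a scammer overhears is soon useless. The secret and interval here are illustrative, not something Jordan prescribes.]

import hmac
import hashlib
import struct
import time

SHARED_SECRET = b"agree-on-this-phrase-in-person"  # illustrative; exchange offline

def rotating_code(secret: bytes, interval: int = 300) -> str:
    """Return a six-digit code that changes every `interval` seconds."""
    counter = int(time.time()) // interval  # current time window
    msg = struct.pack(">Q", counter)  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha256).digest()
    offset = digest[-1] & 0x0F  # RFC 4226-style dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{code % 1_000_000:06d}"

# Both parties run the same function; a caller who cannot state the
# current code fails the check, and yesterday's code no longer works.
print(rotating_code(SHARED_SECRET))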
00:16:42.060 Nigel Hannaford Well, I guess, all right, fair enough. That's one thing. What danger is there that
00:16:51.580 sophisticated scammers could get hold of your bank information simply by looking for it?
00:16:59.420 I mean, one of the things we've found here at the Western Standard is you can
00:17:03.820 ask one of the, uh, ChatGPT or Grok or something like that to review all the
00:17:11.260 Hansard for the last five years looking for references to these particular pipelines, or something
00:17:16.860 like that. Well, it chatters away for a couple of minutes and comes back with it, and there's
00:17:23.340 all the information. You've still got to check it, but at least you've saved yourself that step of searching
00:17:30.060 for it. Now, let's not think about Hansard; let's think about your banking information or my banking
00:17:38.460 information. With AI, can they get past the basic bank security and look at your
00:17:49.580 information? Tim Jordan Yeah. So the AIs now can act like a user on a computer, moving the mouse
00:17:56.700 cursor around, clicking on things. So if it has access, say you store your banking
00:18:02.060 information in a notepad on your phone or on your computer, it could potentially go in and read that,
00:18:07.900 plug the information into the site, transfer to contacts that you didn't want to. So absolutely,
00:18:14.060 yep, it could act as a human. Nigel Hannaford Again, how do you protect yourself as best you can?
00:18:25.820 Tim Jordan Yeah. With banking, in those terms, it would be: make sure, if you can, if it's at
00:18:31.740 all possible, don't keep the keys to your account written down, so to speak,
00:18:37.740 in the digital realm, for an AI to watch or to find. If you can, keep it in your head somehow. Or create
00:18:44.300 a series of keys on notes that you can piece together manually if you do forget it,
00:18:52.940 in the analog world. It's just going to take a lot more thinking and a lot more work.
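[Editor's note: Jordan's "series of keys on notes that you piece together" resembles what cryptographers call secret splitting. A minimal sketch, assuming simple XOR shares rather than any scheme Jordan specifies: each note on its own is random noise, and only combining every note reconstructs the password.]

import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_secret(secret: bytes, n: int = 3) -> list[bytes]:
    """Split a secret into n shares; XOR-ing all n together recovers it."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    shares.append(reduce(xor_bytes, shares, secret))  # final share completes the XOR
    return shares

def join_secret(shares: list[bytes]) -> bytes:
    return reduce(xor_bytes, shares)

notes = split_secret(b"correct horse battery staple")
assert join_secret(notes) == b"correct horse battery staple"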
00:18:59.340 Nigel Hannaford Well, it obviously must be possible to protect yourself somewhat, because I haven't read any stories about people losing their
00:19:06.300 Bitcoin fortune because somebody worked out their password. I've heard of people losing their password,
00:19:12.540 forgetting it and being unable to access their Bitcoin account, but I haven't heard of anybody
00:19:19.980 actually saying, what just happened here? Like, deciphering it. Tim Jordan I mean, it would happen that they
00:19:24.860 could steal it if they had it written down, right? But if they don't have it written down digitally,
00:19:29.740 then it would be tough, but not impossible, I suppose. Nigel Hannaford Well, I mean, again, if they're watching, once you've
00:19:36.140 done one transaction, now they know the keystrokes. Tim Jordan Yep. Yeah. And all the data is being collected, and it's
00:19:44.620 never being forgotten. So you keep changing your passwords.
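[Editor's note: On "keep changing your passwords," here is a small sketch of generating a fresh random passphrase at each rotation, using Python's cryptographically secure random source. The eight-word list is a placeholder; a real list, such as the EFF's published 7,776-word diceware list, gives each added word about 12.9 bits of entropy.]

import secrets

# Placeholder word list; substitute a large published list in practice.
WORDS = ["anchor", "bison", "cedar", "dune", "ember", "fjord", "granite", "harbor"]

def new_passphrase(n_words: int = 5) -> str:
    """Pick words uniformly at random and join them with hyphens."""
    return "-".join(secrets.choice(WORDS) for _ in range(n_words))

print(new_passphrase())  # e.g. cedar-dune-anchor-fjord-ember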
00:19:52.460 Nigel Hannaford I don't know whether I like the world we're going into very much at all, Tim. This is brutal stuff. And if we're talking about things that we don't
00:19:58.380 like, talk to us about the deepfake technology that's out there now. It's extremely
00:20:08.860 sophisticated. Mr. Musk has been making the headlines with Grok recently. Apparently there's some
00:20:17.500 software there that allows you to play around with images. We live in a society that's
00:20:26.700 highly image-driven. We don't use words anymore. Well, we do, but you know. Take Facebook, for
00:20:32.380 example, and various other social media. There are pictures of people, and once you have one picture,
00:20:41.660 you can start to manipulate it. Do you think we should stop putting our pictures on Facebook?
00:20:49.340 Tim Jordan Kind of, I do. Like, I don't think it'll be... Nigel Hannaford Let's say that again. You think it's time to stop
00:20:53.980 putting pictures on Facebook? Tim Jordan I would say so. Yeah, because these programs, this AI, can probably be sent
00:21:06.060 to look for pictures of individual people. Then you retrieve the image and you manipulate the image.
00:21:13.340 And in a way, there's no real hiding from it either, because, like, you go out in public and your
00:21:18.380 picture is being taken by all the surveillance cameras all around. But yeah, I think a lot of people
00:21:24.220 are going to start pulling back when they start to see pictures of themselves in various scenarios. And,
00:21:30.300 you know, the bad guys are doing this. There are no guardrails. You know, some of the bigger
00:21:35.020 companies started with the censorship and guardrails, but those wheels are quickly
00:21:39.660 falling off as the private AI guys build their superclusters at home with no guardrails, putting out
00:21:46.540 whatever they want. Nigel Hannaford Okay. And this is why you're here: you're here to straighten me out on
00:21:53.580 things. I got the impression that when you use AI, for whatever purpose, you're dealing with a massive
00:22:02.380 computer setup somewhere else. Like, your little desktop computer is not going to do what you want
00:22:07.900 it to do; you are instructing a computer somewhere else. Now it seems like you're saying that there are
00:22:15.260 some people who have invested, been able to afford to invest, in computers that'll do all kinds of dreadful
00:22:25.020 things. Tim Jordan Yeah, the power, it's just emerging, the power that you can buy
00:22:32.620 now. You can get a supercluster of Apple Mac Studio computers, four of them, and you can put them all
00:22:38.460 together, and you can churn out AI answers and AI processing almost as fast as
00:22:45.580 the big guys now. And this is just going to get... So you have that in your house,
00:22:51.500 doing your bidding, and you can connect it to the internet. You can make it go out and attack certain
00:22:58.220 sites or go after certain people. But on the bright side, too, the good guys can use
00:23:04.300 this too. Like, you know, less censorship is generally a good thing.
00:23:11.340 Well, it's dangerous, obviously, but the good people can also use the same tools to protect.
00:23:20.300 Nigel Hannaford Well, all right. We're good people; we want to protect. Do we have to go out and buy a supercluster
00:23:27.340 to do it? Tim Jordan Some, yeah. I mean, I'm not saying we, you and I, have to do that. Like, right now, to get
00:23:34.860 something like that would be about seventy thousand dollars. Nigel Hannaford Yeah. Tim Jordan Which is not out of reach for some people,
00:23:41.340 especially if you're going to be making money with it. So yeah, I wouldn't say everyone needs
00:23:45.660 to do this, but definitely the good guys need to keep thinking, maybe stay up at night a
00:23:53.340 little bit later, get up a little bit earlier. Nigel Hannaford And then do what? Tim Jordan I guess, strategize. It's
00:24:01.820 always been that way, honestly, in cybersecurity. It's always been a defensive posture:
00:24:07.580 you're reading what's out there, and then you're going, oh boy, how do we protect? So then
00:24:12.860 you come up with plans and you build your defense. Nigel Hannaford Well, you're frightening me, as I think
00:24:20.780 you meant to. I mean, the question is, if the government can't stop these robocallers and AI scammers,
00:24:31.500 what can regular people actually do? What would you... if they came to
00:24:43.100 Shield Networks, which is the name of your company, if somebody came and said, I
00:24:48.060 saw your interview with Hannaford, you got me really rattled, I want you to help me, what could you do for
00:24:55.020 them? Tim Jordan Yeah. So a company, say a business, would hire our company to really take a look
00:25:03.420 at their IT infrastructure, kind of inherit it and manage it, and shore it up with defenses that
00:25:12.860 we have learned and that we're continually learning every day. Like, it's non-stop; it's getting
00:25:18.620 faster and faster. So yeah, people hire us and trust us to keep their systems out of the
00:25:28.060 hands of the bad guys. Nigel Hannaford Okay. Well, that's your free ad, but I wish you well with that,
00:25:36.540 Tim. You know, have you ever heard of flash paper? Tim Jordan Flash paper? Nigel Hannaford Yes. It's a Mafia thing.
00:25:45.820 It's a paper that's been soaked in a highly flammable chemical compound, so the second
00:25:55.660 the police burst through the door, you hit it with your cigarette and it's gone. There's no evidence.
00:26:01.900 I think that's the future of communications. Tim Jordan Interesting. Nigel Hannaford Yes. I send it over in an envelope, and the
00:26:08.060 second you don't need it anymore, it's gone. Tim, this has been great. Thank you very much for
00:26:12.860 coming in. It's, you know, a new world with different challenges. I'm sure we will negotiate
00:26:18.940 them in the end. If people can think of a way to do it, they can probably think of a way to undo it.
00:26:23.900 That has got to be our hope. Tim Jordan A hundred percent, yeah. Nigel Hannaford As we say goodbye, thank you very much. Thanks for coming in.
00:26:30.460 For the Western Standard, I'm Nigel Hannaford.