The silent thief
Episode Stats
Words per Minute
158.2
Summary
If you're the worrying kind, there's a lot on the Internet to worry about. With me today is Tim Jordan, an Internet security expert with Shield Networks Inc., who will make you worry even more. He'll talk about artificial intelligence, the 6G surveillance state, Elon Musk's deepfake programs, and more.
Transcript
00:00:00.000
Good evening, Western Standard viewers, and welcome to Hannaford, a weekly politics show
00:00:21.600
of the Western Standard. It is Thursday, January the 15th. It's a bit of a cliche, I know, but
00:00:28.480
everybody has got their own story about their phone picking up the conversation and then getting
00:00:34.740
unsolicited advertising. Somebody is listening. But that's really surface stuff these days. There's a
00:00:41.000
whole issue now of artificial intelligence and the 6G, never mind the 5G, the 6G surveillance state.
00:00:57.620
If you're the worrying kind, there's a lot on the Internet to worry about.
00:01:02.460
With me today is Tim Jordan, an Internet security expert with Shield Networks, Inc.
00:01:09.500
He will make you worry even more, but he may have some suggestions as to how you can diminish your risk.
00:01:21.300
Tim, let's just talk a bit about where we're at with all of this.
00:01:33.220
Your business card says you're a cybersecurity apostle.
00:01:39.400
You're obviously having a little bit of fun with us there,
00:01:42.320
but apostles tend to be popularizers of something.
00:01:49.260
Yeah. Well, I use that term because I feel sort of like I'm one sent forth, as the term apostle
00:01:55.820
means, to keep people protected and businesses protected. And it feels like a divine mission,
00:02:01.980
honestly. Really? Yeah. Well, protected against what?
00:02:05.740
The bad guys. They're constantly improving, and you've just got to always be on the
00:02:12.780
defense, like we said when we were talking earlier. You mentioned 6G. Now, I'm not,
00:02:19.500
I'm not into this stuff the way you are; this is the first time I've heard of 6G. What does it do that
00:02:27.100
5G is not going to do? Yeah, so it's not just about faster internet. And it's still maybe
00:02:33.180
a few years away, maybe about five years away, from actually being a finalized standard, but
00:02:38.140
different nations are jockeying over what it's going to be. But
00:02:42.700
they're talking about it as more of a sensing network. It means the very fabric of the network
00:02:47.180
is the computer, and that means AI will be the fabric. It'll be in the radio waves coming through.
00:02:53.740
So am I going to think it and it's going to go on the screen? Yes. Is that what you're,
00:02:57.340
is that what this is? That's what this is. This could be socially disadvantageous to some
00:03:02.940
people, I think. All right. It is tempting to be frivolous about this kind of thing,
00:03:10.860
but the way you put it, you think it and it'll go on the screen. Does that mean your wife is going to
00:03:16.220
know what you're thinking? Yeah, I mean, I hope there are some safeguards there, like being able
00:03:21.340
to, you know, stop this from being projected. I'm sure there will be. I mean, no one will want to use
00:03:26.300
a system that broadcasts uncensored thought from your brain. That would be a disaster, obviously. But,
00:03:33.900
okay, to be totally serious now, this is something that's in the works.
00:03:40.220
It's going to do things that we don't anticipate. Yeah. And if you speculate, you
00:03:46.700
You sound like either a genius before your time or a total crackpot,
00:03:54.240
but in all seriousness, what does 6G mean from the angle of maintaining
00:04:09.940
Yeah, the standards won't be finalized for a few years.
00:04:13.140
And again, like I said, each nation is jockeying; they're doing this in their labs and trying to figure out the extent of what it's going to be.
00:04:23.040
So the photons and the radiation coming off the antennas will be inside your house, reading your movements and tracking you.
00:04:30.900
And so, yeah, from a personal standpoint, we're probably going to need laws and regulations to really stop the overreach.
00:04:40.980
What could overreach look like? And I appreciate, as you said, that the standards are still being set,
00:04:46.820
but the fact is they've upped the number. You know, it used to be three, and then it was four,
00:04:52.600
then five, and now they're talking about this quantum jump. Yes. 6G. What kind of things could
00:05:01.500
we run into on this? And I'm talking from the point of view of preserving your personal
00:05:06.540
information. We were already joking about the fact that the wretched phone listens
00:05:11.900
to your conversations and then starts sending you ads. Well, all right, that's just picking up your
00:05:18.180
voice. What's this going to do? Yeah, like you said, it's hard to speculate, and it can do a lot more
00:05:26.380
than they would allow it to do, so they're going to be taming it down.
00:05:30.440
Of course, they'll probably be using it in environments to spy on people. Like, I just wonder about this
00:05:35.360
Venezuela thing and how they knew the environment before they went in. They could
00:05:42.540
have been using elements of 6G already for that operation. Really? Yeah. I mean, do you know that,
00:05:48.220
or are you just speculating? Speculating, but just from hearing the way they talk and what I've
00:05:52.700
heard about how 6G is going, it's very interesting. Well, okay. So let's go a little
00:05:59.760
further along this line. You have spoken about the phrase "AI native"
00:06:10.180
to describe the future of the internet. What does that mean?
00:06:17.060
So everything AI right now, we sort of use like an app. You pull up
00:06:22.040
your Grok or your ChatGPT app and you ask it questions. But in the future, the very near
00:06:30.140
future, it's not going to be an app anymore. It'll be more like the operating system,
00:06:33.260
or the very fabric of the network. The network will be the AI. So you won't necessarily go,
00:06:37.920
you won't need an app to get this; the AI will be generating it, procedurally generating it for
00:06:44.120
you as you need it. That's where we're going. Okay, can you give me an example of what that would look
00:06:49.940
like in what today would be a normal office transaction? Okay, you've got to
00:06:56.600
read a memo, you have to make a phone call, talk about it, make a decision, hang up. Now,
00:07:04.720
when it's AI native, what does that sort of transaction look like?
00:07:09.100
It could look a lot like the AI doing it for you beforehand, knowing what you need to have done,
00:07:13.720
doing it for you, and just reporting on what it did later.
00:07:22.240
So I called Premier Smith and told her that and she said,
00:07:29.840
Well, yeah, a lot of what's coming
00:07:34.480
now is, like, you could spawn an AI persona of yourself,
00:07:38.120
knowing everything, and say the premier also does that.
00:07:43.720
come up with a solution and inform you of it later.
00:08:00.700
for using ChatGPT to make some policy decisions.
00:08:09.820
and say, give me a summary of all the such and such.
00:08:24.880
They were just being criticized for using the AI too heavily for policy decisions.
00:08:35.220
Well, yeah, spawning an AI persona of yourself that intimately knows who you are and acts as you,
00:08:40.100
sending it out to negotiate for you, that's something that's real and coming. Okay, well,
00:08:45.940
just to help the poor old listener wrap his or her head around this:
00:08:52.540
we've already got, thanks to one of the AI programs, people with AI companions. You
00:09:03.620
know, this guy's got an always-available girlfriend. She's completely fictitious, but she comes on,
00:09:09.240
there's a picture of her on the screen, and they talk, I guess with a large language model. It
00:09:14.440
probably isn't going to be too deep, but at any rate, there she is, and he takes some consolation
00:09:19.640
from that. And presumably there are males for females. Now, that's not what you're talking about
00:09:28.360
when you're talking about an AI persona, is it? Not really. Yeah. How does the AI persona
00:09:38.780
Well, I guess it would be like really digging deep into you,
00:09:47.800
putting what you would, saying what you would say
00:09:50.880
and doing what you would do and deciding how you would decide.
00:09:53.240
So my AI persona thinks it's time that I talk to somebody else's AI persona
00:10:01.460
and somehow the two connect. Well, okay, I actually can picture that. But isn't there
00:10:13.380
a problem with the whole idea of the AI language model itself? That is, essentially, it doesn't think.
00:10:23.140
Everybody thinks it thinks. What it does is, if the large language model, if I'm correctly informed, and
00:10:28.900
you would know better than I if I am. But as I understand it, it has an enormous reservoir
00:10:35.540
of knowledge, but it only knows what's in that reservoir. It's not actually going to be able to
00:10:45.220
think and go outside of that gateway. So the idea that it would think, you know,
00:10:56.260
it's really time that Nigel had a talk with so-and-so about such and such a thing. It's
00:11:01.780
not going to do that. It's only going to wait for me to say, I really should talk to so-and-so
00:11:08.660
about such and such. And this is what I would want to get out of the conversation. Oh, all right.
00:11:12.900
And then it goes off and it contacts the other persona. Are you suggesting this thing can really
00:11:19.700
actually think and make forward-looking decisions? Yeah. So what you're describing
00:11:24.820
there is what all AI companies are going after. They call it artificial superintelligence, or AGI,
00:11:32.020
And they're talking about exactly that: for an AI to be able to think outside of the information
00:11:37.940
given it and create. And they're all racing towards it.
00:11:44.660
Really? So, all right, let's talk about Derek Fildebrandt, who's the
00:11:52.980
publisher of the Western Standard. He walks into his office one day, and the computer on his desk
00:12:02.900
just knows that he's coming in, because it's been tracking his pickup truck. And yeah,
00:12:10.180
it turns itself on and says, good morning, Derek. I had a little chat with so-and-so for you.
00:12:34.500
It knows you're going to want that Egg McMuffin.
00:12:41.160
So, you know, a 50-pound bag of dog food arrives,
00:12:45.060
but your dog died last night. You know, I can see problems with this. Indeed. Let's get to the
00:12:52.280
serious stuff here. I mean, there are two things, I think, that people are legitimately concerned about
00:12:57.780
as artificial intelligence invades our lives. First, it is an invasion; we feel like we're losing
00:13:04.820
control. But the most important thing is, we feel that our personal information is at risk. So I
00:13:14.280
would love you to tell us how it's at risk, thinking particularly of banking information,
00:13:20.440
but also all the information you would need to secure personal identity, everything you
00:13:24.560
need when you put in for a driver's license or anything else like that, or a credit card.
00:13:32.920
What risks are we facing right now from artificial intelligence's ability to scan?
00:13:39.840
Yeah, I think the biggest threat is impersonation. People can act as you, talk as you,
00:13:48.000
synthesize your voice. Like, we could overlay something over me right now and change
00:13:52.800
me into a different person with a different voice, and I could be talking in real time, moving my hands.
00:13:56.140
I could be acting as you, and I could talk to your wife or whoever and say, please wire me this,
00:14:02.180
or send us a check, or this information over here. And so that's the biggest: it's
00:14:07.520
impersonation. That's what we really have to think about. That's going to be ramping
00:14:14.280
up a lot in the next year or two, and you'll probably be getting calls from people that you think
00:14:19.420
you know, but it's not them. Or video, even video calls. And to verify it, you think, oh, that's them.
00:14:25.220
Well, no, it probably isn't. And they're going to be studying your movements; they'll know that you might
00:14:29.820
rub your eye a lot, or whatever you do. So, impersonation.
00:14:34.920
Well, okay. We already have, I believe, the ability for somebody to fake a voice and claim to
00:14:43.340
be, you know, your grandson. I'm your grandson, and I'm in prison in Venezuela, you know, whatever. Yeah.
00:14:50.560
Send money immediately for legal fees. They can do that. Yeah. And people do fall for it.
00:14:57.420
in what way is this going to take it a step further and make you more vulnerable?
00:15:03.980
It's just going to be a lot harder to decipher if it's a fake,
00:15:08.940
like something trying to get you to do something. Since, you know,
00:15:13.520
people have fallen for it, it's kind of getting public, so
00:15:20.640
they're easier to protect against. But now you're going to be falling for,
00:15:23.660
you know, how do you know if someone's real or not? Well, I mean, how do you know?
00:15:30.880
Do you have any tips? I do. For example, I had a client that called me from Banff. He said, oh, I
00:15:38.600
forgot the password. And I just called him back and said, all right. And I happen to know this person from
00:15:46.540
young, so I said, just to verify, where did you live when you were young? And then
00:15:51.840
he verified some other things that we did when we were young. So, something an AI wouldn't know
00:15:56.640
if it's not on the internet. So they didn't know then, but I bet it's listening, and it knows next
00:16:01.120
time. And that was next time. Oh, yes, exactly. You've got to always change the key.
00:16:06.000
Yeah. A lot of times they talk about a safe word where you come up with it before, but then you
00:16:10.800
always have to change the key. Right. And another thing: if you did get a call, you
00:16:15.200
end the call. If they're asking you to do something, you end it and call them back
00:16:19.040
through a different channel. You know, it's rare that the AI or the attackers will be
00:16:25.520
attacking you on multiple fronts. Now, it can happen, but one of the good golden rules is,
00:16:31.040
if you call me asking me to do something, okay, I'll call you back, and make sure that it's you, and
00:16:37.120
then do your other things. Yeah. Okay, well, fair enough. That's one thing. Yeah.
00:16:44.560
What danger is there that sophisticated scammers could get hold
00:16:54.380
of your bank information simply by looking for it?
00:16:59.100
I mean, one of the things we've found here at the Western Standard is that you can ask one
00:17:04.460
of these, ChatGPT or Grok or something like that, to review all the Hansard for
00:17:11.940
the last five years looking for references to particular pipelines or something like that.
00:17:17.440
Well, it chatters away for a couple of minutes, comes back with it, and there's all
00:17:23.600
the information. You've still got to check it, but at least you've saved yourself that step of searching
00:17:29.960
for it. Now, let's not think about Hansard; let's think about your banking information or my
00:17:38.040
banking information. Can they, with AI, get past the basic bank security?
00:17:51.560
So the AIs now can act as a user on a computer, moving the mouse cursor around, clicking on things.
00:17:59.080
So if it has access to, say, you store your banking information in a notepad on your phone
00:18:04.840
or on your computer, it could potentially go in and read that, plug the information into the site,
00:18:09.240
transfer to contacts that you didn't want to. So absolutely, yep, it could act as a human.
00:18:15.380
Again, how do you protect yourself as best you can?
00:18:23.620
Yeah, with banking and those terms, it would be: make sure, if you're at all
00:18:31.940
able, you don't keep the keys to your account written down, so to speak, in the digital realm for
00:18:38.960
an AI to watch or to find. If you can, keep it in your head somehow, or create a series of keys
00:18:46.420
on notes so that, if you do forget it, you can piece it together manually, in the
00:18:53.380
analog world. It's just going to take a lot more thinking and a lot more... Well, it obviously must
00:18:59.160
be possible to protect yourself somewhat, because I haven't read any stories about people
00:19:03.560
losing their Bitcoin fortune because somebody worked out their password. I've heard of people
00:19:11.460
losing their password, forgetting it, and being unable to access their Bitcoin account. But
00:19:18.580
I haven't heard of anybody actually saying, what just happened here? Like, deciphering it. I mean, it
00:19:24.280
could happen that they'd steal it if they had it written down, right. But if they don't have it...
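Jordan's suggestion of "a series of keys on notes" that you piece together has a rigorous version: secret splitting. A minimal sketch using XOR one-time pads, where no single note reveals anything on its own (the function names and passphrase are illustrative; for recovery from a subset of notes you would want Shamir's scheme instead):

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_secret(secret: bytes, n_shares: int = 3) -> list[bytes]:
    """Split a secret into n shares; all n are needed to rebuild it.
    The first n-1 shares are random one-time pads, and the last is the
    secret XORed with all of them, so any subset of fewer than n shares
    is indistinguishable from random noise.
    """
    pads = [secrets.token_bytes(len(secret)) for _ in range(n_shares - 1)]
    return pads + [reduce(xor_bytes, pads, secret)]

def recombine(shares: list[bytes]) -> bytes:
    # XOR of all shares cancels the pads and leaves the secret.
    return reduce(xor_bytes, shares)

notes = split_secret(b"correct horse battery staple", 3)
print(recombine(notes))  # b'correct horse battery staple'
```

Stored in three separate physical places, these "notes" are individually worthless to a thief, which is exactly the property the written-down Bitcoin password lacks.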
00:19:42.620
Yeah, and all the data is being collected, and it's...
00:19:51.000
I don't know whether I like the world we're going into.
00:19:56.420
If we're talking about things that we don't like,
00:19:58.840
talk to us about the deepfake technology that's out there now.
00:20:10.760
Mr. Musk has been making headlines with Grok recently.
00:20:16.780
Apparently there's some software there that allows you to play around with images.
00:20:23.300
We live in a society that's highly image-driven,
00:20:27.880
and we don't use words any more than we have to,
00:20:30.460
but, you know, take Facebook, for example,
00:20:40.100
and once you have one picture, you can start to manipulate it.
00:20:45.340
Do you think we should stop putting our pictures on Facebook?
00:20:52.920
Do you think it's time to stop putting pictures on Facebook?
00:20:57.880
Yeah, because these programs, this AI, can probably be sent to look for pictures of
00:21:07.960
individual people, and then you retrieve the image and you manipulate the image.
00:21:12.800
Yeah. And in a way, there's no real hiding from it either, because you go out in public and
00:21:17.700
your picture is being taken by all the surveillance cameras all around. But yeah,
00:21:23.320
I think a lot of people are going to start pulling back when they start to see pictures of themselves
00:21:27.860
in various scenarios. And, you know, the bad guys are doing this; there are no guardrails.
00:21:33.440
You know, some of the bigger companies started with the censorship and guardrails, but
00:21:38.200
those wheels are quickly falling off as the private AI guys build their superclusters at home
00:21:44.380
with no guardrails, putting out whatever they want. Okay. This is why you're here: you're
00:21:52.600
here to straighten me out on things. I got the impression that when you use AI for whatever
00:21:59.780
purpose, you're dealing with a massive computer set up somewhere else. Like, your little desktop
00:22:06.320
computer is not going to do what you want it to do; you are instructing a computer somewhere else.
00:22:12.540
Now it seems like you're saying that there are some people who have invested, and been able
00:22:20.380
to afford to invest, in computers that'll do all kinds of dreadful things.
00:22:26.760
The power is just emerging, just emerging.
00:22:31.480
The power that you can buy now: you can get a supercluster of Apple Mac
00:22:36.220
Studio computers, and you can put them all together, and you
00:22:39.760
can churn out AI answers and AI processing almost
00:22:46.260
Now, this is just going to get... So you have that in your house, doing your bidding,
00:22:52.260
and you can connect it to the internet, you can make it go out and attack certain sites or
00:22:58.920
go after certain people. But on the bright side, too, the good guys can use this
00:23:04.520
too. Like, you need to be able to... you know, less censorship is generally a good thing.
00:23:11.480
So while it's dangerous, obviously, the good people can also use the same
00:23:25.340
Do we have to go out and buy a supercluster to do it?
00:23:30.640
I mean, I'm not saying you and I have to do that.
00:23:32.740
Like right now to get something like that would be about $70,000, which is not out of
00:23:39.300
reach for some people, especially if you're going to be making money with it. So yeah, I
00:23:44.920
wouldn't say everyone needs to do this, but definitely the good guys need to keep thinking,
00:23:50.300
maybe stay up at night a little bit later, get up a little bit earlier, and do
00:23:56.900
what? I guess, strategize. It's always been that way, honestly, in IT and cybersecurity. It's always
00:24:04.860
been a defensive posture: you're reading what's out there, and then you're going,
00:24:10.220
oh boy, how do we protect against this? And then you come up with plans and you build your defense.
00:24:17.340
Well, you're frightening me. I mean, the question is, if the government
00:24:24.620
can't stop these robocallers and AI scammers, what can regular people actually do?
00:24:33.820
Yeah. And what would you... if they came to... shoot, your name, Shield Networks,
00:24:44.860
is the name of your company. Yeah. So if somebody came and said, I saw your interview with Hannaford,
00:24:49.980
you got me really rattled, I want you to help me, what could you do for them? Yeah, so a company, say a
00:24:58.380
business, would hire our company to really take a look at their IT infrastructure,
00:25:06.220
kind of inherit it and manage it, and shore it up with defenses that we have learned,
00:25:15.020
and that we're continually learning every day. It's non-stop; it's getting faster
00:25:18.940
and faster. So yeah, people hire us and trust us to keep their systems
00:25:27.660
out of the hands of the bad guys. Yeah. Okay. Well, that's your free ad, but I wish you
00:25:35.820
well with that, Tim. Yeah. You know, have you ever heard of flash paper? Flash paper? Yes. It's
00:25:44.540
a mafia thing, and it's a paper that's been soaked in a highly flammable
00:25:51.820
chemical compound, so the second the police burst through the door, you hit it with your
00:25:58.680
cigarette and it's gone. Oh, there's no evidence. I think that's the future of communications.
00:26:04.140
Interesting. Yes. I send it over in an envelope, and the second you don't need it anymore, it's gone.
00:26:09.720
Tim, this has been great. Thank you very much for coming in. It's, you know, a new
00:26:15.920
world with different challenges. I'm sure we will negotiate them in the end. If people can think of a
00:26:20.800
way to do it, they can probably think of a way to undo it. That has got to be our hope. A hundred percent, yeah.
00:26:26.560
As we say goodbye, thank you very much. Thanks for coming in. For the Western Standard, I'm Nigel Hannaford.