The Critical Compass Podcast - September 28, 2024


Your Protests Are Under Surveillance | A Critical Compass Discussion


Episode Stats

Length: 16 minutes
Words per Minute: 167.7
Word Count: 2,794
Sentence Count: 144
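(For reference, the words-per-minute figure is simply the word count divided by the running time; assuming the full audio runs roughly 16 minutes 40 seconds rather than the rounded 16 minutes listed above, WPM = 2,794 words / ~16.66 min ≈ 167.7.)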


Summary

A strange piece of equipment was found at the Million March for Kids protest in Edmonton, Alberta, on September 20th, and it may have been used by the police to monitor and record the protest and the people on the other side of it.


Transcript

00:00:00.000 Hello, welcome back to the Critical Compass.
00:00:05.940 Today, James and I are going to be discussing kind of a weird thing that we saw at the
00:00:12.580 Million March for Kids protest in Edmonton, Alberta, on Friday, September 20th.
00:00:19.580 Yeah, we came across this interesting piece of equipment on our walk to where the protest
00:00:28.300 was. I'll just pop it on the screen right now. If anyone's ever seen this before, we
00:00:33.740 hadn't. When we looked at it closer, it looks to be a video and
00:00:42.240 audio monitoring device that has some text on it. I'll switch over to the image with the
00:00:50.160 text. If you take a look, this is on the opposite side of where that previous picture
00:00:56.720 was taken, and it says, this area may be monitored and recorded. We'll read the text
00:01:01.580 on it later. But for now, James, you've been doing a little bit of looking
00:01:07.040 into this. What are your thoughts on what we saw at the protest?
00:01:11.720 Well, this wouldn't be the first protest that has some kind of monitoring. We know
00:01:17.280 that there was a lot of video surveillance of Ottawa during the trucker convoy. But this was
00:01:24.960 a small protest, not even that huge. Usually on one side, you have
00:01:30.580 parents protesting SOGI, or sexual orientation and gender identity, that framework in
00:01:37.640 schools. These parents are not happy about that. The protest was divided by a road, and
00:01:41.920 on the other side, you had LGBT activists and Antifa. One thing
00:01:51.940 to note: sitting closest to the Antifa side was this monitoring device.
00:01:59.560 So the most charitable interpretation is that the police want to be able to
00:02:06.620 document anything if violence does occur, and given that they put it closer to Antifa,
00:02:13.780 you could say that maybe they are legitimately concerned about the people dressed
00:02:21.700 head to toe in black with their faces covered, some of them wearing gloves and helmets
00:02:27.660 and tactical vests. So that's the most charitable interpretation.
00:02:35.360 Yeah. And a lot of these things come from the standpoint that it's rooted or
00:02:43.300 grounded in filling a need, and maybe there's some good in it, maybe there's some kind
00:02:51.820 of functionality. Either you have convenience, or you have some sort of tool that makes
00:03:00.340 something easier, or you have some kind of tool that can be used for good. But I can also see this
00:03:05.540 being used in a more malicious way. And the fact is that a year ago, at the same protest,
00:03:15.360 the police weren't treating everybody equally. You got pushed into the road.
00:03:23.280 Oh, I recall.
00:03:24.400 You were on the other side. We have this documented; there's a video on our channel
00:03:29.460 that people can watch. You were surrounded by protesters on the Antifa side, and some of them
00:03:37.360 were Antifa, dressed in black. They were pushing you onto the road, almost with their
00:03:42.940 hands up, like, I'm not touching you, I'm not touching you. They were being the aggressors.
00:03:48.600 And in that case, the police talked to you and told you to stop antagonizing
00:03:54.800 them. So we have that example, and there is no shortage of other examples at
00:04:01.060 other protests of police not doing what they should with Antifa members. They're not
00:04:07.640 even-handed, and they've had plenty of opportunities. Even with video footage of activists or
00:04:16.940 protesters being harassed by Antifa, the police do not do their jobs. So this is why I doubt
00:04:22.740 that the surveillance device will be used for justice. It may be used for targeting
00:04:30.380 protesters, or targeting people who are maybe not saying the
00:04:37.700 right things about the right people.
00:04:39.500 Right. You know what? I kind of had that feeling too. Like you say,
00:04:44.020 it was, I suppose, not comforting, but at least somewhat reassuring that
00:04:50.980 the device was placed closer to the side with the people visually more likely
00:04:57.540 to be the ones to pop anything off. But, at least based on
00:05:04.360 my experience last year, I just doubt that that was a serious consideration.
00:05:08.520 They also didn't really have too much of a police presence at this one. I mean, they were
00:05:13.940 there, but maybe we saw two or three squad cars and just a
00:05:18.620 couple of officers way off in the distance. Last year, it was a much bigger protest,
00:05:23.380 but there were a lot more officers. They were kind of lining
00:05:29.880 the grassy area between the roads there, and there were lots on bikes,
00:05:35.540 just a much bigger presence. So I think they scale. They have their obviously
00:05:43.780 marked police cars, and then they have their SUVs and such, and those were parked
00:05:47.120 in some of the parking lots behind, ready to get out as
00:05:54.340 the size grows. I'm sure they have some kind of framework that says, okay,
00:05:59.040 if there's this many people, we try to get a certain number of officers on the ground.
00:06:05.980 And something we noticed too, I don't know if we said this in the clip
00:06:09.460 we posted the other day, but in the businesses surrounding the area
00:06:14.360 the protest was in, if you remember, there were officers who weren't even
00:06:19.040 disguised as security guards or anything. They were just straight-up police officers
00:06:22.700 monitoring the parking lots of the businesses that were nearby. And
00:06:27.100 actually, I remember there was one standing at the entrance to a parking
00:06:30.980 lot who, I assume, was watching for people who would park in it and walk across
00:06:36.560 the way, adjacent to where that lot was, to where the protest was. And
00:06:41.180 further down that road, there was not quite a blockade, but almost like
00:06:47.000 a checkpoint, it looked like. So yeah, there was definitely, I don't know what the
00:06:52.860 word would be, a perimeter set up, I suppose. But you know what,
00:07:02.540 let me just throw this on the screen here. This is the closeup I took of the sign
00:07:06.020 that was on the unit. I'll switch my screen over and
00:07:10.700 read it here. It says: closed circuit television camera system. This area may be
00:07:15.160 monitored and recorded. Smaller text says: this information is being collected for
00:07:19.840 the purposes of law enforcement as authorized by section 33 of the Freedom of Information
00:07:24.880 and Protection of Privacy Act. Further down, it says: for further information, contact the Edmonton Police
00:07:30.400 Service information and privacy coordinator, with an address and
00:07:35.960 a telephone number. Actually, prior to this recording, earlier in the day today,
00:07:40.500 when we decided that we were going to be talking about this, I did call that line,
00:07:47.120 and it just went to a voicemail. So I left a voicemail, but I didn't
00:07:52.680 hear back on anything. We'll see if they call me back. I don't
00:07:59.080 suspect that they have any reason not to. But if they put the number out there,
00:08:04.740 unless they're just putting the number out there to see who would call, maybe they can
00:08:09.500 gather some information that way. But the message that I left was simple. I made
00:08:15.300 a recording of the message I left, just for posterity. But basically I said,
00:08:19.840 yeah, I was at the protest that day and I was just curious,
00:08:24.900 could I get some information about what that equipment was. You
00:08:30.740 know, there was a sign on it, they had a phone number, and if you had any questions you could
00:08:34.380 call. So that's all I'm doing, and we'll see what they say, whether they say
00:08:39.180 anything or whether they just put me on a list. Here's the thing: if you're
00:08:45.260 not already on a list, then you're doing something wrong. And I think people should
00:08:52.060 not fear being on a list, because as these things change, what is deemed acceptable
00:09:00.360 gets smaller and smaller. So if you're afraid of ever being on a list, or you're
00:09:07.920 afraid of ever being put in that territory, then you're going to be pushed into a corner,
00:09:14.920 into a smaller and smaller acceptable range of what you can and cannot say.
00:09:21.220 Yeah, that's right.
00:09:22.160 So my biggest concern about this is that we have not a whole lot
00:09:32.500 of evidence to trust that any of these things will be used responsibly. I don't think
00:09:38.960 we have the mechanisms; quite the contrary, actually. We don't have the
00:09:44.220 mechanisms for accountability. And even in other cases in government, accountability
00:09:48.980 seems to be sidestepped. Either there's plausible deniability, or they push it off to a third party,
00:09:56.280 and they seem to find a way around ever actually being able to pinpoint who's
00:10:03.020 responsible for any of these things. And we have new bills coming in that, at the time
00:10:11.820 of this recording, are still being debated in Parliament. We have Bill C-63, the
00:10:18.400 Online Harms Act. We have talked about this in the past,
00:10:24.560 but part of the bill is about safety, things like pornographic images and children, and websites
00:10:32.820 and reporting, but they're also expanding the definition and
00:10:39.820 penalties for hate speech. Right. So how does that interact with this increased surveillance?
00:10:45.840 Well, you have what people are saying online, but you also have these deterrents
00:10:50.620 of, well, if there are going to be penalties for what people are saying and doing, or for what
00:10:57.240 is hate speech, what if you have a sign? Is that punishable by any of these frameworks?
00:11:03.640 Is there going to be facial recognition? And then maybe they
00:11:10.280 don't deal with you at the protest, but they have you on camera, they took a recording,
00:11:15.300 they have a picture of you and your sign, they have your face, and they have your phone
00:11:20.680 GPS, and they say, you were at this protest and this is hate speech, and they show up at your
00:11:27.680 house later. We literally just saw that happening in the UK.
00:11:33.940 And sorry to interrupt, but just on that thought, that's what I was going
00:11:39.520 to say. In addition to watching how these things get used in the future,
00:11:44.440 it's that we don't become a dystopian shithole, somewhere like London, where there are
00:11:49.700 almost more closed-circuit cameras than people. And I
00:11:56.540 think we both sort of unspokenly acknowledged, when we saw that device
00:12:03.340 there, that we can't not talk about it, because it can't be a thing where it's just accepted
00:12:10.420 that, okay, anytime there's a gathering of people, it's monitored and recorded, and who
00:12:15.600 knows how that information is used. Yeah. Unless there is a certain amount of reform
00:12:24.140 or trust. In a high-trust society, maybe something like
00:12:31.280 this would actually play the role of providing safety for a protest, potentially, but we are
00:12:38.020 not in that scenario. We're not there. Yeah. No, because we know that the types
00:12:44.200 of people who would be most interested in knowing the identities of people at these protests are
00:12:50.080 not, like you say, doing it because they're trying to keep law and order.
00:12:55.400 What they actually would be most concerned with is
00:13:00.080 making sure that nothing outside of the party line is allowed to go unchallenged.
00:13:06.680 So, yeah. And I mentioned before examples where
00:13:12.120 somebody has an assault at a protest recorded, with the face of the person committing the assault,
00:13:19.700 and the police are there and have a chance to deal with it. And we have example after example
00:13:25.520 of the police not doing anything about the person who committed assault. But if you flipped
00:13:31.900 the script, and the example was a different demographic against a different
00:13:38.160 demographic, you would see somebody in handcuffs
00:13:44.140 instantly. So I think all we're asking is: well, we haven't seen equal treatment,
00:13:51.980 and so we have an absence of evidence that the police are
00:14:00.840 actually doing their job and that this would actually be used in a responsible way.
00:14:06.920 Yeah. Well, that's essentially it, I think. We'll make sure to
00:14:16.880 add another video here as a follow-up if I hear
00:14:22.280 back from that department, and I'll make sure to record that call too. I don't actually
00:14:28.680 know, I should have looked it up, whether Alberta or Canada is a one-party consent jurisdiction for
00:14:36.420 recording phone calls. But the way my phone does it, at least, is that it
00:14:44.120 automatically plays a statement that this call is
00:14:48.800 now being recorded. So I'd be curious whether they say anything about that once I do it,
00:14:55.860 if I have somebody live on the line, or whether they don't phone back purely for that
00:15:00.600 reason. Yeah. Well, I don't know if a call to a police department
00:15:05.940 is recorded all the way through from the beginning, or if you get a voicemail, because I
00:15:10.940 hit the recording and it played that disclaimer while the
00:15:16.500 voicemail message was playing. So I don't know whether they would
00:15:21.480 hear that or whether they would only hear my recording. We'll see
00:15:26.620 anyway, but we'll update you guys if we hear anything. But yeah, James,
00:15:31.660 any other thoughts before we wrap up this short one? Well, I'm just curious whether we'll
00:15:38.260 see more of these, or whether people have seen other examples as well. So hopefully on X,
00:15:44.060 maybe we get some examples in the comments. I doubt this is the first use
00:15:52.440 of this; it's just the first time that I've seen it. Yeah. It seemed like
00:15:58.520 they had it ready to go, too. It's a tiny protest, and they still had it
00:16:07.500 ready to go. This is not the first time. Yeah, that's right. Okay. Well,
00:16:14.240 we'll keep you updated. Thanks for listening, guys. As always, you can follow
00:16:17.980 us on YouTube, Rumble, Spotify, and X, obviously, and we'll have links below.
00:16:24.860 We'll have these images uploaded too, for reference, for anyone who's curious.
00:16:28.960 Thanks as always, and we'll see you in the next one. All right. Cheers. Cheers.
00:16:37.500 Cheers.