Which is more deadly: coronavirus, or kangaroos on the highway? Which is statistically likelier to kill you? After that, I interview for a second time one of the contractors in charge of censorship, part of a three-year, $200 million contract to censor social media posts, including posts in Canada.
00:07:23.500Or maybe that's why you were put in a seniors' home to begin with, and why your pro-choice kids and grandkids gave the seniors' home a do-not-revive instruction.
00:07:31.100That's what's really happening in a lot of those places.
00:07:47.760I don't see this statistic broken out.
00:07:50.400But looking at that graph in Australia, the average age appears to be 85, which is the same as it's been across the West. In British Columbia, for example,
00:07:59.720the average age of a person who died from the virus is exactly 85.
00:08:03.340I don't see these mortalities in Australia cross-referenced with pre-existing conditions, as we've seen in some jurisdictions in Canada, like B.C.
00:08:11.720The majority of those who have died have been seniors over age 85 with two or three pre-existing conditions.
00:08:21.740So if you're dying from a flu at age 90, when you had three pre-existing conditions, just a regular flu, that would usually be called dying of old age.
00:08:34.920Now it's a pandemic because of political and financial reasons.
00:08:42.120And a flu is the fourth and final compounding factor.
00:08:47.220That used to be called dying of old age.
00:08:48.680Sorry, please let me know if you disagree with my opinion on this.
00:08:53.820But if this were actually about medical science, can I ask you why the mask bylaws were all brought in across Canada, all within a week or two of each other, but only starting at the end of July and beginning of August?
00:09:05.840After the pandemic was over. The pandemic had peaked in mid-April.
00:09:08.900Don't tell me that it's math or science that every jurisdiction brought in the mask laws at the same time, August 1st, basically.
00:09:15.840Pandemic's over, but the PR machine needs to rev up again.
00:09:32.700If you can be convinced to stay at home for two weeks to slow the spread, or two weeks to flatten the curve, and that 15 days soon becomes 15 weeks, and it looks like it's going to be 15 months.
00:09:45.880If you can be convinced of that, if you can be convinced to wear masks, even when it's absurd, like outdoors, you know, there's a rule in Banff to wear your mask outdoors in the fresh mountain air.
00:09:57.360If you can be made to fear your neighbour and snitch on your neighbour, what wouldn't you do?
00:10:04.300And really, would you even stand up to the police of all people?
00:10:08.500Sky-high surveillance as we battle to control COVID.
00:10:11.700Over the next week, Victoria Police will dispatch drones.
00:10:17.440They'll be keeping a watch on St Kilda and Port Melbourne Beach, making sure skate parks and playgrounds remain empty, and for those who head to the park, a mask is a must.
00:10:30.160So those are mask drones, which stop you from being outside, but outside is the healthiest place.
00:10:37.080How about run some drones in some senior zones, mate?
00:10:39.540Just two guys under age 40 have died in the entire country.
00:10:45.280A country of 25 million people, one of the largest countries geographically in the world.
00:10:50.440Well, they've got drones searching for mask rule breakers.
00:10:54.020Oh, and then there's this quip from the Prime Minister of Australia, who calls himself a conservative.
00:10:59.240He's the leader of the Liberal Party, but that's the Conservative Party down there.
00:11:01.980The leftists are called the Labor Party down there.
00:11:03.900He's letting you know that it's mandatory, mate.
00:12:22.080Do you think Trudeau's going to make vaccines mandatory?
00:12:25.340He bought 37 million syringes, one for every man, woman, and baby.
00:12:30.380He signed a contract with the Chinese military to participate in their vaccine program.
00:12:34.140Here's Theresa Tam daydreaming about what to do with vaccine objectors.
00:12:37.580If there are people who are non-compliant, there are definitely laws and public health
00:12:44.200powers that can quarantine people in mandatory settings.
00:12:50.040It's potential you could track people, put bracelets on their arms, have police and other setups to ensure quarantine is undertaken.
00:13:00.080Yeah, I wonder what her advice to Trudeau is, eh?
00:13:04.880What do you think Trudeau's going to do?
00:13:10.100Stay with us for the most important interview I've done in the year 2020.
00:13:14.640Well, one of the most interesting and most popular video interviews we've done this summer was with a former censor who worked for a contractor working for Facebook.
00:13:38.580What I mean by that is Facebook hires companies around the world.
00:13:42.340They outsource their censorship duties to these companies.
00:13:46.060One of these companies has an office in Phoenix, Arizona.
00:14:29.460Did you know that your Facebook pages in Canada with your comments about the Canadian election were being deleted based on a handbook that Facebook operated with a censorship contractor?
00:14:44.820Well, it was a very interesting conversation.
00:14:46.960And last time we spoke with Ryan Hartwig, who worked as a contractor, we asked him if we could see this handbook.
00:14:55.480And today he's going to show us what it looks like.
00:14:58.480Joining us now via Skype from Phoenix, Arizona is Ryan Hartwig.
00:17:01.120I mean, in addition to the existing policies that we have, like the hate speech policy, this gives additional guidance on how to monitor and action things related to the election.
00:17:37.660It reminds me of the movie Minority Report where people are punished for crimes before they're committed.
00:17:42.260Yeah, expected violations. When they talk about expected violations, whether it's Trump's State of the Union speech, where we were expected to see hate speech violations, or in this example, these are the expected violations: hate speech, fake accounts, impersonation.
00:17:59.560So there are some things that are legitimate to act on, like voter fraud.
00:18:01.960For example, some people would say on social media, oh, you know, Democrats vote on the following Tuesday or give wrong information about the voting dates.
00:18:40.180Impersonations should probably be mopped up, good practice.
00:18:43.260But I remember when we talked to you earlier this summer, you pointed out an example of hate speech.
00:18:47.920You said that if a conservative were to call a left-winger a feminazi, which is sort of a harsh word for an overbearing feminist, that's hate speech.
00:18:58.040But if a leftist calls a conservative a plain old Nazi, that's acceptable.
00:19:05.020To me, that's the perfect example of the double standard.
00:19:07.740You can call a right-winger a Nazi, which, by the way, as a Jew, I find that trivializes the word Nazi.
00:19:14.600If you call everyone you don't like a Nazi, it starts to lose meaning.
00:19:17.680But you're not allowed to call a feminist a feminazi because that's off the table.
00:19:26.200And another example I didn't mention that's similar: if I say on Facebook that Ezra is a Trump Humper, and you report it directly, that stays up.
00:19:39.160But if I call you a snowflake, that gets taken down.
00:20:27.940I can't help saying that and chuckling.
00:20:29.660But let's get serious for a second because here in Canada, our government says, oh, we've got to be on alert for election hacking by foreign enterprises in, you know, Russia or China or whatever.
00:20:40.680But actually, what you're here to say is that you and 1,500 of your colleagues in Phoenix were overtly, explicitly, and without, you know, hiding it,
00:20:51.480censoring things in the Canadian election that Facebook told you to censor.
00:20:55.500And you're a foreigner and Facebook's a foreign company.
00:21:08.680And, you know, on one of the pages of the training deck, under expected violations, there are some slides about hate speech towards Jagmeet Singh.
00:21:21.780So they're basically explaining that Jagmeet Singh is protected under the hate speech policy because of his physical appearance, because he's wearing a head covering.
00:21:34.060The hate speech policy protects against attacks on someone's religion.
00:21:38.460So it's kind of a gray area, because if you're attacking him as a candidate and you have a picture of him, technically any attack on him could be considered an attack on his religion, because he's always wearing that head covering.
00:21:52.700So his religion is basically one and the same with him as a person.
00:21:56.240So as a content moderator, it's very easy to misread an attack.
00:22:00.280It's easy to interpret an attack on him as a person as an attack on his religion.
00:22:05.440Well, that's a very interesting distinction.
00:22:45.400Those are all things I truly believe have nothing to do with his ethnicity.
00:22:49.160If I say those things, but it's associated with a picture of him, and there's a turban in the picture, which there always is because he always wears one, that, by definition, is hate speech, because a Facebook censor can say, ah, it's linked to a picture of him in a turban.
00:23:13.180So if it's clear that it's just a political attack, we would interpret it as such, but there is a part of the hate speech policy that's visual hate speech.
00:23:21.020So if I have a cartoon character of someone with a turban and I say, that person's dumb, like with the caption, that person's dumb, and it's just a visual of a person with a turban, that gets deleted for hate speech because that visual of the person wearing a head covering would signify that person's religion.
00:23:39.600So they do try to separate the nuances of attacking a candidate versus a religion.
00:23:47.240But here, what it's really doing is it's training us as content moderators to look out for things.
00:23:52.340I mean, it gets kind of old because we see hundreds of pieces of content a day.
00:24:00.300So they're letting us know, hey, be on the lookout for any attacks about Jagmeet Singh.
00:24:04.020So we may ignore other attacks, more nuanced attacks on other candidates, but because Facebook's highlighting this, it's going to have the impact of giving additional protections to Jagmeet Singh because we're more aware of it.
00:24:17.660When we see thousands of pieces of content a day, Facebook wants us to make sure that we're deleting any attacks against Jagmeet Singh.
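To make the nuance just described concrete, here is a minimal sketch of that decision rule in Python. Every name, marker list, and rule here is an assumption made for illustration; this is not Facebook's actual policy engine.

```python
# A minimal sketch of the "visual hate speech" nuance described above.
# All names and rules are illustrative assumptions, not Facebook's real code.

PROTECTED_VISUAL_MARKERS = {"turban", "head covering"}

def action_attack_post(visual_markers: set, clearly_political: bool) -> str:
    """Decide how an attack paired with an image is actioned."""
    if clearly_political and not (visual_markers & PROTECTED_VISUAL_MARKERS):
        # A plain political attack on a candidate stays up.
        return "keep"
    if visual_markers & PROTECTED_VISUAL_MARKERS:
        # The attack co-occurs with a visual religious marker, so it is
        # scored as visual hate speech even if the words alone are political.
        return "delete (visual hate speech)"
    return "keep"

# "That person's dumb" over a cartoon of a man in a turban:
print(action_attack_post({"turban"}, clearly_political=False))  # delete
# The same insult aimed at a candidate with no religious visual:
print(action_attack_post(set(), clearly_political=True))        # keep
```

On this reading, any attack on a candidate who always appears in religious dress is mechanically convertible into a religion attack, which is the gray area described above.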
00:24:25.300We're talking with Ryan Hartwig, who was a contractor censor with a company called Cognizant, which had a three-year, did you say $200 million, censorship contract?
00:24:35.740That's a staggering $200 million censorship contract.
00:24:39.980You make a good point, because in the last election the Conservative candidate happened to be a Christian white male, so, very boring, Facebook wouldn't really care, except that his Christianity and other parts of his identity were regularly attacked, not just on Facebook, but in the mainstream media.
00:25:00.120What you're saying is only Jagmeet Singh was singled out for special protection, and it was so vigorous that even calling him dumb, if there was a picture of him with a turban, that religiousified the whole thing.
00:25:14.880So that was off limits, but you were obviously never instructed to protect Andrew Scheer for his Christianity, or to protect other candidates from attacks on them.
00:25:27.040I mean, I suppose if there was something just crazy over the top, you might do it, but you were not told to watch out for anti-Christian attacks on Andrew Scheer, right?
00:25:36.540No, no, we're not told to look out for that.
00:25:38.500And we see that double standard between Christianity and, for example, Islam in other cases.
00:25:44.300And I'll send you a PowerPoint as well.
00:25:45.980I've been speaking about this in Brazil, because censorship is very important to them as well.
00:25:50.620And, for example, we would see many sexual memes about Jesus Christ on the cross that were allowed on Facebook.
00:26:00.380But, on the contrary, take a sexual meme about Muhammad with a goat, which, of course, is not cool.
00:26:10.820I abhor any kind of hate speech towards Muslims.
00:26:13.620But those sexual memes about Muhammad with goats were deleted.
00:26:19.500So the sexual meme of Jesus is allowed, but a sexual meme of Muhammad is not allowed.
00:26:24.840I think that's an important point here.
00:26:26.440I don't think you or I are pro-abuse, pro-hate or anything.
00:26:31.600I mean, but what we're talking about here is what is allowed by Facebook and what is deleted by Facebook.
00:26:37.460And, I mean, I myself would not attack Jagmeet Singh's religion or his turban.
00:28:07.880So we protect, yeah, the hate speech policy protects followers of a religion.
00:28:14.940So if I attack Mormons or Christians, if I say all Christians are horrible people, that would be taken down.
00:28:22.680But it's weird because those sexual memes of Jesus are allowed for some reason.
00:28:27.640Well, I mean, we know what the reason is: Silicon Valley and Facebook are not just atheist, they're anti-religious, at least when it comes to Christianity.
00:28:39.060Now, let's get back to the chart for Canada.
00:28:42.480I saw briefly there that there were calls for the assassination of Trudeau.
00:28:47.520I support the idea of deleting those, whether it's assassinating Trudeau or anyone.
00:29:45.540But if someone is coming from New York State to Canada claiming they're a refugee and they've got a phone that's more expensive than mine, I would probably communicate this in a less dehuman...
00:29:56.640The word leeches is a prickly word that I wouldn't use myself.
00:30:01.500But the idea that this would be banned and that Facebook censors would be told explicitly to stop anti-immigration memes.
00:30:11.480Why don't you talk a little bit about that?
00:30:14.040So I think you were going to say the word dehumanizing, and that's the right word for it, because that's actually the word, the policy language.
00:30:20.300So the hate speech policy, in tier one, describes that any language that's dehumanizing towards a certain group, be it nationality, religion, or sexual orientation, is not allowed.
00:30:33.540So the reason why we would be deleting it is that it's a comparison to an animal, which is dehumanizing.
00:30:40.580But it raises this greater debate of what should be allowed: should we be allowed to discuss immigration and politics?
00:30:51.040And so, calling someone an opportunistic leech, I mean, it's describing their behavior, but it's within the larger debate of immigration.
00:30:58.440And so Facebook is essentially saying you can't have discussions about immigration, about things that are costing you money on a daily basis as a citizen.
00:31:07.820It can be a burden on social programs, on welfare.
00:31:13.840And so Facebook's taking a stance and saying, hey, we can't call immigrants or refugees certain names.
00:31:19.560But this one is more of a gray area because, OK, you're calling someone an opportunistic leech.
00:31:26.000Yeah, it can be dehumanizing. It depends how you define it.
00:31:29.340Is it subjective? It sure is. It's very subjective.
00:31:32.500Yeah. I mean, we use animal metaphors all the time.
00:31:37.140He's a pig at the trough. Those politicians are pigs at the trough.
00:31:43.400We do use animal metaphors all the time.
00:31:46.760And it was clear that the leech was a reference to, you know, profiting off a system, taking funds, taking free health care, taking taxation.
00:31:58.020I don't think it was saying you as a human are like vermin.
00:32:02.460Like, I know that that can be very dehumanizing.
00:32:04.860You are vermin. You are less than human.
00:32:07.320I don't think that was saying you're less than a human.
00:32:09.680I think that phrase there, and I get it, I wouldn't choose to use it myself, but I think it's saying, oh, you're just coming to take advantage of our free health care and lax immigration laws.
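As a rough illustration of how mechanical the tier-one rule is, here is a sketch in Python; the word lists and protected-group labels are invented for the example and are not the actual policy text.

```python
# Illustrative sketch of the tier-one "dehumanizing comparison" rule discussed
# above. Word lists and group labels are assumptions, not Facebook's policy.

ANIMAL_COMPARISONS = {"leech", "leeches", "vermin", "pig", "pigs"}
PROTECTED_GROUPS = {"immigrants", "refugees", "nationality", "religion"}

def is_tier_one_violation(text: str, target: str) -> bool:
    """Flag an animal comparison aimed at a protected group for deletion."""
    words = set(text.lower().replace(",", " ").split())
    return target in PROTECTED_GROUPS and bool(words & ANIMAL_COMPARISONS)

print(is_tier_one_violation("opportunistic leeches", "refugees"))  # True
print(is_tier_one_violation("pigs at the trough", "politicians"))  # False
```

On this logic, "pigs at the trough" aimed at politicians survives because politicians are not a protected group, while the same kind of animal metaphor aimed at refugees is deleted, which is exactly the subjectivity the two are debating.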
00:32:19.460What troubles me again, Ryan, is I'm not going to bat for that language, which I don't think I would use myself.
00:32:25.220But of all the hundreds of issues, the ones that are banned are the ones that conservatives care about.
00:32:32.820Liberals are allowed to say anything about their issues, their favorite issues.
00:32:37.320But it's only the conservative issues that have these warnings on them.
00:33:01.640But yes, many of the policies, for example, allow topless protests by females during Pride Month.
00:35:35.760Now, for any public figure who's a minor, including Barron Trump, you can't talk about them sexually, and so they have more protections than, let's say, a public figure who's an adult.
00:35:47.080So they already have a lot of protections.
00:35:50.720Now, this, calling Barron Trump retarded, would be allowed.
00:35:56.920For any minor public figure, that's not a prohibited attack.
00:36:01.040Any minor under 18, you can call them that.
00:36:04.500So this is saying, hey, Greta Thunberg's special.
00:36:10.780And the screenshot you're looking at is a list of other exceptions they've made.
00:36:13.680They had to start documenting the exceptions they made after we had the civic audit from former Senator Jon Kyl and the Covington law firm.
00:36:21.940So, yeah, this is giving Greta Thunberg additional protections beyond what's already stated in the policy.
00:36:27.080And what's fascinating is, when this happened, we got jobs like this all day.
00:36:34.220For about a week,
00:36:36.580I probably had 50 to 60 jobs a day with this phrase that I had to delete.
00:36:43.120Facebook injected what's called a proactive pull.
00:36:47.020They injected those phrases or classifiers into our queue.
00:36:50.400And they did it so aggressively that even our own internal messaging board, where we had posts discussing the word retarded, got filtered, and it pushed those posts into our queue to delete.
00:37:04.060Of course, we can't delete our own posts.
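A minimal sketch of what the "proactive pull" just described could look like, assuming a simple phrase match; the mechanism, names, and matching logic are assumptions based on the description above, not Facebook's actual system.

```python
# Sketch of a "proactive pull": the platform injects a phrase, and anything
# matching it, from anywhere, is pushed into the moderators' deletion queue.

from collections import deque

review_queue = deque()
injected_phrases = {"retarded"}  # the phrase injected platform-wide, per the interview

def ingest(post: str) -> None:
    """Push any post matching an injected phrase into the moderation queue."""
    if any(phrase in post.lower() for phrase in injected_phrases):
        review_queue.append(post)

# Even the moderators' own internal discussion gets swept in, as described:
ingest("Internal note: why so many jobs about the word retarded this week?")
print(len(review_queue))  # 1
```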
00:37:16.240I see here there's some days you were taking down three, four hundred.
00:37:19.040Let me ask you about that, because I've kept you so long here, but I could talk about this all day.
00:37:25.080I think you mentioned the Covington kids, the difference between how Nick Sandmann, that young man who was just smiling in the face of an aggressive racial activist in Washington, was demonized in a hundred ways that would never happen to Greta.
00:37:39.960You said you were getting 50 to 60 of these Greta censorships a day.
00:37:44.400And that's out of what I think you told me, 200 a day.
00:38:04.420I find that pitiful and predictable, but we're happy to have the exclusive here.
00:38:10.220So if you were doing at one point 50 to 60 censorships per day to protect Greta Thunberg, and there were 1,500 staff at the Phoenix location of Cognizant, I don't know if that would be applicable to all 1,500 people, but that's a staggering number of censorships a day.
00:38:29.360How many censorships per day or per week?
00:38:32.980I'm calling them censorships, like a deletion or a little mini trial.
00:38:37.700I don't know what you call them, an action.
00:38:39.900During the Canadian election, your shop in Phoenix, do you think you did 100, 1,000, 1 million?
00:38:48.900Like if your shop alone was doing 300,000 censorship moments per day, all told, could you roughly estimate how many times you and your colleagues censored something to do with the Canadian election?
00:39:06.680I mean, if I were to guess, it wasn't super prevalent.
00:39:09.680There was probably a week when I had more content, probably leading up to the election where I had more Canadian content.
00:40:40.940But there's another one, called Genpact, out of Dallas, Texas, that also had a content moderation contract with Facebook.
00:40:49.920Your office alone, 50,000 to 100,000 posts, and you were aware of two other offices, so, as far as you know, you didn't have exclusivity on Canada.
00:40:59.840I can understand for language reasons why Phoenix would be a place to censor Latin America.
00:41:37.200So if those other offices were operating at the same pace as you, if you estimate between 50,000 and 100,000 in your shop, so let's split the difference, call it 75,000 times three offices, it's not out of bounds to estimate that up to a quarter million Canadian Facebook posts were censored in the last two weeks of the 2019 federal election, 200,000 to 250,000 posts.
00:42:03.940Is that a reasonable estimate based on what you know?
00:42:09.040And that's not taking into account the AI that Zuckerberg testified about, about a month ago, that automatically deletes posts before they're even posted.
00:42:18.220And that has, like, an 89% deletion accuracy.
00:42:23.680So, yeah, it's very – I think that's a nice rough estimate, maybe a quarter million posts regarding the Canadian election.
00:42:32.120Like I said, I didn't have access to those numbers, and we're extrapolating here based on estimates.
00:42:36.920But, yeah, I think that's fairly accurate.
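Spelling out the back-of-envelope arithmetic the two just agreed on; every figure is the speakers' own rough guess from the interview, not a measured number.

```python
# Back-of-envelope estimate from the conversation above.

low, high = 50_000, 100_000      # Hartwig's rough range for the Phoenix office
midpoint = (low + high) // 2     # split the difference: 75,000
offices = 3                      # Phoenix, Dallas, Tampa
total = midpoint * offices       # 225,000, i.e. "up to a quarter million"
print(f"Estimated human-actioned Canadian election posts: {total:,}")
```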
00:42:38.400Last question, I know I've said that twice now.
00:42:41.820The AI system that you say has a high accuracy, would that have handled an even greater number or a smaller number?
00:42:49.780You say about a quarter million posts handled by humans in Dallas, Tampa, and Phoenix.
00:42:56.160Would the AI being able to work on the Canadian election, would it have had significant numbers as well?
00:43:03.040Yeah, I can't really say how many posts may have been deleted by the AI.
00:43:10.900But, you know, posts such as the one that we saw about, you know, calling immigrants – or speaking out about immigration and the effects of immigration, posts such as those could have been deleted without anyone knowing.
00:43:24.400Like before you even attempt to post, it would get taken down.
00:43:27.060So it's really hard to estimate how many get taken down.
00:43:34.340Just, do you have any idea of the scale of the AI side of things?
00:43:38.240If the human side was a quarter million interventions, would it surprise you, or would you be hesitant to say that an equal number or more was done by artificial intelligence?
00:44:28.880So for all we know, the quarter million acts of censorship on the Canadian election taught this AI well.
00:44:39.160And as far as we know, to this moment the AI is censoring, in real time, hundreds of thousands, potentially millions, of Canadian political posts.