Rebel News Podcast


Australia says a vaccine will be mandatory. Do you doubt Trudeau will follow suit?


Summary

Which is more deadly in Australia: the coronavirus, or hitting a kangaroo on the highway? Which is statistically likelier to happen? After that, I interview, for a second time, one of Facebook's contract censors, part of a three-year, $200 million contract to censor social media posts in Canada.


Transcript

00:00:00.000 Hello, my Rebels. Today, I start off with a story about which is more deadly if you're in Australia,
00:00:06.200 the coronavirus pandemic or running into kangaroos on the highway, which is statistically
00:00:12.580 likelier to happen. But stay tuned, because after that, I interview for a second time
00:00:19.140 one of Facebook's contractors in charge of censorship, part of a three-year, $200 million
00:00:26.880 contract to censor Facebook posts in Canada from his headquarters in Phoenix, Arizona.
00:00:35.300 That is must-watch TV, must-listen audio on a podcast. But I would really invite
00:00:44.800 you to become a subscriber to the video version, because I show you, with your own eyes, this censorship
00:00:52.940 training manual he's going to take us through. You've got to see it to believe it.
00:00:56.880 On the podcast, you will hear him say, I'm referring now to the censorship manual. But
00:01:02.480 I want you to see it so you can believe it. And you can see it by going to rebelnews.com
00:01:07.100 and becoming a subscriber to Rebel News Plus. It's $8 a month, $80 for the whole year. And
00:01:14.380 you get the video version. I want you to see this censorship document. All right, here's
00:01:21.220 the podcast.
00:01:24.820 Tonight, Australia says a vaccine will be mandatory. Do you doubt Trudeau will follow suit? It's
00:01:31.940 August 19th, and this is the Ezra Levant Show.
00:01:36.040 Why should others go to jail when you're the biggest carbon consumer I know?
00:01:39.700 There's 8,500 customers here, and you won't give them an answer.
00:01:43.760 The only thing I have to say to the government about why I'm publishing it is because it's
00:01:48.160 my bloody right to do so.
00:01:54.260 Look at this. Warning after crash involving kangaroo leaves man critically injured. A man
00:02:04.400 is in a critical condition in hospital after a crash involving two cars and a kangaroo near
00:02:09.720 Lithgow on Friday night. New South Wales Ambulance said at least 83 car crashes resulting in injuries
00:02:17.440 had involved kangaroos so far this year and warned of an expected sharp rise in that number over
00:02:23.320 winter. Now, that's from a year ago or so. Here's another story from a little earlier.
00:02:31.240 Kangaroo causes fatal crash after a man crashed into a tree in Cary near Gisborne. Or this story
00:02:39.640 even before that one. Australian driver killed in fiery crash after hitting kangaroo.
00:02:47.440 It's a problem. The same way moose are a problem on the highways of Newfoundland, there's just so
00:02:53.320 many of them, and they don't stay off the roads. Kangaroos, the biggest danger in animal crashes.
00:03:00.740 New data shows what many Australians already know. Kangaroos are the biggest danger on the road
00:03:05.100 when it comes to animal collisions. Here's how to avoid them. Kangaroos are involved in 8 out of 10
00:03:10.740 car crashes with animals. I read an actual scholarly research report on this. I'm not going to go
00:03:20.060 through it all, but a lot of deaths come from hitting kangaroos. About half of all wildlife-related deaths to
00:03:26.260 people, in fact. And a lot of deaths come from swerving to avoid kangaroos. It's a big issue. I know it
00:03:32.660 sounds jokey, but it's not funny when you hit one. Every year, every Australian state counts their
00:03:39.240 kangaroo deaths. I wouldn't be panicky. If I were driving in Australia, I wouldn't
00:03:44.120 be panicky about this. I've driven in Newfoundland, at night even. You shouldn't be panicky about hitting
00:03:52.020 a moose, but you should be very alert and maybe slow down a little bit at night, okay? Don't be crazy.
00:03:57.660 Stay sane. Just be a bit careful is all, okay? And I tell you this, you can probably guess,
00:04:05.160 because if you are under the age of 40, kangaroos are a much larger threat to your life and health
00:04:11.700 in Australia than the coronavirus is. So is lightning, actually. That's how rare it is,
00:04:21.100 the virus in Australia. Australia's nickname is the lucky country. It has a population of 25 million.
00:04:26.960 So Canada is exactly 50% more populous. Australia has had only 450 deaths from the virus.
00:04:36.200 And only 13 of those deaths are from people under age 60. For comparison, Canada has had more than
00:04:50.120 9,000 deaths. So if it were proportionate, Australia would be at 6,000 deaths; instead, they're not even at 600.
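As a quick check of that proportionality arithmetic, here is a minimal sketch using only the round figures quoted in this monologue (populations and death counts as stated on air, not official statistics):

```python
# Back-of-envelope check of the proportionality claim in this monologue.
# All figures are the round numbers quoted on air, not official statistics.
australia_pop = 25_000_000   # "a population of 25 million"
canada_pop = 37_500_000      # "Canada is exactly 50% more populous"
canada_deaths = 9_000        # "more than 9,000 deaths"
australia_deaths = 450       # "only 450 deaths"

# Australia's death toll if it matched Canada's per-capita rate:
proportionate = canada_deaths * australia_pop / canada_pop
print(proportionate)             # 6000.0 -> "would be at 6,000 deaths"
print(australia_deaths < 600)    # True   -> "not even at 600"
```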
00:04:50.120 But they are panicking like nothing I have ever seen in the free world. Not just the politicians and the
00:04:58.260 health bosses, but the police. And the army is actually on mask patrol in the Australian state
00:05:05.020 of Victoria. I'm not kidding. Look at this craziness.
00:05:09.400 He's choking me! He's choking me! What the f***?
00:05:13.160 Come on.
00:05:15.480 Oh! Get off of me! Get off!
00:05:18.980 Let go of his vest. Let go of his vest.
00:05:21.780 He's f***ing choking me! He's choking me! What the f***?
00:05:24.920 Let go of his vest!
00:05:25.820 Get off of me! Get the f*** off me!
00:05:28.000 Get the f*** off me! What are you doing?
00:05:33.800 You just keep choking me, girl! What the f***?
00:05:39.060 You're f*** in the head!
00:05:40.960 You're f***ing...
00:05:41.960 What the f*** are you doing to me?
00:05:46.040 What have I done? What have I done?
00:05:48.600 You just keep choking me!
00:05:49.980 What have I done wrong?
00:05:51.900 You're just keeping...
00:05:52.820 Yeah, but you're choking...
00:05:54.600 You're choking her.
00:05:55.740 There's a man on a...
00:05:56.640 There's a man on a girl...
00:05:58.140 There's a man on a girl and you choked her.
00:06:00.900 For what? For a mask?
00:06:02.000 For not having a mask?
00:06:03.320 Look at her pathetic.
00:06:04.260 She doesn't have a mask. Are you serious?
00:06:06.360 Are you serious?
00:06:07.340 Just for not having a mask?
00:06:08.960 Yeah, I think they've lost their minds down under.
00:06:11.680 Well, we have up here too.
00:06:13.240 Look at this.
00:06:14.120 Made in America insanity.
00:06:18.080 Sorry, there's no medical basis for that. None.
00:06:21.020 No virology basis for that.
00:06:23.220 No epidemiology.
00:06:24.940 And of course, there's a strong mental health basis for not doing that.
00:06:28.860 That's what crazy people do.
00:06:30.920 And if you do it, it actually makes you crazy.
00:06:34.040 That's not a normal way to live.
00:06:35.940 Certainly not a normal way how to show kids how to live.
00:06:40.800 But Australia is what we're talking about today.
00:06:43.260 Let me show you the stats.
00:06:45.000 From their government's official pandemic page.
00:06:47.240 I'm looking at the chart called COVID-19 deaths by age group and sex.
00:06:52.520 Nobody under age 30.
00:06:53.920 Just nobody.
00:06:55.280 Now, two guys under age 40 and another two guys under age 50.
00:07:00.240 So that is a grand total of four people under age 50 in an entire country of 25 million souls.
00:07:08.440 Like I say, kangaroos and lightning are a greater risk.
00:07:13.120 It's true if you're 80 to 90 years old and you're in a seniors' home, just like if you're in Quebec, just like if you're in New York.
00:07:19.760 Get out.
00:07:20.760 Get out.
00:07:21.320 They'll kill you.
00:07:23.500 Or maybe that's why you were put in a seniors' home to begin with, and why your pro-choice kids and grandkids instructed the seniors' home: do not revive.
00:07:31.100 That's what's really happening in a lot of those places.
00:07:33.840 This virus doesn't even kill seniors.
00:07:35.780 It kills seniors who have been warehoused in pro-choice, pro-euthanasia facilities.
00:07:42.820 I'm not happy about it.
00:07:44.040 I don't want seniors to die.
00:07:45.760 But don't tell me it's a pandemic.
00:07:47.120 It's not.
00:07:47.760 I don't see this statistic broken out.
00:07:50.400 But looking at that graph in Australia, the average age appears to be 85, which is the same as it's been across the West.
00:07:59.720 In British Columbia, for example, the average person who died from the virus is exactly 85.
00:08:03.340 I don't see these mortalities in Australia cross-referenced with pre-existing conditions, as we've seen in some jurisdictions in Canada, like B.C.
00:08:11.720 The majority of those who died have been seniors over age 85 with two or three pre-existing conditions.
00:08:21.740 So if you're dying from a flu at age 90, when you have three pre-existing conditions, just a regular flu, that would usually be called dying of old age.
00:08:34.920 Now it's a pandemic because of political and financial reasons.
00:08:38.460 People die.
00:08:39.440 You've got diabetes.
00:08:40.280 You've got heart disease.
00:08:41.000 You've got lung disease.
00:08:42.120 And a flu is the fourth and final compounding factor.
00:08:47.220 That used to be called dying of old age.
00:08:48.680 Sorry, please let me know if you disagree with my opinion on this.
00:08:53.820 But if this were actually about medical science, can I ask you why the mask bylaws were all brought in across Canada, all within a week or two of each other, but only starting at the end of July and beginning of August?
00:09:05.840 After the pandemic was over. The pandemic peaked in mid-April.
00:09:08.900 Don't tell me that it's math or science that every jurisdiction brought in the mask laws at the same time, August 1st, basically.
00:09:15.840 Pandemic's over, but the PR machine needs to rev up again.
00:09:19.840 So masks are ideal.
00:09:21.120 They make people scared and alienated and alert again.
00:09:24.820 It's unnatural, so it agitates.
00:09:27.700 Agitprop is the Soviet word for agitation propaganda.
00:09:30.540 Masks are a form of agitprop.
00:09:32.700 If you can be convinced to stay at home for two weeks to slow the spread, or two weeks to flatten the curve, and that 15 days soon becomes 15 weeks, and it looks like it's going to be 15 months.
00:09:45.880 If you can be convinced of that, if you can be convinced to wear masks, even when it's absurd, like outdoors, you know, there's a rule in Banff to wear your mask outdoors in the fresh mountain air.
00:09:57.360 If you can be made to fear your neighbour and snitch on your neighbour, what wouldn't you do?
00:10:04.300 And really, would you even stand up to the police of all people?
00:10:08.500 Sky-high surveillance as we battle to control COVID.
00:10:11.700 Over the next week, Victoria Police will dispatch drones.
00:10:17.440 They'll be keeping a watch on St Kilda and Port Melbourne Beach, making sure skate parks and playgrounds remain empty, and for those who head to the park, a mask is a must.
00:10:30.160 So those are mask drones, which stop you from being outside, but outside is the healthiest place.
00:10:37.080 How about run some drones in some senior zones, mate?
00:10:39.540 Just two guys under age 40 have died in the entire country.
00:10:45.280 A country of 25 million people, one of the largest countries geographically in the world.
00:10:50.440 Well, they've got drones searching for mask rule breakers.
00:10:54.020 Oh, and then there's this clip: the Prime Minister of Australia, who calls himself conservative.
00:10:59.240 He's the leader of the Liberal Party, but that's the Conservative Party down there.
00:11:01.980 The leftists are called the Labour Party down there.
00:11:03.900 He's letting you know that it's mandatory, mate.
00:11:07.000 There's no vaccine yet.
00:11:08.640 There's nothing that's even been tested yet.
00:11:11.400 But it's mandatory.
00:11:12.800 And there's a long way to go for the Oxford vaccine.
00:11:15.660 We don't even know how long that protection may last or at what dosage.
00:11:20.420 A final contract for the Oxford vaccine should be signed within weeks.
00:11:24.080 But even if everything goes like clockwork, it won't be widely available here until the middle of next year.
00:11:30.440 And the government wants 95% of Australians to be vaccinated.
00:11:34.480 I have a pretty strong view on vaccines.
00:11:37.200 Being the social services minister that introduced no jab, no play.
00:11:41.440 And no vaccine, no liberation.
00:11:44.020 Chris Uhlmann, Nine News.
00:11:45.340 If you think that's bad, just across the water in New Zealand, they just cancelled their elections.
00:11:54.500 There are currently exactly five people in all of New Zealand in the hospital.
00:11:58.600 Five.
00:11:59.160 Not 500, but five, as in five fingers on your hand.
00:12:03.940 And they cancelled elections.
00:12:05.420 That's Jacinda Ardern's doing.
00:12:07.320 She's the Prime Minister there.
00:12:08.280 She's Justin Trudeau's best buddy.
00:12:10.560 She's a censor.
00:12:11.400 She's a gun grabber.
00:12:12.280 She's the little fascist who could.
00:12:14.200 She's cancelling elections because five people have a cough.
00:12:19.640 Hey, what do you think?
00:12:22.080 Do you think Trudeau's going to make vaccines mandatory?
00:12:25.340 He bought 37 million syringes, one for every man, woman, and baby.
00:12:30.380 He signed a contract with the Chinese military to participate in their vaccine program.
00:12:34.140 Here's Theresa Tam daydreaming about what to do with vaccine objectors.
00:12:37.580 If there are people who are non-compliant, there are definitely laws and public health.
00:12:44.200 And powers that can quarantine people in mandatory settings.
00:12:50.040 It's potential you could track people, put bracelets on their arms, have police and other setups to ensure quarantine is undertaken.
00:13:00.080 Yeah, I wonder what her advice to Trudeau is, eh?
00:13:04.880 What do you think Trudeau's going to do?
00:13:10.100 Stay with us for the most important interview I've done in the year 2020.
00:13:14.640 Well, one of the most interesting and most popular video interviews we've done this summer was with a former censor who worked for a contractor working for Facebook.
00:13:38.580 What I mean by that is Facebook hires companies around the world.
00:13:42.340 They outsource their censorship duties to these companies.
00:13:46.060 One of these companies has an office in Phoenix, Arizona.
00:13:50.100 The company is called Cognizant.
00:13:52.400 And it had 1,500 censors beavering away, each deleting up to 200 posts a day.
00:13:59.920 My math tells me that's up to 300,000 censorship actions per day.
00:14:07.220 That's shocking news in itself.
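That 300,000 figure is simply the product of the two numbers stated on air; a minimal sketch:

```python
# The host's arithmetic: 1,500 moderators, each deleting up to 200 posts a day.
# Both numbers are as stated on air; "up to" makes this an upper bound.
moderators = 1_500
max_deletions_per_moderator = 200

max_daily_actions = moderators * max_deletions_per_moderator
print(max_daily_actions)  # 300000 -> "up to 300,000 censorship actions per day"
```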
00:14:08.980 But then I learned one more thing from our next guest when he appeared with us earlier this summer.
00:14:14.940 They're given a manual, a handbook on what to censor and what not to.
00:14:20.720 And they had a special censorship handbook for the last Canadian federal election.
00:14:27.880 Did you know that?
00:14:29.460 Did you know that your Facebook pages in Canada with your comments about the Canadian election were being deleted based on a handbook that Facebook operated with a censorship contractor?
00:14:44.820 Well, it was a very interesting conversation.
00:14:46.960 And last time we spoke with Ryan Hartwig, who worked as a contractor, we asked him if we could see this handbook.
00:14:55.480 And today he's going to show us what it looks like.
00:14:58.480 Joining us now via Skype from Phoenix, Arizona is Ryan Hartwig.
00:15:02.960 Ryan, welcome back.
00:15:04.100 You were such an interesting guest.
00:15:06.140 Did I accurately summarize the role of Cognizant and your role with that company as a contractor censor for Facebook?
00:15:15.200 Yes, that's accurate.
00:15:17.660 Yeah.
00:15:18.480 Cognizant had this three-year, $200 million contract with Facebook for content moderation beginning in 2017.
00:15:26.300 And yeah, I started there in March of 2018.
00:15:28.960 And I was there for just under two years.
00:15:31.960 Cognizant ended the contract early in February of this year.
00:15:36.500 But yeah, I did that on a daily basis for two years.
00:15:39.960 Reviewed content and deleted a lot of it too.
00:15:42.780 That's incredible.
00:15:43.440 So if I recall our conversation last time, there were handbooks for things that would come up frequently, things to watch for.
00:15:51.260 I take it that you're an American citizen.
00:15:53.620 You're not a Canadian expert or anything like that.
00:15:56.040 But this handbook was your guide to the election.
00:15:58.900 Is that accurate to say you're just a regular American who was working in Phoenix?
00:16:03.980 Yeah, that's accurate.
00:16:05.000 We did primarily North American content, but we also did, you know, Canadian elections to a certain extent.
00:16:11.960 So that training deck that I shared, yeah, it has pages concerning different candidates and how to treat hate speech in Canada.
00:16:21.680 But yeah, North America included, you know, obviously Canada.
00:16:24.960 And so we did that.
00:16:26.980 We did monitor those elections.
00:16:28.800 That's incredible.
00:16:31.080 Well, let's go to it.
00:16:32.280 I mean, you were kind enough to get access to this handbook.
00:16:37.280 And you've sent some images to us.
00:16:39.480 I'd like to walk through it.
00:16:40.420 So this is a snapshot of your screen.
00:16:43.860 It says table of contents.
00:16:45.820 And maybe we can zoom in a little bit here.
00:16:50.460 This is the table of contents that is the rule book by which you and the 1,500 censors would operate.
00:16:59.180 Is that right?
00:17:00.460 That's correct.
00:17:01.120 I mean, in addition to the existing policies that we have, like the hate speech policy, this gives additional guidance on how to monitor and action things related to the election.
00:17:10.880 All right.
00:17:11.160 Well, let's start going through them.
00:17:12.260 So here's a close-up of it.
00:17:14.820 Election overview, voting information, prominent candidates, expected violations.
00:17:22.620 What's an expected violation?
00:17:24.460 You can see country named Canada, voting dates, October 21.
00:17:30.220 What kind of expected violations were they looking for?
00:17:34.460 Yeah, expected violations, that's kind of a funny phrase.
00:17:37.040 If you think about it,
00:17:37.660 it reminds me of the movie Minority Report, where people are punished for crimes before they're committed.
00:17:42.260 Yeah, when they talk about expected violations, whether it's Trump's State of the Union speech, where we're expected to see hate speech violations, or in this example, these are the expected violations: hate speech, fake accounts, impersonation.
00:17:59.560 So there are some things that are legitimate, like voter fraud.
00:18:01.960 For example, some people would say on social media, oh, you know, Democrats vote on the following Tuesday or give wrong information about the voting dates.
00:18:11.080 So that's legitimate.
00:18:11.940 But, yeah, hate speech always seems to be framed as attacking people on the left and coming from people on the right.
00:18:22.040 So it just kind of goes along with that whole media slant that all right-wingers are racists.
00:18:29.860 It seems to reinforce that.
00:18:33.000 Disinformation about where and when to vote.
00:18:34.900 I happen to agree that that should be mopped up.
00:18:37.940 That's probably good practice.
00:18:40.180 Impersonations should probably be mopped up, good practice.
00:18:43.260 But I remember when we talked to you earlier this summer, you pointed out an example of hate speech.
00:18:47.920 You said that if a conservative were to call a left-winger a feminazi, which is sort of a harsh word for an overbearing feminist, that's hate speech.
00:18:58.040 But if a leftist calls a conservative a plain old Nazi, that's acceptable.
00:19:05.020 To me, that's the perfect example of the double standard.
00:19:07.740 You can call a right-winger a Nazi, which, by the way, as a Jew, I find that trivializes the word Nazi.
00:19:14.600 If you call everyone you don't like a Nazi, it starts to lose meaning.
00:19:17.680 But you're not allowed to call a feminist a feminazi because that's off the table.
00:19:21.800 That's an example from your handbook.
00:19:23.860 Am I right?
00:19:25.020 Yeah, that's a perfect example.
00:19:26.200 And another example I didn't mention, similar to that: if I say on Facebook, Ezra is a Trump Humper, and you report it directly, that stays up.
00:19:39.160 But if I call you a snowflake, that gets taken down.
00:19:43.780 The word snowflake gets taken down.
00:19:45.720 Snowflake gets taken down.
00:19:46.460 But Trump Humper stays up.
00:19:47.780 Yeah.
00:19:48.280 Now, I've never heard the word Trump Humper before.
00:19:50.720 Frankly, it sounds a little funny.
00:19:52.460 It sounds like the kind of thing a child would say.
00:19:54.700 It doesn't hurt my feelings.
00:19:56.020 If someone calls me a Trump Humper, I'd sort of say grow up.
00:19:59.580 But the word snowflake is even less hurtful. I mean, snowflake, it's not even a mean word.
00:20:08.520 And you're telling me that Facebook's rule is you're a conservative who calls a liberal a snowflake.
00:20:15.740 That will be deleted.
00:20:17.800 Is that what you're saying?
00:20:19.100 That's correct.
00:20:19.840 Yeah, that's correct.
00:20:21.040 So once again, the double standard.
00:20:22.820 Yeah.
00:20:23.340 All right.
00:20:23.760 Well, those are generic issues.
00:20:25.300 Feminazi, snowflake, Trump Humper.
00:20:27.360 I'm sorry.
00:20:27.940 I can't help but chuckle saying that.
00:20:29.660 But let's get serious for a second because here in Canada, our government says, oh, we've got to be on alert for election hacking by foreign enterprises in, you know, Russia or China or whatever.
00:20:40.680 But actually, what you're here to say is that you and 1,500 of your colleagues in Phoenix were overtly, explicitly and without, you know, hiding.
00:20:51.480 You were censoring things in the Canadian election that Facebook told you to.
00:20:55.500 And you're a foreigner and Facebook's a foreign company.
00:20:59.620 Phoenix is a foreign city.
00:21:01.780 You were doing it.
00:21:03.720 That's correct.
00:21:04.720 Thousands or hundreds of thousands of times.
00:21:07.680 Yeah, that is correct.
00:21:08.680 And, you know, on one of the pages of the training deck, under expected violations, there are some slides about hate speech towards Jagmeet Singh.
00:21:21.780 So they're basically explaining that Jagmeet Singh is protected under the hate speech policy because of his physical appearance, because he's wearing a head covering.
00:21:34.060 So, you know, the hate speech policy protects against attacks on someone's religion.
00:21:38.460 So it's kind of a gray area, because if you're attacking him as a candidate and you have a picture of him, technically, any attack on him would be considered an attack against his religion because he's always wearing that head covering.
00:21:52.700 So his religion is basically one and the same with him as a person.
00:21:56.240 So as a content moderator, it's very easy
00:22:00.280 to interpret an attack on him as a person as an attack on his religion.
00:22:05.440 Well, that's a very interesting distinction.
00:22:08.500 I mean, he does wear his turban.
00:22:10.720 He is very proud of it.
00:22:12.320 I happen to think he's very fashionable.
00:22:14.300 I actually enjoy seeing his turbans.
00:22:17.140 He takes some fashion sense to it.
00:22:19.240 Sort of cool, actually.
00:22:21.140 Yeah.
00:22:22.420 But.
00:22:24.400 I've got 100 criticisms of Jagmeet Singh that have nothing to do with his turban.
00:22:28.340 What you're saying is, if I were to make one of these sharp criticisms of Jagmeet Singh, I think he's an empty suit.
00:22:35.800 I think he's not well-read.
00:22:37.720 I think he plays footsie with violent extremists.
00:22:42.140 I don't think there's any doubt about that.
00:22:43.940 I think he's a socialist.
00:22:45.400 Those are all things I truly believe have nothing to do with his ethnicity.
00:22:49.160 If I say those things, but it's associated with a picture of him, if there's a turban in the picture, which there always is because he always wears one, that, by definition, is hate speech because a Facebook censor can say, ah, it's linked to his picture of him in a turban.
00:23:06.760 That's hate speech.
00:23:07.380 Is that what you're saying?
00:23:08.060 Did I understand you right?
00:23:09.680 Yeah.
00:23:10.060 And we look at each job case by case.
00:23:13.180 So if it's clear that it's just a political attack, we would interpret it as such, but there is a part of the hate speech policy that's visual hate speech.
00:23:21.020 So if I have a cartoon character of someone with a turban and I say, that person's dumb, like with the caption, that person's dumb, and it's just a visual of a person with a turban, that gets deleted for hate speech because that visual of the person wearing a head covering would signify that person's religion.
00:23:39.600 So they do try to separate the nuances of attacking a candidate versus a religion.
00:23:47.240 But here, what it's really doing is it's training us as content moderators to look out for things.
00:23:52.340 I mean, it gets kind of old because we see hundreds of pieces of content a day.
00:24:00.300 So they're letting us know, hey, be on the lookout for any attacks about Jagmeet Singh.
00:24:04.020 So we may ignore other attacks, more nuanced attacks on other candidates, but because Facebook's highlighting this, it's going to have the impact of giving additional protections to Jagmeet Singh because we're more aware of it.
00:24:17.660 When we see thousands of pieces of content a day, Facebook wants us to make sure that we're deleting any attacks against Jagmeet Singh.
00:24:25.300 We're talking with Ryan Hartwig, who was a contractor censor with a company called Cognizant, who had a three-year, did you say $200 million censorship?
00:24:35.740 That's a staggering $200 million censorship contract.
00:24:39.980 You make a good point because in the last election, one of the candidates, the conservative candidate, who happened to be a Christian white male, so very boring, Facebook wouldn't really care, except his Christianity and other parts of his identity were regularly attacked, not just on Facebook, but in the mainstream media.
00:25:00.120 What you're saying is only Jagmeet Singh was singled out for special protection, and it was so vigorous that even calling him dumb, if there was a picture of him with a turban, that religiousified the whole thing.
00:25:14.880 So that was off-limits, but you were obviously never instructed to protect Andrew Scheer for his Christianity or to protect other candidates for attacks on them.
00:25:27.040 I mean, I suppose if there was something just crazy over the top, you might do it, but you were not told to watch out for anti-Christian attacks on Andrew Scheer, right?
00:25:36.540 No, no, we're not told to look out for that.
00:25:38.500 And we see that double standard between the Christianity and also, for example, Islam in other examples.
00:25:44.300 And I'll send you a PowerPoint as well.
00:25:45.980 I've been speaking about this in Brazil because censorship is very important for them as well.
00:25:50.620 And, for example, we would see very many memes, sexual memes about Jesus Christ on the cross that were allowed at Facebook.
00:26:00.380 But on the contrary, if you have a meme about Muhammad, a sexual meme with a goat, which, of course, is not cool.
00:26:10.820 I abhor any kind of hate speech towards Muslims.
00:26:13.620 But those memes about Muhammad with goats and a sexual nature were deleted.
00:26:19.500 So the sexual meme of Jesus is allowed, but a sexual meme of Muhammad is not allowed.
00:26:24.840 I think that's an important point here.
00:26:26.440 I don't think you or I are pro-abuse, pro-hate or anything.
00:26:31.600 I mean, but what we're talking about here is what is allowed by Facebook and what is deleted by Facebook.
00:26:37.460 And, I mean, I myself would not attack Jagmeet Singh's religion or his turban.
00:26:43.940 Neither bothers me.
00:26:45.120 In fact, I'm sort of pro-Sikh.
00:26:46.620 I know their important role in the British Empire.
00:26:50.720 I think that Sikhs disproportionately are patriotic fighters for the West.
00:26:56.760 I'm pro-Sikh, to be honest.
00:26:59.060 I don't like anti-Sikh animus.
00:27:02.240 But it's the selectivity.
00:27:04.980 It's the bias.
00:27:05.680 You can't mock a turban, but of course you can mock a cross.
00:27:10.860 You can't mock Muhammad in a graphic way.
00:27:13.820 And I wouldn't advocate for that.
00:27:16.620 But, hey, have at it with Jesus.
00:27:17.900 Especially, I'm sure, that great Christ the Redeemer statue in, I think that's in Brazil.
00:27:23.920 I can only imagine that was a symbol for tremendous anti-Christian hate.
00:27:29.880 But you're saying that specifically was exempted by Facebook.
00:27:32.600 Yeah, so I came across this a lot when I viewed memes in Mexico and Latin America.
00:27:39.460 They had a lot of memes mocking Catholicism, mocking the Catholic religion.
00:27:43.820 Some of them were pretty graphic.
00:27:45.420 Like cartoon imagery with Jesus in a very explicit sexual pose with another individual.
00:27:51.200 And often with, like I say, very, very graphic sexual imagery.
00:27:55.420 And it was allowed.
00:27:56.960 And it's interesting because there's a part of the bullying policy that mentions blasphemy.
00:28:02.960 Like blasphemy of a religion, attacking followers of a religion.
00:28:06.500 And it's not enforced.
00:28:07.880 So we protect, yeah, the hate speech policy protects followers of a religion.
00:28:14.940 So if I attack Mormons or Christians, if I say all Christians are horrible people, that would be taken down.
00:28:22.680 But it's weird because those sexual memes of Jesus are allowed for some reason.
00:28:27.640 Well, it's, I mean, we know what the reason is that Silicon Valley and Facebook, they are not just atheists, they're anti-religious, at least when it comes to Christianity.
00:28:39.060 Now, let's get back to the chart for Canada.
00:28:42.480 There was, I saw briefly there that calls for assassination of Trudeau.
00:28:47.520 I support the idea of deleting those, whether it's assassinating Trudeau or anyone.
00:28:54.960 I think we should take those down.
00:28:57.200 I don't think that was a hot death threat that would rise to a criminal charge of uttering a death threat.
00:29:03.860 I know the test for that, it has to be imminent.
00:29:06.480 It has to be credible.
00:29:07.520 It has, you know, I don't think that that is of a criminal nature.
00:29:10.880 But I think it's good hygiene for Facebook to take that down.
00:29:13.820 So I won't quarrel with that.
00:29:14.840 But there's a next screen.
00:29:17.880 Let's look at the next one here.
00:29:20.780 Here it is.
00:29:22.880 I'll just read it.
00:29:24.200 Look closely at this photograph.
00:29:26.360 And it's people crossing into, I think this is Roxham Road.
00:29:30.080 Do these people, do these look like refugees?
00:29:33.360 Or are they opportunistic leeches coming to take advantage of Canadian kindness?
00:29:37.800 Now, the word leeches, a leech is an animal.
00:29:40.100 And we don't like to dehumanize people.
00:29:41.700 And I myself would not use that word.
00:29:45.540 But if someone is coming from New York State to Canada claiming they're a refugee and they've got a phone that's more expensive than mine, I would probably communicate this in a less dehumanizing way.
00:29:56.640 The word leeches is a prickly word that I wouldn't use myself.
00:30:01.500 But the idea that this would be banned and that Facebook censors would be told explicitly to stop anti-immigration memes.
00:30:11.480 Why don't you talk a little bit about that?
00:30:13.900 Yeah.
00:30:14.040 So I think you were going to say the word dehumanizing and then that's the right word for it because and that's actually the word, the policy language.
00:30:20.300 So the hate speech policy in tier one describes, you know, that any language that's dehumanizing towards a certain group, be it nationality, religion, sexual orientation is not allowed.
00:30:33.540 So that's the reason why we would be deleting it is because it's a comparison to animal, which is dehumanizing.
00:30:40.580 But it brings into question this greater debate of what should be allowed or should we be allowed to discuss immigration and politics?
00:30:51.040 And so calling someone an opportunistic leech, I mean, it's describing their behavior, but it's within the larger debate of immigration.
00:30:58.440 And so Facebook is essentially saying you can't have discussions about immigration, about things that are costing you money on a daily basis as a citizen.
00:31:07.820 It can be a burden on social programs, on welfare.
00:31:13.840 And so Facebook's taking a stance and saying, hey, we can't call immigrants or refugees certain names.
00:31:19.560 But this one is more of a gray area because, OK, you're calling someone an opportunistic leech.
00:31:26.000 Yeah, it can be dehumanizing. It depends how you define it.
00:31:29.340 Is it subjective? It sure is. It's very subjective.
00:31:32.500 Yeah. I mean, to call someone we use animal metaphors all the time.
00:31:37.140 He's a pig at the trough. Those politicians are pigs at the trough.
00:31:41.200 Oh, he he lied. He's a snake.
00:31:43.400 We do use animal metaphors all the time.
00:31:46.760 And it was clear that the leech was a reference to, you know, profiting off a system, taking funds, taking free health care, taking taxation.
00:31:58.020 I don't think it was you as a human are like vermin.
00:32:02.460 Like, I know that that can be very dehumanizing.
00:32:04.860 You are vermin. You are less than human.
00:32:07.320 I don't think that was saying you're less than a human.
00:32:09.680 I think that phrase there, and again, I wouldn't choose to use it, but I think that it's saying, oh, you're just coming to take advantage of our free health care and lax immigration laws.
00:32:19.460 What troubles me again, Ryan, is I'm not going to bat for that language, which I don't think I would use myself.
00:32:25.220 But of all the hundreds of issues, the ones that are banned are the ones that conservatives care about.
00:32:32.820 Liberals are allowed to say anything about their free issues, their favorite issues.
00:32:37.320 But it's only the conservative issues that have these warnings on them.
00:32:41.580 Am I right?
00:32:42.820 Yeah, that's very, very accurate.
00:32:45.080 The policy is designed that way.
00:32:47.640 And to the simple observer, it wouldn't be clear.
00:32:50.960 But I mean, this is coming from someone who studied the policy for two years.
00:32:55.160 And even after two years, a lot of it was still very confusing.
00:32:58.720 But yeah, it's so nuanced.
00:33:00.080 It wouldn't be readily apparent.
00:33:01.640 But yes, many of the policies, for example, during Pride Month, allowing topless protests by females.
00:33:13.660 It's one example.
00:33:15.500 And allowing attacks on straight white males, allowing them to be called filth for not supporting LGBT.
00:33:20.000 That's another great example.
00:33:22.340 Now, I just want to show a couple more slides from your slide deck because it's so important.
00:33:26.900 I know we've kept you here longer than we originally planned, but it's so interesting.
00:33:31.240 Let's put up some images and I'll just.
00:33:35.200 There was one on Greta Thunberg.
00:33:37.580 And Greta is not a Canadian story.
00:33:39.860 She did come to Canada during the Canadian election.
00:33:43.560 And during the election season, that is.
00:33:47.800 Yeah.
00:33:48.100 And here, this is a screenshot.
00:33:51.080 This was after the election that this policy came into effect.
00:33:54.020 I see it says start date, December 2019.
00:33:57.120 Remove instances of attacks aimed at Greta Thunberg.
00:34:02.400 Now, again, this phrase, retard or retarded.
00:34:06.700 I don't like those words myself, but it is a fact that she has mental illness.
00:34:12.580 She talks about it.
00:34:13.740 It's part of her backstory.
00:34:15.560 She gave a TED talk where she outlined her mental illness.
00:34:18.560 And in fact, last I checked, it's in her Twitter biography that she has autism.
00:34:24.620 So she plays that card and she uses it as an excuse for when she comes across very weird.
00:34:31.380 I don't I don't think the word retarded should be a go to insult.
00:34:36.020 But I don't see that protection when it's used against people like Barron Trump, who has had every awful name hurled at him.
00:34:46.520 And he's not much older than Greta.
00:34:48.500 Why don't you speak to who are the preferred and protected young people in the world of Facebook?
00:34:54.620 Would Barron be protected?
00:34:56.780 Would the daughters of political candidates on the right be protected?
00:35:02.920 Daughters and sons.
00:35:04.220 Tell me about that.
00:35:05.000 Yeah, I mean, by definition, technically, yes, they would have protection.
00:35:10.540 But this is carving out a specific exception to the policy.
00:35:15.720 So the policy for minors
00:35:17.900 distinguishes between public minors who are voluntarily famous and involuntarily famous.
00:35:23.080 She's voluntarily famous.
00:35:24.580 She's on her own, you know, voluntarily.
00:35:27.740 She's going public.
00:35:29.080 She might have some influence from her parents, unfortunately.
00:35:30.840 But yeah, so she is a public figure.
00:35:35.760 Now, any public figure who's a minor, including Barron Trump, you can't talk about them sexually, and so they have more protections than, let's say, a public figure who's an adult.
00:35:47.080 So they already have a lot of protections.
00:35:50.720 Now, calling Barron Trump retarded would be allowed.
00:35:56.920 Any minor public figure, that's an allowed attack.
00:36:01.040 Any minor under 18, you can call them that.
00:36:04.500 So this is saying, hey, Greta Thunberg's special.
00:36:07.700 We're making a policy exception.
00:36:10.780 And the screenshot you're looking at is a list of other exceptions they've made.
00:36:13.680 They had to start documenting the exceptions they made after we had the civic audit from former Senator Jon Kyl and the Covington law firm.
00:36:21.940 So, yeah, this is giving Greta Thunberg additional protections beyond what's already stated in the policy.
00:36:27.080 And what's fascinating is, when this happened, we got jobs like this all day.
00:36:34.220 For about a week,
00:36:36.580 I probably had 50 to 60 jobs a day with this phrase that I had to delete.
00:36:41.320 So they prioritized it.
00:36:43.120 Facebook injected what's called a proactive pull.
00:36:47.020 They injected those phrases or classifiers into our queue.
00:36:50.400 And they did it so strongly that even on our own internal messaging board, where we had posts discussing the word retarded, the filter picked those posts up and pushed them into our queue to delete.
00:37:04.060 Of course, we can't delete our own posts.
00:37:06.920 But this is the priority of Facebook.
00:37:08.920 So instead of deleting other things, Facebook is prioritizing deleting retarded.
00:37:15.220 That's incredible.
00:37:16.240 I see here there's some days you were taking down three, four hundred.
00:37:19.040 Let me ask you about that, because I've kept you so long here, but I could talk about this all day.
00:37:25.080 I think you mentioned the Covington kids, the difference between how Nick Sandmann, that young man who was just smiling in the face of an aggressive racial activist in Washington, was demonized in a hundred ways that would never happen to Greta.
00:37:39.960 You said you were getting 50 to 60 of these Greta censorships a day.
00:37:44.400 And that's out of what I think you told me, 200 a day.
00:37:47.120 Let me ask you one last question.
00:37:48.960 It's about Canada.
00:37:49.760 There's so much interesting here to people all around the world, but I don't think any other Canadian media has talked about this.
00:37:55.540 Even though we put your video up in public and we circulated it, no other Canadian media has contacted you, have they?
00:38:03.140 No, no, they have not.
00:38:04.420 I find that pitiful and predictable, but we're happy to have the exclusive here.
00:38:10.220 So if you were doing at one point 50 to 60 censorships per day to protect Greta Thunberg, and there were 1,500 staff at the Phoenix location of Cognizant, I don't know if that would be applicable to all 1,500 people, but that's a staggering number of censorships a day.
00:38:29.360 How many censorships per day or per week?
00:38:32.980 I'm calling them censorships, like a deletion or a little mini trial.
00:38:37.700 I don't know what you call them, an action.
00:38:39.900 During the Canadian election, your shop in Phoenix, do you think you did 100, 1,000, 1 million?
00:38:48.900 Like if your shop alone was doing 300,000 censorship moments per day, all told, could you roughly estimate how many times you and your colleagues censored something to do with the Canadian election?
00:39:06.680 I mean, if I were to guess, it wasn't super prevalent.
00:39:09.680 There was probably a week when I had more content, probably leading up to the election where I had more Canadian content.
00:39:16.100 So it wasn't all the time.
00:39:17.540 I would say in that week before the election, let's just say during those two weeks, I would say maybe 10 a day, 10 to 15 posts a day.
00:39:29.500 So a conservative estimate would be 10 a day times 10 days, so 100 during that two-week period, times... what's 100 times 1,500?
00:39:41.280 I mean, we're talking about, you probably, you could say between 50,000 and 100,000 posts.
00:39:47.400 It's very, very possible.
00:39:48.980 That's probably a conservative estimate.
00:39:50.680 Conservative estimate of 50 to 100,000 posts regarding the Canadian election were censored.
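For what it's worth, the back-of-envelope arithmetic in this exchange can be written out. A straight multiplication of the guest's stated figures gives 150,000, so the speakers' "conservative" 50,000 to 100,000 range is evidently rounded down, presumably to allow for moderators who saw little Canadian content; this is their on-air estimate, not a documented count:

```python
# The speakers' back-of-envelope estimate for the Phoenix office alone,
# using the guest's own figures as stated on air.
posts_per_moderator_per_day = 10   # guest's low figure ("10 to 15 posts a day")
days = 10                          # roughly the two weeks before the vote
moderators = 1_500                 # staff at the Phoenix office

total = posts_per_moderator_per_day * days * moderators
print(total)  # 150000

# Straight multiplication gives 150,000; the speakers settle on a
# "conservative" 50,000-100,000 range, evidently discounting for
# moderators who saw little or no Canadian content.
```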
00:39:56.020 Were there any other offices either by Cognizant or other Facebook censorship contractors that you know about that covered Canada?
00:40:05.200 Or were you the only office that covered Canada?
00:40:09.080 That's a good question.
00:40:10.040 I know Facebook had other third-party contractors that were doing content moderation.
00:40:14.440 It's very possible they did that as well.
00:40:16.840 As far as I know, there was no exclusive group that worked on Canada.
00:40:21.040 The group that I was a part of at Cognizant that was exclusive was the Spanish content moderation for Latin America.
00:40:27.960 We were the only ones in the U.S. doing that.
00:40:29.960 But I think there were a number of other companies in the U.S.
00:40:33.780 I mean, we had the Cognizant office in Phoenix.
00:40:35.440 There was also a Cognizant office in Tampa, Florida.
00:40:38.080 So at least those two that I know of.
00:40:40.940 But there's another one called Genpact, out of Dallas, Texas, that also had a content moderation contract with Facebook.
00:40:49.920 Your office alone, 50,000 to 100,000 posts, and you were aware of two other offices; as far as you know, you didn't have exclusivity on Canada.
00:40:59.840 I can understand for language reasons why Phoenix would be a place to censor Latin America.
00:41:05.460 There's a lot of Spanish speakers.
00:41:07.120 But English language censorship was also happening in Dallas and Tampa.
00:41:10.920 And as far as you know, they would have been doing English language censorship for the Canadian elections, too.
00:41:17.320 You have no reason to doubt that they were doing the same work.
00:41:20.660 Yeah, it's just very, very possible.
00:41:22.280 I mean, it was kind of a mixed bag.
00:41:23.680 Like, we would get just kind of a random assortment of jobs in our queue.
00:41:28.440 But, yeah, all the North American content moderation companies, it was fair game.
00:41:34.600 Like, we all got a lot of the same content.
00:41:37.100 Yeah.
00:41:37.200 So if those other offices were operating at the same pace as you, if you estimate between 50,000 and 100,000 in your shop, so let's split the difference, call it 75,000 times three offices, it's not out of bounds to estimate that up to a quarter million Canadian Facebook posts were censored in the last two weeks of the 2019 federal election, 200,000 to 250,000 posts.
00:42:03.940 Is that a reasonable estimate based on what you know?
00:42:07.700 Yeah, I'd say that's reasonable.
00:42:09.040 And that's not taking into account the AI that Zuckerberg testified about, about a month ago, that automatically deletes posts before they're even posted.
00:42:18.220 And that has, like, an 89% deletion accuracy.
00:42:23.680 So, yeah, it's very – I think that's a nice rough estimate, maybe a quarter million posts regarding the Canadian election.
00:42:32.120 Like I said, I didn't have access to those numbers, and we're extrapolating here based on estimates.
00:42:36.920 But, yeah, I think that's fairly accurate.
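And the closing extrapolation, again using only the speakers' rough on-air estimates:

```python
# Extrapolating the Phoenix estimate across the three offices named on air
# (Phoenix, Tampa, Dallas). All inputs are the speakers' rough guesses,
# not documented counts.
per_office_low, per_office_high = 50_000, 100_000
per_office_mid = (per_office_low + per_office_high) // 2  # "split the difference"
offices = 3

print(per_office_mid)            # 75000
print(per_office_mid * offices)  # 225000 -> "call it 200,000 to 250,000 posts"
```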
00:42:38.400 Last question, I know I've said that twice now.
00:42:41.820 The AI system that you say has a high accuracy, would that have handled an even greater number or a lesser number?
00:42:49.780 You say about a quarter million posts handled by humans in Dallas, Tampa, and Phoenix.
00:42:56.160 Would the AI being able to work on the Canadian election, would it have had significant numbers as well?
00:43:03.040 Yeah, I can't really say how many posts may have been deleted by the AI.
00:43:10.900 But, you know, posts such as the one that we saw about, you know, calling immigrants – or speaking out about immigration and the effects of immigration, posts such as those could have been deleted without anyone knowing.
00:43:24.400 Like before you even attempt to post, it would get taken down.
00:43:27.060 So it's really hard to estimate how many get taken down.
00:43:31.400 Yeah, I'm not sure.
00:43:33.320 Got it.
00:43:34.340 Just, do you have any idea of the scale of the AI side of things?
00:43:38.240 If the human side was a quarter million interventions, would it surprise you, or would you be hesitant to say that an equal number or more was done by artificial intelligence?
00:43:49.820 No, it wouldn't surprise me at all.
00:43:51.280 So when they hired us, they told us that our job as content moderators was to train the AI.
00:43:56.020 So with certain imagery, for example, imagery of cleavage or, you know, a woman in a bikini,
00:44:05.020 we would mark it with a certain label so that the AI could be trained.
00:44:08.700 And so that way you could filter your settings later in Facebook if you didn't want to see certain types of content.
00:44:13.540 We were training the AI to do so.
00:44:15.460 So I think the overall goal was to turn over more content moderation to the AI and allow that to do our jobs.
00:44:22.780 So I wouldn't be surprised at all if the AI did more deletions than us.
00:44:28.040 Isn't that incredible?
00:44:28.880 So for all we know, the quarter million acts of censorship on the Canadian election taught this AI well.
00:44:39.160 And as far as we know to this moment, the AI is censoring in real time hundreds of thousands, perhaps millions, potentially, of Canadian political posts.
00:44:50.560 And we simply wouldn't know.
00:44:52.600 Yeah, it's very possible.
00:44:53.740 And, you know, an AI can't strap a camera to itself like I did and uncover this corruption and abuse.
00:45:01.600 Incredible.
00:45:02.440 Well, Ryan Hartwig, you've been very generous with your time.
00:45:05.120 I really appreciate you talking to us again.
00:45:07.600 We will do our best to promote this news and share it with our viewers and widely.
00:45:13.300 But I believe my earlier question to you, which you answered: no other Canadian media cared.
00:45:20.260 I predict that will be the same case here.
00:45:22.820 Only Canadians care about this story, not the Canadian media.
00:45:27.420 I think, rather, they sort of agree with the censorship.
00:45:31.260 And that's the sad state of affairs up here.
00:45:32.980 We wish you good luck.
00:45:34.100 And we hope we can continue to talk with you, especially if you have more information you can share in the future about our country.
00:45:40.640 Yeah.
00:45:41.100 Thanks for having me on, Andrew.
00:45:42.300 It's always a pleasure.
00:45:43.100 Well, thank you very much.
00:45:44.080 It's been very educational.
00:45:45.160 Well, there you have it.
00:45:46.320 Ryan Hartwig, who for years was a censorship contractor with Cognizant, a Facebook censorship company based in Phoenix, Arizona.
00:45:55.540 Stay with us.
00:45:56.320 More ahead on Rebel News.
00:45:57.440 Hey, welcome back.
00:46:09.920 On my monologue last night, Paul writes,
00:46:11.920 The supposed journalists calling for the censorship of other journalists are not journalists.
00:46:15.800 Yeah.
00:46:16.160 That's my whole chickens for Colonel Sanders thing, isn't it?
00:46:20.040 Rush writes,
00:46:20.740 Yeah, look, I mean, where are the Canadian Association of Journalists when they should be fighting for journalists?
00:46:36.380 I just don't see them in the battlefield for free speech.
00:46:39.360 And I think I would know because I'm out there, along with True North, for example.
00:46:43.020 We both fought against Justin Trudeau's censorship in the Elections Commission.
00:46:46.980 It's funny that the CAJ claimed they were on our side there.
00:46:50.060 But I checked with our lawyer.
00:46:51.500 I said, Did you hear anything from them?
00:46:53.880 Because I don't recall it.
00:46:54.880 Did you see a statement, even a tweet?
00:46:56.980 I know they weren't there in court, and he didn't hear.
00:46:58.780 I think that they have been completely rented.
00:47:02.520 I used the word concubine last night.
00:47:04.820 That's a fancy way of saying prostitute.
00:47:07.260 You know, my friend Pamela Geller uses the phrase presstitute.
00:47:11.140 Oh, I don't like to say that.
00:47:14.000 But where's the error?
00:47:16.800 On my interview with Manny Montenegrino, O.B. writes,
00:47:19.140 Trudeau and Morneau have thrown Canada under the bus.
00:47:22.240 Yeah, I thought that was a good point by Manny.
00:47:24.520 Boy, what an ugly, ugly thing.
00:47:26.700 But think about it.
00:47:27.660 Trudeau, on his own terms.
00:47:29.240 Like, I mean, I don't like Trudeau.
00:47:30.420 I don't like any of his team.
00:47:31.680 But let's say you take Trudeau at face value.
00:47:34.220 Here's my winning team.
00:47:35.640 Jane Philpott, regarded as one of the best cabinet ministers in Trudeau's cabinet.
00:47:40.860 Sacked from the health ministry because she was too ethical.
00:47:45.140 Jody Wilson-Raybould, sacked.
00:47:47.280 Gerald Butts, sacked for ethics problems.
00:47:50.280 Bill Morneau, sacked.
00:47:51.840 Don't believe his resignation.
00:47:53.780 Clerk of the Privy Council, sacked.
00:47:55.860 Don't believe it was a resignation.
00:47:57.480 So the top people all around Justin Trudeau, by his own description,
00:48:02.860 had been run out of town for ethics violations.
00:48:05.520 And now we see Katie Telford, his chief of staff,
00:48:08.740 and her husband's involvement in things too.
00:48:10.900 Um, this government's being gutted.
00:48:13.920 And yeah, uh, Chrystia Freeland is a high-energy chatterbox.
00:48:17.720 She's not an expert in anything.
00:48:19.740 And simply saying, oh, Chrystia will be finance minister
00:48:22.380 and intergovernmental affairs minister, and, and, and.
00:48:25.320 Governments don't work that way.
00:48:26.720 And this one is not working well at all.
00:48:28.100 We're in the middle of a crisis.
00:48:29.640 And the finance minister just quit because of a quarrel with the prime minister.
00:48:34.060 Oh, well.
00:48:35.300 Hey, what do you think of our interview with that Facebook censor?
00:48:38.380 Shocking.
00:48:38.780 Even more shocking is, no one else in Canada in the CAJ journalist world seems to care.
00:48:46.660 That's our show for today until tomorrow.
00:48:48.740 On behalf of all of us here at Rebel World Headquarters to you at home,
00:48:51.340 good night and keep fighting for freedom.