Rebel News Podcast - July 04, 2020


You can’t open your business — but Canada’s top doctor says millionaire baseball players can


Episode Stats

Length

42 minutes

Words per Minute

168.6

Word Count

7,238

Sentence Count

521

Misogynist Sentences

6

Hate Speech Sentences

11


Summary

You can't go to church or synagogue, and you can't open your business, but Canada's top doctor says millionaire baseball players can. What's an essential service in a pandemic? Who gets to decide? And why would a politician get involved?


Transcript

00:00:00.000 Hey Rebels, today is one of the most important shows. Holy cow. I have a monologue about masks
00:00:05.840 and the pandemic and the special exemption for the Blue Jays. You know, one rule for
00:00:11.300 the fancy people, one rule for the rest of us. But I think the heart of today's show is a lengthy
00:00:16.700 interview with one of Facebook's censorship contractors who worked in a censorship army
00:00:23.300 of 1,500 people working around the clock, each of them blocking about 200 Facebook posts a day.
00:00:32.780 That's just such a staggeringly large number. And they had a special target painted on Canada
00:00:39.400 during the election. It's news. It's a bombshell. And I get into it in today's podcast. I'm very
00:00:47.200 excited. Hey, one of the reasons you're hearing about that is because we're independent, because
00:00:51.880 we don't take money from Justin Trudeau. You're just not going to hear news like this
00:00:55.240 at a media outlet that takes money from the Liberals. So to keep going, we rely on contributions from
00:01:02.260 listeners and viewers. I'd appreciate it if you went to rebelnews.com and signed up for a
00:01:07.100 subscription. It's eight bucks a month. That's less than Netflix or 80 bucks for the whole year.
00:01:12.620 You get the video version of this podcast, plus a couple other shows, Sheila Gunn Reid and David
00:01:17.240 Menzies. And it helps us give you the kind of news I know for a fact you can't find
00:01:21.480 anywhere else. That's all at rebelnews.com. Okay, here's the podcast with my monologue
00:01:26.940 and then a blockbuster interview.
00:01:33.460 Tonight, you can't go to church or synagogue. You can't open your business. But Canada's top
00:01:49.580 doctor says millionaire baseball players can. It's July 3rd, and this is the Ezra Levant Show.
00:01:56.920 Why should others go to jail when you're the biggest carbon consumer I know?
00:02:00.660 There's 8,500 customers here, and you won't give them an answer.
00:02:04.740 The only thing I have to say to the government about why I publish it is because it's my bloody
00:02:09.620 right to do so.
00:02:10.580 What's an essential service during the pandemic? Funny enough, the government says Rebel News is
00:02:20.940 an essential service. We're a media company and an internet company. Both were deemed essential by
00:02:27.320 the government a few months ago when they put everyone else under a kind of house arrest. I
00:02:31.200 appreciate the flattery, but the guy in the office next to us, he's in the clothing business. Why isn't
00:02:36.040 he essential? His business sure is essential to him and his family and to his employees and their
00:02:42.420 families and to his would-be customers. Not that you couldn't buy clothes these last few months. You
00:02:47.740 just had to buy them from, say, Walmart or other retail giants that were allowed to open. Funny how
00:02:52.880 they were allowed to stay open as essential services, but my neighbor wasn't. If you think that's
00:02:59.500 stupid, there are places where it got even stupider these past months. Here's a photo from a Walmart
00:03:04.440 in Michigan. The Democrat governor there said you could go to the megastores, but you could only buy
00:03:09.960 items in the store that were deemed essential. You would be right there in the store anyways,
00:03:15.260 and that was apparently medically fine. But the police would arrest you if you bought things from
00:03:21.080 this aisle that weren't essential as opposed to that aisle that were. Bizarrely, seeds for growing food
00:03:28.040 were considered non-essential. Same thing happened in Vermont. Very weird. Does that make any sense? No,
00:03:34.820 it does not. Here's a sign from a U.S. drugstore. Can you please tell me what is or isn't essential in
00:03:40.340 a drugstore? My list of essentials in a drugstore is probably different from yours. Who gets to decide?
00:03:46.340 Well, how about the customer and the storekeeper? Why would a politician get involved? Why would the
00:03:50.060 police get involved? Do you doubt for a second that politicians buy exactly whatever they want
00:03:55.400 whenever they want it? Look, you're in the store anyways. What's the problem here? Is there a virus
00:04:01.000 risk if you buy perfume? Maybe you're cooped up in an apartment for two months and things aren't
00:04:06.000 smelling their freshest. I think this whole "you can't buy certain things" rule is an obvious
00:04:10.940 and weird attempt to say to guys like my neighbor, hey, we're putting you out of business, sure,
00:04:17.540 but we're going to stop some of your competitors from selling your stuff while they're open,
00:04:22.700 so maybe that'll make you feel better about you going bankrupt. It's so weird. It's so stupid.
00:04:29.900 None of it with any basis in law or medicine or science. So here we are in July. Ontario and many
00:04:35.860 other jurisdictions declared a state of emergency on March 17th, which just happened to be St. Patrick's
00:04:40.640 Day. St. Patrick, who is credited with banishing snakes from Ireland. We could have used some of that
00:04:48.200 to banish the virus, don't you think? Instead, we let the virus come right on in. For example,
00:04:54.280 we let the airports remain open to this day. More than a million travelers continue to come into
00:04:58.560 Canada. Flights from China never stopped. Foreign migrant workers, foreign international students
00:05:03.080 continue to come into the country. Still do, but you and I were put under house arrest. Just for two
00:05:08.500 weeks, though, just to flatten the curve, we were told. Two weeks, eh? Just give us two weeks.
00:05:12.880 Well, today is the 108th day of the two-week lockdown. And even though the actual pandemic
00:05:20.720 is pretty much over, politicians are hoping for a second wave. Here's John Tory, Toronto's mayor,
00:05:26.100 saying masks are now mandatory. He didn't say this 108 days ago. He's saying it now in July.
00:05:34.000 Well, good morning. Today, excuse me. Today, City Council will consider a report from Dr. Eileen
00:05:48.880 de Villa to effectively make it mandatory for people to wear masks or face coverings inside businesses
00:05:54.720 or public facilities. Dr. de Villa's recommendation, worked out with city legal after careful consideration
00:06:01.340 of the legal landscape would give clear direction on face coverings to help stop the spread of COVID-19.
00:06:08.340 It's a second wave, not of the virus, but of virus panicking. That's actually what the masks are for,
00:06:13.620 to keep you in a pandemic state of mind, keep you scared. And look, everyone's wearing masks. It's proof
00:06:19.140 that we're still in an emergency. Stop the return to normalcy. Condition you to obey and conform.
00:06:25.680 As if politicians themselves are going to wear those masks in the summer heat when the cameras are off.
00:06:29.900 But look at this. This is actually my news for today that got me going. It's in the Washington
00:06:35.480 Post. Blue Jays granted exemption to train in Toronto. Public Health Agency of Canada spokeswoman
00:06:43.780 Marie-Pierre Burel said that the players and staff have been issued an exemption to the mandatory
00:06:49.840 isolation order on national interest grounds. Oh, so that's a, that's a thing. That's a medical thing.
00:06:58.580 There's essential and non-essential. It's pretty obvious that a game of baseball isn't essential
00:07:04.720 because in Ontario and in Toronto in particular, parks and playgrounds and children's sports are banned.
00:07:11.420 So how do you say that baseball is essential when it's played by millionaires in a league owned by
00:07:18.720 billionaires, but it's banned for the little people? You have to come up with a different thing.
00:07:23.020 You can't say essential or non-essential anymore. So you come up with this medical thing,
00:07:28.480 national interest. I didn't know that was a medical thing. I don't know if they're teaching that in
00:07:33.260 med school, even though it's an American-based league with many American-based players paid in
00:07:39.920 U.S. dollars. So yeah, did you get that guys? Your kid can't play baseball, but foreign millionaires can
00:07:47.580 come up here to play baseball because of the national interest. Now I get it. I'm not dumb. It's big
00:07:53.460 business. So just say so. Just say there's a lot of money on the line and we listen to powerful people
00:07:59.160 with money. Just say so. Don't literally have the public health authority making the announcement.
00:08:03.960 A doctor! It is not a medical decision. It is a political decision. Oh, did I mention
00:08:08.420 the Toronto Blue Jays, which are valued at about $2 billion U.S., are owned by Rogers Communications,
00:08:15.040 a TV and internet company that, oh, wouldn't you know, Toronto's mayor, John Tory, used to run. That's
00:08:19.620 a cozy little coincidence. See, you'll be fined $1,000 for going to the park with your kids, but
00:08:25.860 John Tory's company, Rogers, will get a special legal exemption in the national interest for
00:08:31.660 itself, according to the doctors in charge. Here's the spokeswoman for Theresa Tam. The government
00:08:37.400 of Canada has issued an exemption to the mandatory isolation order on national interest grounds
00:08:42.040 for team members and staff of the MLB, she said. Okay, I got it. I wonder why Theresa Tam didn't
00:08:49.160 tweet that out herself. I wonder why she just told it to the U.S. media. Well, you know,
00:08:54.260 one rule for you, one rule for them. They're special. You're not. Hey, guys, just a few more
00:08:59.940 weeks, okay? Just until we flatten the curve, okay? Just until there are no more cases, okay?
00:09:06.640 Just until we have a vaccine, okay? Just a few more weeks, okay? You don't think a mandatory
00:09:12.260 vaccine is next? I know it is. Look at this. Here's the Toronto Star. Trudeau ordered 37 million
00:09:21.060 syringes. That's enough for every man, woman, child, and baby in this country. I say again,
00:09:26.340 the average annual death toll for pneumonia and the flu is about the same as the total death toll
00:09:31.140 of the pandemic. If you take out seniors' homes, which are a problem on their own, the pandemic has
00:09:35.780 been less deadly than the annual flu. But do you doubt you'll still be on lockdown and still be fined
00:09:44.700 for months to come? The masks are not needed. Now, if they were needed, and Taiwan's
00:09:50.720 experience suggests that they're a good way to stop transmission in close quarters, especially
00:09:54.980 indoors, if they were needed, they were needed two, three months ago when the virus hit its peak.
00:10:02.320 That's done. That's over. Back then, the politicians were saying, no, no, no, no, you don't need masks
00:10:07.380 because they gave them all away to China. The masks now, they're needed for your fear, for your
00:10:14.040 obedience, to keep you in a pandemic state of mind as a public marker for who complies and who does not.
00:10:21.080 Never forget the mindset of the people who are enjoying this pandemic health theater a bit too much.
00:10:27.740 I think the public has to know this is one of the worst case scenarios in terms of an infectious
00:10:33.040 disease outbreak in that their cooperation is sought. If there are people who are non-compliant,
00:10:39.680 there are definitely laws and public health powers that can quarantine people in mandatory settings.
00:10:49.780 It's potential: you could track people, put bracelets on their arms, have police and other setups to
00:10:57.300 ensure quarantine is undertaken. Yeah, as I say in my new book, China Virus,
00:11:02.380 I got it right here. Let me show you. The real danger isn't from the bug. It's from the tyrannical
00:11:09.140 ideology that has used the virus as an excuse. If you doubt me, explain to me why millionaires and
00:11:16.300 billionaires get to play baseball and travel to and from America on private jets, but you can't
00:11:22.140 play ball in a playground and you can't even go to a family funeral across the border. Stay with us.
00:11:29.140 I have a most revealing interview for you. A bombshell exclusive, if you will. That's next.
00:11:34.900 It's inhumane. It's inhumane, but if it's going to save the country, why not do it?
00:11:55.460 I feel like $80 million would do a lot of good.
00:12:02.220 I'm just saying.
00:12:04.460 We should just hand him over. Take the money as a country.
00:12:09.600 That's what I'm saying. If we hand him over, our country would be sick.
00:12:13.160 Just saying.
00:12:13.720 Yeah, I mean, it's a bargain, right?
00:12:22.720 I'm just saying. Take him. Y'all can keep your $80 million or you can give it to us and we can put it in our debt.
00:12:28.640 Like, just save in the U.S. Like, come on. Yeah, that's it. They just want one person.
00:12:36.720 Why not take one for the team?
00:12:40.240 That was a clip from a Facebook whistleblower published by Project Veritas.
00:12:46.780 As you know, James O'Keefe has an organization called Project Veritas that relies on hidden camera footage
00:12:52.720 to tell us the truth about government organizations and increasingly government-like organizations,
00:12:59.720 namely the massive Silicon Valley oligopolies that control so much of our speech.
00:13:06.340 Well, today, Ryan Hartwig, who worked for a Facebook comment moderating company,
00:13:14.080 joins us to talk about what really happens behind the scenes.
00:13:17.520 And Ryan joins us now.
00:13:18.720 Nice to meet you. Tell me a little bit about yourself.
00:13:21.900 You didn't work for Facebook directly.
00:13:23.800 You worked for one of the outsourced censorship companies that Facebook hired, right?
00:13:29.600 That's correct, yes.
00:13:31.380 Once again, my name is Ryan Hartwig. Thanks for having me on.
00:13:33.780 I was a bilingual content moderator for Cognizant, which had the contract with Facebook for content moderation.
00:13:43.140 So I started in March of 2018, and the project ended in February of this year.
00:13:48.260 So I was there for nearly two years.
00:13:49.820 And I noticed many flagrant examples of bias that would, you know, favor liberals and basically censor and target conservatives.
00:13:59.940 Yes.
00:14:00.620 So when you say target conservatives, you're not just speaking metaphorically.
00:14:04.520 Just before we turned the camera on, you told me that there's a hot list or a fire list, where anything referring to these people in a positive way is automatically deleted.
00:14:14.160 And you mentioned that our old friend Tommy Robinson and actually a couple of other rebel alumni, including Gavin McInnes, are on that list.
00:14:21.540 What does it mean when Tommy Robinson, who has never been charged with a hate crime, let alone convicted of one, I know his case pretty well because, you know, we crowdfunded his legal defense when he was charged with contempt of court.
00:14:34.320 But what does it mean within Facebook when someone like Tommy Robinson is on a blacklist?
00:14:41.100 Yeah.
00:14:41.360 So the section of the policy, which I studied and was very familiar with for, you know, I studied it for two years, essentially.
00:14:48.040 The policy is called dangerous individuals and organizations.
00:14:51.640 So within that policy, you have, you know, criminal organizations, cartels.
00:14:56.780 You also have terrorist organizations.
00:14:58.580 And then along with that, you have your hate figure list, which includes, you know, Adolf Hitler, Joseph Goebbels.
00:15:06.520 So on that list, literally right underneath Adolf Hitler, Tommy Robinson is listed.
00:15:11.520 Now, this means that with anyone on that list, there are things you're not allowed to do.
00:15:15.140 It's called PSR.
00:15:16.420 You cannot praise, support or represent that individual.
00:15:20.400 So the only way I can mention him on my Facebook is if I condemn his quotes or his actions.
00:15:27.180 Otherwise, even if I just share his name or anything related to him, it's an automatic delete off the platform.
00:15:35.780 That's incredible.
00:15:36.800 You know, we heard rumors of that.
00:15:38.180 I mean, Facebook has typically been somewhat secretive about it.
00:15:44.340 So are you saying there was an actual list provided to companies like Cognizant?
00:15:48.180 How many people would be on that list?
00:15:53.020 You mentioned Goebbels and the Nazis.
00:15:54.620 Goebbels, of course, was Hitler's propaganda minister in the 30s and 40s.
00:15:59.360 He's dead now, obviously.
00:16:01.320 Right.
00:16:01.920 How many current existing groups?
00:16:04.980 Well, I mean, tell us a bit about that list.
00:16:06.400 I find that fascinating.
00:16:07.480 I mean, take Mao Zedong, the largest killer in the modern era.
00:16:12.060 Is he or his Little Red Book on the list?
00:16:13.900 I think I know the answer.
00:16:14.920 Who else is on the list?
00:16:15.820 Yeah, so there are quite a number of individuals, both present and past, on the list.
00:16:22.000 I would say there's roughly 200 individuals on the list.
00:16:25.560 There's also hate domains.
00:16:28.080 So websites, such as avoiceformen.com, are on that list.
00:16:34.020 There's also InfoWars.
00:16:35.760 And I was on with Alex Jones last year.
00:16:37.840 And when they banned InfoWars, it was an emergency update.
00:16:42.140 The post, the guidance they gave literally said, emergency update.
00:16:46.920 So forget about, you know, the child pornography, beheadings, cartels.
00:16:50.740 Like, Alex Jones for them is an emergency.
00:16:53.600 Huh.
00:16:54.260 Yeah.
00:16:55.680 I mean, we haven't been banned from Facebook ourselves.
00:17:00.140 But of course, we're always under pressure from them.
00:17:05.860 How do they treat conservative sites in the U.S. like Breitbart.com?
00:17:11.060 Did that ever come up when you were working for Cognizant?
00:17:13.760 So there was no specific blacklisting of Breitbart News that I was aware of.
00:17:19.700 And I had access to most of the training material.
00:17:22.720 And because I would be deleting, you know, groups, pages, posts, videos on both Instagram and Facebook.
00:17:30.740 And the same policy applied to both platforms.
00:17:33.900 But I did not see any specific mention of Breitbart.
00:17:35.820 I do know that they have a way to filter what we get.
00:17:42.040 So if something's trending, like if Greta Thunberg is trending and they're attacking her, they want to protect her,
00:17:48.420 they can do a proactive pull and basically send us all the jobs related to those keywords.
00:17:55.160 So if they wanted to, they could, you know, choose to target Breitbart News simply by putting those into our queue.
00:18:04.680 Yeah.
00:18:05.420 So when you say protect Greta, we've done a lot of journalism on Greta.
00:18:08.820 We even published a little documentary called GretaInc.com.
00:18:13.180 YouTube doesn't seem to have censored that.
00:18:15.760 In fact, our largest videos of the last, I don't know, six months have been about Greta.
00:18:21.480 But you, of course, work for Facebook and Instagram.
00:18:23.420 How would that, let's say they wanted to protect Greta.
00:18:30.040 I mean, you suggested that might happen.
00:18:33.100 What would that look like to you, who was working as a censor?
00:18:37.520 Would they say, if Greta is associated with the word liar or mentally ill or child actor, that that would all be caught by an algorithm or something?
00:18:48.320 What does it mean to protect someone like Greta online?
00:18:52.380 Yeah, so there is, like I said, it's called a proactive pull, and they use their classifiers, and the AI basically picks up on any phrase.
00:19:03.660 And this is kind of a developing story.
00:19:05.240 There might be some more info to come out in the next week regarding some of the things I documented.
00:19:10.600 For example, you know that she was, she was called Gretarded, and that was something that Facebook was worried about.
00:19:19.520 A similar example in the US here, we had trending Boogaloo, which is kind of a, a way to talk about revolution.
00:19:27.940 Um, so that was trending during our, during our impeachment proceedings.
00:19:32.840 Um, so yeah, they, they can basically, or, you know, we had the Ukraine whistleblower as well that was trending.
00:19:38.380 Um, so yeah, any key phrase like that, any, something like Boogaloo or something, and then they specifically asked us, hey, we want you to flag, I got an email and I've documented that.
00:19:47.440 Hey, they, they wanted me to flag any examples of right-wing extremism outside the United States.
00:19:53.060 Um, so this is something that they're, they're actively looking for, and this goes as far as elections.
00:19:58.440 They wanted us to flag any content regarding specific elections, even in Canada.
00:20:03.500 So when, um, in this last Canadian election with, uh, Jagmeet Singh, there was specific guidance to watch out for any hate speech content targeting candidate Jagmeet Singh, any dehumanizing comparison.
00:20:17.580 Um, and, and once again, the, the policies are very, very nuanced, but they don't allow political speech.
00:20:25.700 For example, if I say, keep Canadians out of the United States, that's something that's not allowed.
00:20:31.080 Uh, that's a violation of the hate speech policy tier three for exclusion.
00:20:36.600 Um, but we, we see them time and time again, uh, targeting conservatives and also, uh, giving exceptions to the policy to allow for
00:20:47.580 what they deem as newsworthy.
00:20:48.700 Well, that's fascinating.
00:20:49.860 This is the first time I've ever heard that Facebook was defending Jagmeet Singh, the, the leader of our socialist party called the NDP.
00:20:57.400 Uh, there are a great many reasons one could criticize Jagmeet Singh that have nothing to do with his ethnicity as a Sikh.
00:21:04.760 However, there are also some legitimate reasons to criticize Jagmeet Singh that do relate to his Sikhism.
00:21:13.620 Um, not his religion itself, but the fact that he supports radical Sikh groups, both in India and abroad, some of which have been violent.
00:21:26.340 Some of which are terrorist groups.
00:21:28.000 Um, I actually think that Jagmeet Singh is banned, uh, if I'm not mistaken, from visiting India because of his affiliation with them.
00:21:34.560 So my question to you, Ryan, is how does Facebook's AI or how would someone like you working in America be able to distinguish, oh, this critique of Jagmeet Singh is racist.
00:21:48.740 We're going to ban it.
00:21:50.120 This critique of Jagmeet Singh is a real policy question about whether or not he's too sympathetic to Khalistani terrorists.
00:21:57.500 Like, that is such a complicated and nuanced thing, and 10 people might have 10 different opinions.
00:22:02.920 You're saying you, based in, I think, Arizona, are making that call every day.
00:22:09.780 Yeah, and it's, like I said, it's very, it is very nuanced.
00:22:12.700 So, for example, uh, the hate speech policy applies visually.
00:22:16.560 So, if I have a meme caricature of someone wearing, um, you know, uh, clothing associated with the Islamic religion, or, in this example, Sikhism, then simply by having an image of someone dressed like that, that counts as a protected characteristic.
00:22:35.920 So, if there's an image of anyone, since he's wearing that wardrobe, wearing that clothing associated with Sikhism, um, anything that's even referencing him or attacking him could be, you know, interpreted as an attack on his religion simply because he's wearing that clothing.
00:22:55.260 He has a turban, and I think it's quite fashionable, and I understand it's not for fashion, it's a religious thing, but I think he sort of, you know, really focuses on colorful, uh, it's part of his brand, I think.
00:23:09.640 Yeah.
00:23:09.880 Um, and I personally like it, but my point is, any caricature, any cartoon, any drawing of him would have a turban because that's how he almost always is.
00:23:23.920 So, if you're criticizing him with any image, you're gonna have a turban there, and again, I'm, I'm just thinking, well, who decides whether or not it's a mean depiction of a turban, or a friendly depiction of a turban, and I can't believe that an American company is censoring what Canadians can say about a Canadian election.
00:23:42.020 That's what's weird to me.
00:23:43.320 You're in Arizona, and you're deciding what Canadians can or can't say about a political candidate in Canada.
00:23:50.220 I mean, I don't want anyone to censor, even censor mean words, but I think it's extremely newsworthy to Canadians that you, and you seem like a nice guy, but you're based in Arizona, working for some company hired by Facebook, and you're deciding what Canadians can or can't say during our election.
00:24:06.740 Right, and Facebook gave us an election training deck, not only for the U.S., but for Canada, various countries in Europe.
00:24:16.020 Of course, I wasn't working directly on content in Europe, but yeah, no, I had the same conversation with a Spanish TV station in Spanish.
00:24:23.240 I speak fluent Spanish, so I was with El Toro TV yesterday.
00:24:27.240 There was also some concerning content regarding Venezuela.
00:24:30.500 Keep an eye out, there might be a new story coming out about Venezuela, because I witnessed direct interference from Facebook in an armed revolution in Venezuela.
00:24:39.420 So if you think that's bad, just censoring the speech, I mean, Facebook is operating on a global level.
00:24:46.420 This is a conversation we should be having, you know, this is, countries should be autonomous, countries should be able to not have their public discourse controlled, as you say, by an American company, by some moderators in Phoenix, Arizona.
00:25:02.460 Yeah, I mean, of course, I have no racial animus towards Jagmeet Singh or the Sikh people.
00:25:08.000 In fact, I actually admire Sikhism, and I wouldn't like if someone was hateful towards Sikhs, and I would probably feel personally compelled to rebut that.
00:25:20.700 But that's what should happen.
00:25:22.200 It shouldn't be a censored thought crime.
00:25:25.860 That's what troubles me.
00:25:26.940 It doesn't trouble me that people care about hate or care about harmony.
00:25:32.600 It troubles me that the censorship is being done by these impenetrable tech companies.
00:25:40.320 I mean, let me ask you, when you, let's say you were to delete a post in Canada during the Canadian election.
00:25:46.880 Would there be any record of that?
00:25:48.600 Would there be any communication of that?
00:25:50.520 Would there be any appeal of that?
00:25:52.040 Or is that just a laugh?
00:25:54.520 So I believe there is an appeal process.
00:25:56.640 If you are a user on Facebook, you can appeal the decision.
00:25:59.120 But on our end, so I roughly, I would be working on looking at maybe 200 pieces of content a day.
00:26:08.640 And of those, so let's say 1,000 jobs a week that I'm doing, either deleting or leaving on the platform.
00:26:14.680 And I would be audited on maybe 50 of those a week.
00:26:18.160 So if it wasn't audited, then they would never be raised up to anybody what I deleted or what I didn't delete.
00:26:24.600 So there is a system in place, but it's not perfect, you know, things can slip through the cracks, yeah.
00:26:33.540 If you're doing 200 a day, my rough math says that's about two minutes each.
00:26:38.780 Yes.
00:26:39.160 And of course, you're not, you're not having, it's not like a trial where you have a prosecutor and a defender and you're a judge.
00:26:45.520 You're just glancing at something.
00:26:46.780 You're a judge, jury, prosecutor, executioner, the whole thing.
00:26:49.620 And, and you seem like a fair-minded guy who obviously was concerned about it, but I want to show a clip.
00:26:55.680 This is where you were talking to a senior HR business partner of Facebook, Leslie Brown.
00:27:01.600 We were just, we've been talking about Sikhism and a Sikh candidate.
00:27:06.640 Here's a clip of Leslie Brown talking about how, well, not quite every race or religion is equal at Facebook.
00:27:13.000 Let's take a quick look at that.
00:27:14.080 But I mean, they were able to fire him without having to worry about discrimination.
00:27:20.320 And litigation, right, right.
00:27:21.880 No, I, yes.
00:27:23.120 No, I'm saying James Damore.
00:27:24.680 Yeah, yeah, white man.
00:27:25.800 So no problem.
00:27:26.780 He can't do it that easily if there are other issues.
00:27:30.860 Oh, it's, it's easier when they're.
00:27:32.680 White man.
00:27:33.340 Okay.
00:27:34.680 Yeah, no protected class.
00:27:35.580 No one's, yeah, no one's, no one's, yeah.
00:27:39.760 No one has the white man's back anymore.
00:27:41.540 You're saying because he's a white male, there was, there's more leverage.
00:27:47.100 That if he chose to sue the company, most attorneys would just laugh.
00:27:52.380 Look, every single human being has their biases, their own identities, maybe their own grudges or grievances.
00:27:59.520 I'm just terrified that there are secret armies of corporate bureaucrats doing the deleting.
00:28:07.400 How many people were working with you in Phoenix?
00:28:09.520 Were you doing this from home or were you in a big office?
00:28:12.940 So we worked in the office, and because of the type of content we received, we would look at pornography, including child pornography.
00:28:20.520 So because of U.S. laws, we, we had to work in an office and some of the material was very sensitive.
00:28:25.800 But yeah, in my office in Phoenix, it was, there were shifts around the clock 24-7.
00:28:30.000 And there was roughly, I would say, anywhere between 1,000 and 1,500 people at that office.
00:28:35.560 Did you just say 1,500?
00:28:37.560 Yes.
00:28:38.400 1,500 censors.
00:28:41.380 And you were working, obviously in shifts, 24 hours a day.
00:28:47.380 Right.
00:28:48.980 Unbelievable.
00:28:49.540 Did you have any interaction with the Canadian government or was this all just, you were just a contractor directly with Facebook and that's all, that's the only, that's the only boss there was?
00:29:01.420 Yeah, that's the only boss there was.
00:29:03.260 One of our, you know, one of our policy managers who's mentioned in the video, Sean Browder, who's a very pro-Bernie supporter.
00:29:09.240 He, he had direct contact with the client, with Facebook, so they would, they would video conference on a daily basis.
00:29:15.800 And those policies, those decisions would come from on high, from Facebook, as to what to allow, what to keep.
00:29:23.060 So one of the most egregious examples that we saw during, during, you know, as far as our elections is in 2018, there was a, a kid who was a Trump supporter wearing a MAGA hat.
00:29:33.900 And he was in a restaurant in Texas and he got attacked and assaulted and his, his hat got knocked off.
00:29:40.000 So this was a viral video in the summer of 2018.
00:29:43.280 And, um, in the video, the adult was cursing at the, the minor.
00:29:47.260 And so it, it technically violated, you know, it's kind of a gray area, but you could argue that it violated Facebook's policies because we don't allow cursing at minors.
00:29:56.000 Um, unless, anyway, so Facebook made the broad decision to delete this video across the platform.
00:30:05.180 It was a viral video, but of course it showed a Trump supporter being victimized.
00:30:09.700 Um, and on the flip side, there was another great example that shows the converse: in Australia, there's a far-right senator named Fraser Anning.
00:30:20.080 And he was doing a press conference and he got attacked by a minor.
00:30:24.740 And so this kid walked up behind him and egged him, cracked an egg on his, on the back of his head.
00:30:29.980 And so Fraser Anning turned around and slapped the kid in the face a couple of times repeatedly.
00:30:34.860 And that technically violates our child abuse policy.
00:30:37.860 So, so we have child abuse, clear violation, but Facebook said, Hey, we're making a newsworthy exception.
00:30:43.280 We're going to allow this on the platform.
00:30:46.360 And, and of course it showed a, a far right Senator being humiliated, being attacked.
00:30:50.820 Huh?
00:30:52.140 Um, this is fascinating and terrifying.
00:30:55.680 The fact that there's a list of 200 plus banned people, places, ideas, groups is shocking.
00:31:01.580 The fact that I know a few of them, and some of them we've worked with, I'm, I'm not surprised.
00:31:05.580 I mean, we had heard about, uh, Tommy's problems, um, before. I just never heard that the list was available, but obviously someone would have had to have had it.
00:31:15.760 Can you tell us anything more about Canada?
00:31:18.220 Because I mean, obviously that's something we care about a lot based here in Canada.
00:31:23.980 Other than the Jagmeet Singh protection order, was there anything that was off limits or anything you were told to boost?
00:31:33.500 Did you have any other instructions about the Canadian political, uh, sphere?
00:31:40.480 Um, I know there's some guidance in, in the election training deck for Canada.
00:31:44.640 There was some guidance, um, about, you know, content that was targeting immigrants with dehumanizing comparison.
00:31:50.720 So, for example, if there's a picture of immigrants in Canada, and the caption on the photo says,
00:31:59.180 uh, look closely at this photograph.
00:32:01.560 Do these look like refugees or are they opportunistic leeches coming to take advantage of Canadian kindness?
00:32:07.640 So this would violate the hate speech policy because it's immigrants would be considered a protected characteristic.
00:32:14.400 Excuse me.
00:32:15.180 Um, and so by calling them leeches, you're basically comparing them to an insect or a bug.
00:32:22.180 So simply by, to me, that's, that's, you know, discussing immigration issues.
00:32:26.640 Uh, but for Facebook, this is a violation of hate speech because you're, uh, you know, you have an image of immigrants.
00:32:33.060 So they're depicted in the image and it's comparing them to opportunistic leeches, taking advantage of Canadian kindness.
00:32:40.140 So it's, it's not nice to, to dehumanize people.
00:32:43.360 And I, and we know that dehumanizing people by comparing them to animals is done historically, I mean, calling people like rats or like pigs.
00:32:51.700 But again, that's part of discourse.
00:32:53.800 We do call, and I'm not saying that's pleasant.
00:32:56.080 I'm not saying I support that, but to be a leech, to be a pig, to be a wolf in sheep's clothing, to be, uh, a bull. We use metaphors.
00:33:06.360 It's interesting to me that the only metaphor being banned is one that is negative towards immigration, which is a huge issue.
00:33:14.780 Were there any other images or examples?
00:33:19.320 So immigration is obviously something that the elites and the left supports and populists and the right oppose.
00:33:24.800 Were there any other countervailing things that you couldn't say?
00:33:28.960 You couldn't call a conservative a Nazi.
00:33:32.500 You couldn't call a right-wing candidate a Nazi.
00:33:36.180 That would be my analogy.
00:33:37.800 You can't call an illegal immigrant a leech.
00:33:41.200 Okay.
00:33:42.100 Is there anything you can't call a right-winger or is it, you can call Maxime Bernier, the head of the People's Party, for example.
00:33:49.020 You can call him a Nazi, no problem.
00:33:51.480 Mm-hmm.
00:33:52.160 So, yeah, so that's a great question.
00:33:54.220 So, uh, we have a, what's called a bullying slang list.
00:33:58.180 So there's a lot of words that are used a lot, and we want to make sure we're all doing it the same way, actioning, taking action the same way.
00:34:06.440 So I, I, I'm allowed to call someone a Nazi, even though technically, um, any attack on someone's character, temperament, mentality, disposition is a delete.
00:34:17.220 But they made it, they carved out an exception for any kind of ideological attack.
00:34:21.700 But it's, it's interesting because, so on the one side, you can call someone a Trump-humper.
00:34:25.820 Like if you called me a Trump-humper, Ryan, you're a Trump-humper, and I reported it directly, it would still stay up.
00:34:31.540 But if you were to call me a feminazi, uh, that would be taken down.
00:34:37.200 So, uh, Trump-humper stays up, feminazi, it gets taken down.
00:34:42.300 And of course-
00:34:42.840 So a feminazi is a word used sometimes to criticize a feminist.
00:34:47.340 But how about just calling someone a Nazi?
00:34:49.540 So, I mean, I'm Jewish myself.
00:34:51.680 I hate being called a Nazi.
00:34:52.900 I'm, I'm very pro-Israel, very anti-Nazi, obviously.
00:34:55.440 Yeah.
00:34:55.720 Um, I've been to the Holocaust Museum many times, et cetera.
00:34:58.680 If someone were to call me a Nazi Nazi, not a feminazi, but a Nazi, is that kosher under Facebook's rules?
00:35:06.860 Yeah.
00:35:07.400 It's, it's going to stay up no matter what you do.
00:35:09.120 There's no way to, to take it down.
00:35:10.540 People can call you Nazi all day.
00:35:12.280 And because of the, the, yeah, they may have changed the policy under bullying.
00:35:17.340 Because technically it's attacking your character, right?
00:35:19.160 It's attacking who you are.
00:35:20.440 Well, it's defamatory.
00:35:21.680 So you can call a Jewish conservative a Nazi.
00:35:24.300 And unfortunately that, that happens from time to time.
00:35:26.420 But you can't call a leftist a feminazi.
00:35:29.700 Am I understanding?
00:35:30.360 So call a conservative a Nazi.
00:35:31.880 That's okay by Facebook.
00:35:33.440 Call a feminist censor a feminazi.
00:35:37.500 And that's taken down.
00:35:39.120 Is that, am I right?
00:35:40.420 That's correct.
00:35:41.180 Yep.
00:35:41.860 And, uh, another example that we have is Facebook wanted things escalated.
00:35:47.820 So in the training deck for Canada, it said, um, that, you know, if there's something that's really trending or viral going on with the election in Canada, they want that escalated.
00:35:59.620 So someone who works for Facebook would see that. Um, they wanted us to identify any trends in viral posts that are related to the elections, including humorous and satire posts, any hate speech or bullying or harassment related to the elections, any threats to political candidates.
00:36:15.580 That one makes sense.
00:36:17.320 Any attempts for voter fraud and spreading misinformation, misinformation around the election and any privacy violations.
00:36:24.780 Um, and then it says, yeah, just please follow the normal procedures for escalating content in the workflow tribe.
00:36:30.860 Um, and that mentions people we should contact at Facebook for any escalations.
00:36:36.040 Are you at liberty to publish the Canadian training deck that you've been referring to here?
00:36:41.280 Has Project Veritas made, put that online yet?
00:36:44.900 Um, not yet.
00:36:45.620 That's something that I would have to get approval from them, uh, to give you.
00:36:49.820 I think that's, I would love to.
00:36:51.940 Um, and in that video, you know, there are a lot of things that are great, that are very newsworthy, but there's still a lot that I did uncover,
00:36:59.780 and film that hasn't been made public yet, that I would love to, uh, with their permission, of course, and their blessing, uh, share with you.
00:37:07.140 Well, I mean, obviously we're, we care about censorship for the whole world.
00:37:11.420 We've stood up for free speech in America, in the UK, in Australia, wherever we operate.
00:37:17.040 We care most about our home, which is Canada, but we love the United States and we love its freedom and its First Amendment.
00:37:23.320 Uh, we're deeply interested in Facebook's Canadian censorship manual.
00:37:27.980 And if you do get the green light from James O'Keefe, who's a friend of our channel, please let us know, because we will absolutely go through that with meticulous detail, perhaps with you as our guide to teach Canadians how our election is being tampered with by foreign agents, you being one of them.
00:37:45.460 And we like you because you've blown the whistle on it.
00:37:48.020 But as you told us, there were 1,500 people working around the clock on this.
00:37:52.180 Listen, what a pleasure and what an eye opener.
00:37:53.860 Ryan Hartwig, thank you for your time today.
00:37:56.920 And I'd invite our viewers to watch other videos and find other facts on this at projectveritas.com.
00:38:04.420 We've been talking to Ryan Hartwig.
00:38:05.840 Last word to you.
00:38:07.120 Is this going to get better or worse as we head up to the November election in the States?
00:38:13.620 Um, it's hard to say.
00:38:14.640 I think this will give some impetus for, for Congress to act.
00:38:18.540 And, you know, uh, here in the U.S. there's something called the Communications Decency Act, and hopefully we can remove the protections that, um, Facebook enjoys under Section 230 of that act.
00:38:30.640 Um, but it's, it's, it's tough to say.
00:38:33.520 It's, it's really hard fighting against these behemoths against, uh, you know, big tech, Google, Facebook, but I'm optimistic.
00:38:39.500 I know there's a lot of patriots out there like myself who, who want to know the truth.
00:38:43.580 So, uh, I'm, I'm mildly optimistic, but it's going to be an uphill battle.
00:38:48.440 Well, very interesting.
00:38:49.600 I learned a lot today.
00:38:50.600 Thank you for your time, Ryan.
00:38:52.140 You're welcome.
00:38:52.760 Thank you.
00:38:53.480 All right.
00:38:54.040 Stay with us.
00:38:54.860 More ahead on The Ezra Levant Show.
00:39:07.040 Hey, what do you think of this?
00:39:08.160 Literally today, as I was doing a show, the book arrived.
00:39:11.300 It's, uh, I'm very excited about it.
00:39:13.700 It's, uh, I think our eighth book that we published with Amazon.
00:39:16.600 Amazon tried to censor it for two months.
00:39:19.080 They refused to publish China Virus, saying it contradicted official narratives on the pandemic.
00:39:24.140 They finally relented.
00:39:25.240 Anyways, I hope you'll like it.
00:39:26.660 Uh, on my monologue last night, Sherry writes, how is it Trump supporters' fault that she was fired?
00:39:32.300 Conservatives aren't even allowed to cancel.
00:39:34.420 Try taking responsibility for your own actions.
00:39:36.440 What a pathetic display.
00:39:37.360 That's a reference to that Harvard grad who was fired from Deloitte after threatening to stab people.
00:39:43.700 Yeah, I mean, that's a little bit of karma.
00:39:45.100 It's, you know, there's a lot of people out there who are being unfairly tarnished and canceled.
00:39:49.680 It's tough for me to muster sympathy for that gal.
00:39:52.260 Just like it's tough for me to muster sympathy for Wendy Mesley for being fired for repeatedly dropping the N-word in the office.
00:39:58.880 She's the one who tried to tag every conservative as racist.
00:40:02.360 It's a bit of poetic justice.
00:40:03.820 Perry writes, today all young people want their five minutes of fame, whether it be Facebook, Instagram, whatever.
00:40:10.200 It's true.
00:40:11.260 And being famous for being famous, sort of the, you know, the Kardashian approach or the Paris Hilton approach, it's not substantial.
00:40:19.880 I like the idea of fame for doing something worthy of fame.
00:40:24.860 Why do we still know the names Christopher Columbus, Thomas Edison?
00:40:29.580 Why do these names ring centuries later?
00:40:32.300 Well, they were famous for achievement and accomplishment and courage, not famous for being famous.
00:40:39.620 On my interview with Manny Montenegrino, Paul writes,
00:40:41.920 I read about 10 emails on that conversation I had with Manny, and I think only one of them supported Manny's point of view.
00:41:02.760 Look, I get what Manny, I think I get what Manny was trying to do.
00:41:06.120 He was saying Trudeau should confess that he's unethical, and he's tainted the whole legal system, and that's why we have to do the trade to get the two hostages back.
00:41:15.720 Okay, that's an interesting thought exercise, but in real life it would be a disaster.
00:41:20.320 That would be saying, oh, you corrupted it once, so we must now be corrupt all the time.
00:41:24.700 I don't think I've ever disagreed with Manny before in my life, but yesterday I sure did.
00:41:29.660 Well, that's the show for today, and I just want to hold it up one more time because I'm sort of excited about this book,
00:41:34.840 partly because Amazon tried to stop it.
00:41:37.020 I don't know if you saw my YouTube video, but for two months they just, we started to upload this in April,
00:41:42.560 and they rejected it and rejected it, and we appealed and they rejected it,
00:41:45.680 and we had our lawyers write to them and they ignored it, and just last week they finally relented.
00:41:50.440 So, I mean, it's a short book, but there's lots of footnotes in it.
00:41:53.000 I bet you probably know most of this stuff from watching the show,
00:41:56.100 but I know for a fact there's new details in here that I hadn't published before.
00:42:00.860 I'm really excited about this, so let me know what you think.
00:42:03.400 You can find out more at ChinaVirusBook.com.
00:42:07.900 Don't mind me, I'm just a little excited that this came in the mail.
00:42:10.260 All right, that's our show for today.
00:42:11.800 Until next time, on behalf of all of us here, well, oh, before I go,
00:42:15.480 what do you think about the news about Facebook's censorship factory?
00:42:20.120 Let me know what you think about that.
00:42:21.200 Send me an email to Ezra at RebelNews.com.
00:42:23.820 All right, I gotta go.
00:42:25.120 Until next time, on behalf of all of us here at Rebel World Headquarters,
00:42:28.280 good night, and keep fighting for freedom.