You can't go to church or synagogue, and you can't open your business, but Canada's top doctor says millionaire baseball players can. What's an essential service in a pandemic? Who gets to decide? And why would a politician get involved?
So when you say they target conservatives, you're not just speaking metaphorically. Just before we turned the camera on, you told me that there's a hot list, or a fire list, and that anything referring to these people in a positive way is automatically deleted. And you mentioned that our old friend Tommy Robinson, and actually a couple of other Rebel alumni, including Gavin McInnes, are on that list.

What does it mean when Tommy Robinson, who has never been charged with a hate crime, let alone convicted of one, is on that list? I know his case pretty well, because we crowdfunded his legal defense when he was charged with contempt of court. But what does it mean within Facebook when someone like Tommy Robinson is on a blacklist?
So when you say protect Greta: we've done a lot of journalism on Greta. We even published a little documentary called GretaInc.com. YouTube doesn't seem to have censored that; in fact, our biggest videos of the last six months or so have been about Greta. But you, of course, worked for Facebook and Instagram. Let's say they wanted to protect Greta, and you suggested that might happen. What would that look like to you, working as a censor? Would they say that if Greta is associated with a word like liar, or mentally ill, or child actor, that would all be caught by an algorithm or something? What does it mean to protect someone like Greta online?
Yeah. Like I said, it's called a proactive poll, and they use their classifiers, and the AI basically picks up on any phrase. This is kind of a developing story; there might be some more info coming out in the next week regarding some of the things I documented. For example, you know that she was called Gretarded, and that was something Facebook was worried about. A similar example here in the U.S.: we had "Boogaloo" trending, which is kind of a way to talk about revolution, and that was trending during our impeachment proceedings. We also had the Ukraine whistleblower trending. So any key phrase like that, something like "Boogaloo", they specifically asked us to flag. I got an email, and I've documented it, where they wanted me to flag any examples of right-wing extremism outside the United States. This is something they're actively looking for, and it goes as far as elections: they wanted us to flag any content regarding specific elections, even in Canada.
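A minimal sketch of the kind of proactive phrase-flagging Hartwig describes here; the watch list, function names, and matching logic below are illustrative assumptions, not Facebook's actual system:

```python
# Illustrative sketch only: a keyword "watch list" flagger of the kind
# described above. The phrases and category tags are assumptions based
# solely on the examples given in this interview.

WATCHED_PHRASES = {
    "boogaloo": "trending-term",
    "ukraine whistleblower": "trending-term",
    "canadian election": "election-content",
}

def flag_post(text: str) -> list[str]:
    """Return the watch-list categories a post matches, if any."""
    lowered = text.lower()
    return [tag for phrase, tag in WATCHED_PHRASES.items() if phrase in lowered]

# Example: this post would be surfaced for proactive human review.
print(flag_post("The boogaloo is coming, mark my words"))  # ['trending-term']
```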
In this last Canadian election, with Jagmeet Singh, there was specific guidance to watch out for any hate speech content targeting candidate Jagmeet Singh, specifically dehumanizing comparisons. And once again, the policies are very, very nuanced, but they don't allow political speech. For example, if I say "keep Canadians out of the United States", that's not allowed; it's a violation of the hate speech policy, tier three, for exclusion. But we see them time and time again targeting conservatives, and also granting exceptions to the policy to allow whatever they deem newsworthy.
I actually think that Jagmeet Singh is banned, if I'm not mistaken, from visiting India because of his affiliation with them. So my question to you, Ryan, is: how would Facebook's AI, or someone like you working in America, be able to distinguish "this critique of Jagmeet Singh is racist" from "this critique of Jagmeet Singh is a real policy question about whether or not he's too sympathetic to Khalistani terrorists"? That is such a complicated and nuanced thing, and ten people might have ten different opinions. You're saying you, based in, I think, Arizona, are making that call every day.
Yeah, and like I said, it is very nuanced. For example, the hate speech policy applies visually. If I have a meme caricature of someone wearing clothing associated with Islam, or in this case with Sikhism, then simply having an image of someone dressed like that counts as a protected characteristic. So since he's wearing that clothing associated with Sikhism, anything that's even referencing him or attacking him could be interpreted as an attack on his religion, simply because of what he's wearing.
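A minimal sketch of how such a visual rule might behave; the attire tags, attack lexicon, and function names are assumptions for illustration only, not Facebook's actual classifier:

```python
# Illustrative sketch of the "visual" hate-speech rule described above:
# if an image is tagged as depicting religious attire, any attack in the
# caption is treated as an attack on a protected characteristic.
# All tags and terms here are assumptions.

ATTIRE_TO_RELIGION = {"turban": "Sikhism", "hijab": "Islam"}
ATTACK_WORDS = {"idiot", "terrorist", "fraud"}  # stand-in attack lexicon

def violates_visual_rule(image_tags: set[str], caption: str) -> bool:
    """True if the caption attacks someone shown in religious attire."""
    depicts_religion = any(tag in ATTIRE_TO_RELIGION for tag in image_tags)
    contains_attack = any(word in caption.lower() for word in ATTACK_WORDS)
    return depicts_religion and contains_attack

# A caricature of a politician in a turban plus an insult would be deleted,
# even if the insult is about his politics rather than his religion.
print(violates_visual_rule({"turban", "suit"}, "This guy is a fraud"))  # True
```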
He has a turban, and I think it's quite fashionable. I understand it's not for fashion, it's a religious thing, but he really focuses on colorful ones; it's part of his brand, I think, and I personally like it. But my point is, any caricature, any cartoon, any drawing of him would have a turban, because that's how he almost always is. So if you're criticizing him with any image, you're going to have a turban there. And I'm just thinking: who decides whether it's a mean depiction of a turban or a friendly depiction of a turban? I can't believe that an American company is censoring what Canadians can say about a Canadian election. You're in Arizona, and you're deciding what Canadians can or can't say about a political candidate in Canada. I don't want anyone to censor anything, even mean words, and you seem like a nice guy, but I think it's news to Canadians that you, based in Arizona and working for some company hired by Facebook, are deciding what Canadians can or can't say during our election.
Right. Facebook gave us an election training deck, not only for the U.S. but for Canada and various countries in Europe. Of course, I wasn't working directly on content in Europe, but I had the same conversation with a Spanish TV station, in Spanish; I speak fluent Spanish, so I was with El Toro TV yesterday. There was also some concerning content regarding Venezuela. Keep an eye out, there might be a new story coming out about Venezuela, because I witnessed direct interference from Facebook in an armed revolution in Venezuela. So if you think just censoring speech is bad, remember that Facebook operates on a global level. This is a conversation we should be having: countries should be autonomous; countries should be able to not have their public discourse controlled, as you say, by an American company, by some moderators in Phoenix, Arizona.
Yeah, and of course I have no racial animus towards Jagmeet Singh or the Sikh people. In fact, I actually admire Sikhism, and I wouldn't like it if someone were hateful towards Sikhs; I would probably feel personally compelled to rebut that.
Yeah, no one has the white man's back anymore.

You're saying that because he's a white male, there's more leverage; that if he chose to sue the company, most attorneys would just laugh. Look, every single human being has their biases, their own identities, maybe their own grudges or grievances. I'm just terrified that there are secret armies of corporate bureaucrats doing the deleting.
How many people were working with you in Phoenix? Were you doing this from home, or were you in a big office?

We worked in the office. Because of the type of content we received, including pornography and child pornography, and because of U.S. laws, we had to work in an office; some of the material was very sensitive. But yeah, in my office in Phoenix there were shifts around the clock, 24-7, and there were roughly, I would say, anywhere between 1,000 and 1,500 people at that office.
Did you have any interaction with the Canadian government, or were you just a contractor working directly for Facebook, and that's the only boss there was?

One of our policy managers who's mentioned in the video, Sean Browder, who's a very strong Bernie supporter, had direct contact with the client, with Facebook; they would video conference on a daily basis. And those policies, those decisions as to what to allow and what to keep, would come from on high, from Facebook.
One of the most egregious examples we saw, as far as our elections go, was in 2018. There was a kid, a Trump supporter wearing a MAGA hat, in a restaurant in Texas, and he got attacked and assaulted and his hat got knocked off. This was a viral video in the summer of 2018, and in the video the adult was cursing at the minor. So it technically violated Facebook's policies, because we don't allow cursing at minors; it's kind of a gray area, but you could argue it violated the policy. Facebook made the decision to delete this video across the platform. It was a viral video, but of course it showed a Trump supporter being victimized.

On the flip side, there's another great example that shows the converse. In Australia, there's a far-right senator named Fraser Anning. He was doing a press conference and he got attacked by a minor: this kid walked up behind him and egged him, cracked an egg on the back of his head. Fraser Anning turned around and slapped the kid in the face a couple of times, and that technically violates our child abuse policy. So we have child abuse, a clear violation, but Facebook said, hey, we're making a newsworthy exception; we're going to allow this on the platform. And of course it showed a far-right senator being humiliated, being attacked.
This is fascinating and terrifying. The fact that there's a list of 200-plus banned people, places, ideas, and groups is shocking. The fact that I know a few of them, and have worked with some of them, doesn't surprise me. I mean, we had heard about Tommy's problems before; I had just never heard that the list was available, but obviously someone would have had to have it.

Can you tell us anything more about Canada? Obviously that's something we care about a lot, being based here in Canada. Other than the Jagmeet Singh protection order, was there anything that was off limits, or anything you were told to boost? Did you have any other instructions about the Canadian political sphere?
I know there's some guidance in the election training deck for Canada. There was some guidance about content targeting immigrants with dehumanizing comparisons. For example, say there's a picture of immigrants in Canada, and the caption on the photo compares them to leeches. By calling them leeches, you're basically comparing them to an insect or a bug. To me, that's just discussing immigration issues, but for Facebook it's a violation of the hate speech policy, because you have an image of immigrants, so they're depicted in the image, and it's comparing them to opportunistic leeches taking advantage of Canadian kindness. Now, it's not nice to dehumanize people, and we know that dehumanizing people by comparing them to animals has been done historically; I mean, calling people rats, or pigs.
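A minimal sketch of the tiered hate-speech matching described in this interview, covering both the dehumanizing-comparison guidance above and the tier three "exclusion" example from earlier; the labels, terms, and patterns are assumptions drawn only from the examples given, not Facebook's actual rules:

```python
# Illustrative sketch of tiered hate-speech matching as described in the
# interview: dehumanizing comparisons ("leeches", "rats") against a
# depicted protected group, and tier 3 "exclusion" phrases ("keep X out
# of Y"). All terms and patterns here are assumptions.
import re

DEHUMANIZING_TERMS = {"leech", "leeches", "rat", "rats", "pig", "pigs"}
EXCLUSION_PATTERN = re.compile(r"\bkeep\s+\w+\s+out\s+of\b", re.IGNORECASE)

def hate_speech_label(caption: str, depicts_protected_group: bool):
    """Return a violation label if the caption violates, else None."""
    words = set(re.findall(r"[a-z]+", caption.lower()))
    if depicts_protected_group and words & DEHUMANIZING_TERMS:
        return "dehumanizing comparison"
    if EXCLUSION_PATTERN.search(caption):
        return "tier 3: exclusion"
    return None

print(hate_speech_label("Opportunistic leeches taking advantage", True))
print(hate_speech_label("Keep Canadians out of the United States", False))
```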
We also have what's called a bullying slang list. There are a lot of words that get used a lot, and we want to make sure we're all actioning them the same way, taking action the same way. So I'm allowed to call someone a Nazi, even though technically any attack on someone's character, temperament, mentality, or disposition is a delete, because they carved out an exception for any kind of ideological attack. But it's interesting, because on the one side you can call someone a Trump-humper. If you called me a Trump-humper, Ryan, and I reported it directly, it would still stay up. But if you were to call me a feminazi, that would be taken down. So Trump-humper stays up; feminazi gets taken down.
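A minimal sketch of how such a slang-list lookup might behave; the entries and action names are assumptions based only on the two examples given:

```python
# Illustrative sketch of the "bullying slang list" described above: each
# term maps to a standard action, with a carve-out that purely ideological
# attacks stay up. Entries and action names are assumptions.

BULLYING_SLANG = {
    "feminazi": "delete",
    "trump-humper": "ignore",  # treated as an ideological attack, so allowed
}

def action_for_report(reported_term: str) -> str:
    """Look up the moderation action for a reported slang term."""
    return BULLYING_SLANG.get(reported_term.lower(), "send to human review")

print(action_for_report("Trump-humper"))  # ignore -> post stays up
print(action_for_report("feminazi"))      # delete -> post comes down
```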
Another example we have is that Facebook wanted things escalated. The training deck for Canada said that if there's something really trending or viral going on with the election in Canada, they want that escalated, so that someone who works for Facebook would see it. They wanted us to identify any trends in viral posts related to the elections, including humorous and satire posts, any hate speech or bullying or harassment related to the elections, and any threats to political candidates.
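A minimal sketch of that escalation rule; the viral threshold and election terms are assumptions for illustration, not Facebook's actual criteria:

```python
# Illustrative sketch of the escalation rule described above: viral posts
# related to an election get escalated to a Facebook employee.
# The threshold and term list are assumptions.

ELECTION_TERMS = {"election", "candidate", "ballot"}
VIRAL_SHARE_THRESHOLD = 10_000  # assumed cutoff for "viral"

def should_escalate(text: str, shares: int) -> bool:
    """True if a trending election-related post should be escalated."""
    is_viral = shares >= VIRAL_SHARE_THRESHOLD
    mentions_election = any(term in text.lower() for term in ELECTION_TERMS)
    return is_viral and mentions_election

print(should_escalate("Satire about the Canadian election", 25_000))  # True
```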
In that video there are a lot of things that are great, that are very newsworthy, but there's still a lot that I uncovered and documented that hasn't been made public yet. I would love, with their permission, of course, and their blessing, to share that with you.
Well, obviously we care about censorship for the whole world. We've stood up for free speech in America, in the UK, in Australia, wherever we operate. We care most about our home, which is Canada, but we love the United States, and we love its freedom and its First Amendment. We're deeply interested in Facebook's Canadian censorship manual, and if you do get the green light from James O'Keefe, who's a friend of our channel, please let us know, because we will absolutely go through it in meticulous detail, perhaps with you as our guide, to teach Canadians how our election is being tampered with by foreign agents, you being one of them. And we like you, because you've blown the whistle on it. But as you told us, there were 1,500 people working around the clock on this.

Listen, what a pleasure and what an eye-opener. Ryan Hartwig, thank you for your time today. And I'd invite our viewers to watch the other videos and find the other facts on this at projectveritas.com.
I think this will give some impetus for Congress to act. Here in the U.S. there's something called the Communications Decency Act, and hopefully we can remove the protections that Facebook enjoys under Section 230 of that act. But it's tough to say. It's really hard fighting against these behemoths, against big tech, Google, Facebook, but I'm optimistic. I know there are a lot of patriots out there like myself who want to know the truth. So I'm mildly optimistic, but it's going to be an uphill battle.
And being famous for being famous, the Kardashian approach or the Paris Hilton approach, isn't substantial. I like the idea of fame for doing something worthy of fame. Why do we still know the names Christopher Columbus and Thomas Edison? Why do these names ring out centuries later? Because they were famous for achievement and accomplishment and courage, not famous for being famous.
On my interview with Manny Montenegrino, Paul writes in. I read about ten emails on that conversation I had with Manny, and I think only one of them supported Manny's point of view. Look, I think I get what Manny was trying to do. He was saying Trudeau should confess that he's unethical and has tainted the whole legal system, and that's why we have to do the trade to get the two hostages back. Okay, that's an interesting thought exercise, but in real life it would be a disaster. That would be saying: oh, you corrupted it once, so we must now be corrupt all the time. I don't think I've ever disagreed with Manny before in my life, but yesterday I sure did.
Well, that's the show for today. I just want to hold this up one more time, because I'm sort of excited about this book, partly because Amazon tried to stop it. I don't know if you saw my YouTube video, but we started to upload this in April, and for two months they rejected it and rejected it; we appealed, and they rejected it; we had our lawyers write to them, and they ignored it; and just last week they finally relented. It's a short book, but there are lots of footnotes in it. I bet you probably know most of this stuff from watching the show, but I know for a fact there are new details in here that I hadn't published before. I'm really excited about this, so let me know what you think. You can find out more at ChinaVirusBook.com. Don't mind me, I'm just a little excited that this came in the mail.