Rebel News Podcast - February 05, 2021


"Billions have been invested in content moderation" | Facebook whistleblower Ryan Hartwig


Episode Stats

Length

45 minutes

Words per Minute

174.5

Word Count

7,927

Sentence Count

492

Misogynist Sentences

3

Hate Speech Sentences

4


Summary

On today's show, we have a special guest, Ryan Hartwig of the Hartwig Foundation for Free Speech, a group dedicated to exposing social media bias. He joins us to talk about what he learned while moderating content for Facebook, and why he thinks it's a problem.


Transcript

00:00:00.000 So I've spent quite a bit of time looking at pictures of hate organizations, Hitler,
00:00:23.240 Nazis, MAGA, you know, Proud Boys, all that stuff all day long.
00:00:27.240 Does it surprise you that he combines Hitler, Nazis, and MAGA?
00:00:34.960 He's describing hate organizations, he's moderating for Facebook, he kind of throws MAGA in there.
00:00:40.040 What is your reaction to that?
00:00:41.680 So yeah, he groups together hate organizations, Hitler, Nazis, MAGA.
00:00:45.760 So that's kind of how the moderators are conditioned to think, like, hey, anything that's right-wing,
00:00:51.120 hey, it could possibly be on the hate list.
00:00:53.480 In his first day, President Biden already issued a number of executive orders on areas that
00:00:59.920 we as a company really care quite deeply about.
00:01:02.200 We have a system that is able to freeze commenting on threads in cases where our systems are detecting
00:01:10.600 that there may be a thread that has hate speech or violence in the comments.
00:01:16.080 You are filing.
00:01:17.080 We have already initiated and will be publicly announcing and making available for everyone
00:01:21.400 a lawsuit that we're filing against Facebook Inc. regarding unfair competition, fraud, false
00:01:26.980 advertising and antitrust.
00:01:28.800 This is why I am suing the Facebook fact checkers.
00:01:32.400 I'm suing them on behalf of you, your favorite creators and news sites, on behalf of our freedom
00:01:38.480 of speech and thought.
00:01:39.480 At that point, I was seeing them interfering on a global level in elections.
00:01:44.480 And then I saw a blatant exception that just targeted conservatives or favored liberals.
00:01:57.480 Ryan Hartwig is a Facebook insider and whistleblower with Project Veritas.
00:02:01.120 He is president of the Hartwig Foundation for Free Speech.
00:02:04.440 And you can follow him on Facebook, Instagram, Gab and YouTube.
00:02:08.600 His channel name is Ryan Hartwig Official.
00:02:10.880 Of course, though, if you Google Ryan, you won't know most of that.
00:02:14.480 Why would you?
00:02:15.480 He is working to expose social media bias, however, and that is obviously not allowed.
00:02:19.880 Ryan, thanks for joining us.
00:02:21.080 How are you today?
00:02:22.080 Yeah.
00:02:23.080 Thanks for having me on.
00:02:24.880 Yeah.
00:02:25.880 Thanks so much.
00:02:26.880 I appreciate the opportunity.
00:02:27.880 Oh, we appreciate you coming on.
00:02:30.300 I love to talk about this sort of stuff.
00:02:31.700 I've been in it for years.
00:02:33.860 And as we speak, I wanted to get you on as soon as possible because there's new Project
00:02:38.380 Veritas stuff coming out daily now, just like people did with the CNN calls that were being
00:02:44.460 broadcast on Project Veritas' channels every day.
00:02:48.560 Now it seems like there's going to be a Facebook meeting all the time, whether that's with Mark
00:02:52.940 Zuckerberg or not.
00:02:54.840 So I just want to play most of the first video that came out this week from Project Veritas.
00:03:00.280 And with another Facebook insider leaking their meetings with Mark Zuckerberg and some other
00:03:06.600 staff there.
00:03:07.600 In his first day, President Biden already issued a number of executive orders on areas that
00:03:13.520 we as a company really care quite deeply about.
00:03:15.840 But there has been quite a lot of disquiet expressed by many leaders around the world, from the president of Mexico to Alexei Navalny in Russia and Chancellor Angela Merkel and others saying, well, this shows that private companies have got too much power and they should be only making these decisions in a way that is framed by democratically agreed rules.
00:03:38.160 We agree with that.
00:03:39.480 We agree with that.
00:03:40.480 We agree with that.
00:03:41.480 Mark would be very clear about that, that ideally we wouldn't be taking these decisions on our
00:03:45.480 own.
00:03:46.480 We would be taking these decisions in line with and in conformity with democratically agreed
00:03:52.480 rules and principles.
00:03:53.480 And at the moment, those democratically agreed rules don't exist.
00:04:01.800 We still have to take decisions in real time.
00:04:03.800 We have a system that is able to freeze commenting on threads in cases where our systems are detecting that there may be a thread that has hate speech or violence sort of in the comments.
00:04:17.720 These are all things we've built over the past three, four years as part of our investments into the integrity space or efforts to protect elections.
00:04:26.760 I wonder whether or not we can use Oculus to help a white police officer understand what it feels like to be a young black man who's stopped and searched and arrested by the police.
00:04:36.680 And I want every major decision to run through a civil rights lens.
00:04:40.360 I think that these were all important and positive steps.
00:04:43.560 And I am looking forward to opportunities where Facebook is going to be able to work together with this new administration on some of their top priorities, starting with the COVID response.
00:04:56.760 So that campaign is called Expose Facebook.
00:05:02.440 Ryan, why don't you tell us a little bit more, go a bit more in depth about what exactly we're seeing there from Zuckerberg and the team?
00:05:08.440 So yeah, in this video, we have, you know, a leaked call basically, and we have Nick Clegg, the head of global affairs for Facebook, talking about how, yes, Facebook agrees we should follow the democratic process.
00:05:21.320 So as an American, as someone living in the United States, this, this really shocked me, because, you know, if they had any respect for the rule of law or for the democratic process, then they would be following Section 230.
00:05:34.140 And I think it's insane to believe that it's correct in any country to delete the president of that country.
00:05:42.340 Like, come on, guys, that's the democratic process.
00:05:44.260 That's the leader of the nation.
00:05:45.860 So they sent it to their, you know, their advisory board or their independent audit board.
00:05:53.180 And so, yeah, this is just incredible, the fact that they're saying, yes, we agree we should follow the democratic process.
00:05:59.580 Well, why don't you start with the First Amendment?
00:06:01.340 Why don't you? You're a public square.
00:06:02.980 Why are you censoring, yeah, the president of the United States?
00:06:08.740 So what do you think is the thought process behind some of the stuff we saw in there?
00:06:13.000 They're wanting to disable conversations that include stuff that they call free speech.
00:06:17.880 Do they actually want to make things better, the people working there?
00:06:21.820 Do they think that they're changing the world by doing this or do they understand?
00:06:25.540 Do you think that hate speech is actually subjective?
00:06:28.540 Are these people aware that what they're saying might be seen by bigger people, and they're just saying what they think other people want to hear?
00:06:36.440 Do they actually believe that they're making a real difference by subjectively censoring speech?
00:06:42.920 I think they honestly think,
00:06:44.580 yeah, they honestly think they're making a difference.
00:06:48.320 They have been conditioned to think that way.
00:06:50.220 I mean, if you're living in Silicon, excuse me, if you're living in Silicon Valley, you're living in a bubble.
00:06:55.240 So a lot of it is, you know, they think they're protecting the world from violence or whatnot.
00:07:01.720 So you had, you know, for example, in early January, you had an event at the Capitol, a protest, a peaceful protest.
00:07:10.020 And there was some violence, but on the opposing side,
00:07:13.800 you have lots of violence over the summer from Antifa, and you see how Facebook treats them differently.
00:07:19.400 So I have it documented going back to 2017 that Facebook did not treat Antifa as a criminal organization.
00:07:25.760 And yet after one event at the Capitol, all of a sudden, all the Trump supporters are violent racists.
00:07:33.820 So, yeah.
00:07:35.120 So it has moved really fast.
00:07:38.540 And I struggle with I mean, I understand what you're saying there in Silicon Valley.
00:07:44.340 They're in this big bubble.
00:07:45.320 But I kind of think that people who are at the level of Mark Zuckerberg and guys who are top level engineers and all this, they're not stupid people.
00:07:54.980 They have to realize that there's other opinions out there.
00:07:58.220 Mark Zuckerberg says this stuff.
00:07:59.840 So does that sort of thinking go all the way to the top, do you think, where we're saving the world?
00:08:05.700 Yeah, I think part of it's saving the world.
00:08:07.300 Part of it is just maybe engineers doing their job.
00:08:09.540 So I worked as a content moderator for Facebook, but I was subcontracted by Cognizant.
00:08:14.620 And our goal was to make the client happy.
00:08:16.720 Facebook was the client.
00:08:18.020 So maybe there's engineers who, look, they're getting paid really well.
00:08:22.400 Maybe they have some more qualms about it.
00:08:24.040 But at the end of the day, they're getting a paycheck.
00:08:26.560 And that's all that matters for them.
00:08:31.480 So, I mean, there's people there who may think they're helping the world, helping protect the world, keep the world a safer place.
00:08:40.440 But, you know, that's part of the argument of Section 230, why Section 230 was created in the first place, which was to protect children on the Internet.
00:08:47.740 And so now we're looking, now we're seeing that Facebook is using 230 as basically a brand protection tool.
00:08:53.480 Because if we were to follow Section 230 as it's written, like Facebook could not censor as much as they censor.
00:09:02.500 And Facebook would not be as popular because they could not restrict as much content as they're restricting.
00:09:09.980 And Section 230 has been misinterpreted by the Ninth Circuit Court.
00:09:17.480 And it's given Facebook additional protections.
00:09:20.720 And so Jason Fyk, he had a lawsuit that was about to be heard by the Supreme Court.
00:09:25.620 And they declined hearing that case earlier this month in the United States.
00:09:31.180 It was Jason Fyk, Fyk versus Facebook.
00:09:33.900 So once again, the U.S. Supreme Court has failed us.
00:09:36.440 So our democratic processes have failed us.
00:09:39.100 I think we're facing a constitutional crisis because we had that election lawsuit in Texas, from Texas, that was tossed out by the U.S. Supreme Court.
00:09:46.740 And we also had this earlier in January, we had this case from Jason Fyk thrown out.
00:09:51.500 So, I mean, we're trying to follow the democratic process.
00:09:53.840 We're trying to rein in these companies.
00:09:56.480 But they've been given undue protections under Section 230.
00:10:02.060 And Section 230 has been misinterpreted.
00:10:04.180 It needs to be reinterpreted by the Supreme Court, not repealed.
00:10:08.660 That would be the wrong solution.
00:10:10.800 So that's part of this larger debate.
00:10:12.440 I mean, yeah, you have these employees who think they're trying to do the right thing.
00:10:18.160 But the laws have been shaped in such a way that Facebook, they can do whatever they want and they never get in trouble.
00:10:24.200 So what Facebook should do is they should make a spirit of the policy decision.
00:10:28.400 So a lot of times when I was a content moderator, we would make decisions based on the spirit of the policy.
00:10:34.120 So the policy, the letter of the policy said one thing, but we could basically break the rules a little bit and make a spirit of the policy decision.
00:10:42.000 And Facebook should do the same thing.
00:10:43.660 They should say, look, maybe there is some violence with the president, but based on the spirit of the policy, the wrong thing would be to remove Trump completely from Facebook and Instagram.
00:10:55.480 So I think they should follow their, maybe do what we did sometimes as content moderators and follow the spirit of the policy.
00:11:01.840 But who determines what the spirit of the policy is?
00:11:05.100 And that's very subjective.
00:11:06.400 And that's the issue is at the end of the day, it's six people on the global policy team in San Francisco, maybe in Ireland, who make these subjective decisions about the spirit of their policy.
00:11:18.940 Are these six people well known?
00:11:20.720 Are they public people, or are they people nobody would have heard of if they heard their names?
00:11:29.120 I don't think they're public, but they should be public.
00:11:31.840 I mean, with the amount of power that Facebook wields, there should be a government oversight board in each country where these, you know, global employees make decisions that affect each country, whether it's Canada or the United States or Venezuela.
00:11:50.420 So, yeah, they really should.
00:11:52.940 Facebook really should be accountable.
00:11:56.000 Now, they say they have an, I said advisory board, but that's the wrong word.
00:12:00.300 There's some kind of an independent board that they have.
00:12:05.140 And so they said, oh, we sent it to appeal.
00:12:07.620 We sent our decision for appeal to that body.
00:12:11.660 But I think, really, we should just break up Facebook.
00:12:15.060 There's some antitrust legislation.
00:12:17.320 We really should break up Facebook.
00:12:20.620 And, you know, maybe each country should have their own version of Facebook.
00:12:23.720 I know in Brazil, where I've done some interviews, they have different versions of different companies like Patria Book.
00:12:29.580 And there's one called Conservative Corps.
00:12:32.380 But, yeah, there's other lawsuits that are coming out.
00:12:35.360 I just saw this one about Steven Crowder.
00:12:37.020 Thanks for this one from Steven Crowder.
00:12:40.260 I think he just announced it yesterday or two days ago.
00:12:44.680 Yeah.
00:12:44.860 So they're suing.
00:12:45.700 Yeah.
00:12:46.140 I wanted to play a clip of that.
00:12:47.440 We can get your thoughts about that.
00:12:48.700 Just a good minute and a half of what I think the crux of their lawsuit is going to be.
00:12:53.420 You are filing.
00:12:54.560 We have already initiated and will be publicly announcing and making available for everyone a lawsuit that we're filing against Facebook, Inc.
00:13:01.040 regarding unfair competition, fraud, false advertising, and antitrust.
00:13:06.620 Woo!
00:13:06.940 You've done more than all the Republican senators.
00:13:08.920 A little bit, yes.
00:13:09.920 A little bit.
00:13:10.380 Wow!
00:13:10.600 And I know that behind the scenes we've been disappointed that there have been four years and not a lot has been done about big tech.
00:13:16.780 Can you clarify?
00:13:17.880 Because a lot of people go out and complain here about being banned or having something removed and they're filing some petition.
00:13:25.460 This is different from that.
00:13:26.820 We are filing, in fact, you have filed a lawsuit.
00:13:29.300 And it's available at louderwithcrowder.com.
00:13:30.680 There's going to be more information there.
00:13:32.380 We're going to be providing that.
00:13:33.700 We'll be keeping updates on folks.
00:13:35.100 That's why I'm going to be spending more time away from the show and focusing on the lawsuit as we move forward.
00:13:39.800 The reason why it's different is because we're going after Facebook based on its own words and its own promises.
00:13:44.740 It's a platform that was ever since 2016 when the Gizmodo article came out and said, oh, we're suppressing the feed.
00:13:50.460 We're taking certain views and we're going to suppress them in the trending and news topics.
00:13:54.920 We don't do that anymore.
00:13:56.040 That's what Facebook said.
00:13:56.860 We don't do it.
00:13:57.420 They told Congress we don't do it.
00:13:58.560 They told the consumers we don't do it.
00:13:59.800 They told us that they don't do it.
00:14:01.180 But over the course of the years, we've realized they actually are doing it.
00:14:04.420 And we've seen it from the election stream that was cut off from various posts and other things that have been suppressed.
00:14:09.560 Did we ever get a reason as to why that stream was cut off?
00:14:11.240 None.
00:14:11.440 So my question to you, Ryan, is just like Steven Crowder said, they're doing more than a lot of these politicians are.
00:14:19.400 I like to point as to one person as Ted Cruz as being somebody who is very vocally against the tech censorship and a lot of the stuff we've been seeing on Facebook, Twitter and YouTube.
00:14:30.320 But a question that most people ask and that I have asked is, why didn't the Republicans do anything when they controlled all the branches?
00:14:39.820 Are they getting stalemated by, let's call them, RINO Republicans or whatever you want to call them, people who don't believe in that, as part of a broader problem that we have?
00:14:52.200 Or did they just drop the ball?
00:14:54.040 What do you think happened?
00:14:55.020 Why do you think it's taking these lawsuits that we're hearing about now to actually hold these companies accountable?
00:15:01.080 Yeah, Andrew.
00:15:01.620 So, yeah, Ted Cruz has been a great proponent, along with Josh Hawley, a senator from Missouri, and they've pushed back against Facebook.
00:15:10.140 I think, yeah, for the first two years of the Trump presidency, we controlled the Senate and the House and nothing was done about it.
00:15:18.620 And this last year, late 2020, we had quite a few different hearings with congressional hearings and Senate hearings where big tech executives were questioned.
00:15:29.300 But everyone always asks, well, you know, they keep on having these hearings and asking questions and nothing happens.
00:15:35.040 In July of last year, I helped with a criminal referral against Mark Zuckerberg that was submitted by Congressman Matt Gaetz for alleged perjury to Congress.
00:15:48.460 Because in 2018, right after I started working for Facebook, Mark Zuckerberg testified that they do not censor political speech.
00:15:56.340 But I have evidence that they do.
00:15:58.240 And so, in another hearing this last year, Jim Sensenbrenner is a congressman from, I believe, Wisconsin.
00:16:07.780 And he was talking about how, well, we shouldn't punish companies that are successful, right?
00:16:13.180 So, I mean, being successful is one thing, but literally destroying the competition is another thing.
00:16:17.920 And, you know, that brings into the question, I mean, the issue of Parler.
00:16:26.620 We have Parler that just got deleted, pretty much deleted off the internet.
00:16:30.500 It was, you know, removed from the Amazon AWS servers, taken off the internet.
00:16:36.600 But, yeah, the Steven Crowder lawsuit is good because it's basically based on anticompetitive business practices.
00:16:48.580 I think antitrust might be the way to go.
00:16:50.460 I know also, I believe in December, 48 states filed a lawsuit against Facebook, an antitrust lawsuit.
00:16:58.400 So, if we can't get in with Section 230, I think antitrust might be the way to go.
00:17:02.460 So, I'm heartened by these various lawsuits coming.
00:17:07.720 Yeah, I hope it works out the same way.
00:17:11.600 Now, I was under sort of an impression, and I think at least a few people were,
00:17:15.720 that Facebook gave up trying to uphold their standing as a platform as opposed to a publisher.
00:17:24.220 And I guess from what I'm hearing from you is that I'm actually wrong about that.
00:17:27.100 So, there is a Laura Loomer lawsuit that she had with Facebook, which I believe, I don't want to misstate.
00:17:34.820 It was one of the Russian publications, either RT or Sputnik's article,
00:17:39.460 that said that basically Facebook in their lawsuit had admitted that they are a publisher
00:17:43.280 and therefore they can delete what they want.
00:17:46.640 But it sounds to me from what I'm hearing that I'm wrong about that.
00:17:48.920 So, I'll concede about that.
00:17:49.960 Another lawsuit that I want to talk about, because we're talking, these are all coming at good times, I'd say.
00:17:56.840 Candace Owens has started a lawsuit against, in particular, the fact checkers
00:18:01.740 that, you know, flag articles, get people taken off, their profiles taken off, community strikes,
00:18:08.480 things of that nature.
00:18:10.120 So, I want to play a bit of her clip that she posted.
00:18:12.760 I think she just had a baby, so she's probably not updating that right now.
00:18:15.740 But if we can go ahead and get to that clip, I want to get your reaction to yet another lawsuit
00:18:20.420 against Facebook's fact-checking company, I guess you could call it.
00:18:25.680 In 2016, hysterical liberals had to find someone to blame for their humiliating loss to Donald Trump.
00:18:32.740 In their minds, they could not possibly have lost due to their own horrendous candidates or policies
00:18:38.060 or their own failing message, so they attacked the one thing that they did not have total control over,
00:18:43.680 social media companies.
00:18:46.120 They applied extreme pressure to silence or censor fake news, which was just a fancy way of saying news
00:18:53.640 that they don't like.
00:18:55.500 Facebook bent to that pressure and created a fact-checker network with god-like powers over all of us.
00:19:02.480 Here is how Facebook fact-checking works.
00:19:05.500 A website you have never heard of, run by partisan beta-leftists, stalk the pages of your favorite
00:19:13.060 conservative personalities.
00:19:15.400 Whenever we say anything they disagree with, these fact-checkers write a vicious partisan hit piece.
00:19:21.740 Then they harass us and our audience by slapping hazardous warning labels on what we have posted.
00:19:27.800 Many times, those labels say missing context or disputed.
00:19:33.500 Yes, thanks Facebook.
00:19:35.540 Every political argument in America is disputed.
00:19:38.740 Every argument is indeed missing some context.
00:19:42.720 Now Ryan, Candace obviously knows how to make an entertaining video,
00:19:48.600 but how accurate is what she's saying?
00:19:50.940 Take us from the point of where Facebook decided it needed fact-checkers.
00:19:55.780 Was that during the election in 2016, or was that decision made before the election season
00:20:01.440 and just happened to be implemented then?
00:20:03.180 How did it come about?
00:20:04.020 Do you know?
00:20:05.240 As far as I know, there was not much fact-checking going on prior to 2016.
00:20:09.700 So from what I heard, you know, when I was working at Facebook,
00:20:12.440 the kind of the rumors and when I asked around,
00:20:15.120 people would say that after 2016 is when Facebook decided to move all the content,
00:20:20.260 a lot of content moderators to the United States.
00:20:22.760 So prior to 2016, there weren't many U.S.-based or Canadian-based content moderators.
00:20:28.760 Now, content moderating is different from fact-checking.
00:20:31.340 Just to be clear, I was not a fact-checker.
00:20:33.920 But yeah, this momentum to prioritize U.S. elections and North American elections started in 2016, 2017.
00:20:44.200 So Cognizant, the company I worked for, received the contract in 2017, a three-year, $200 million contract.
00:20:51.600 And so, I mean, there's a lot of money being spent, billions and billions of dollars on content moderation.
00:20:56.480 Now, as far as fact-checking, yeah, I believe, you know,
00:21:00.160 this would be a way that Facebook could essentially remove themselves from liability.
00:21:07.080 They could say, well, we didn't do the fact-checking.
00:21:08.980 We had this company do it.
00:21:10.100 But something that's really fascinating is that I believe is similar to what happened with these fact-checking companies.
00:21:16.460 Is, for example, we at Cognizant, we received guidance posts from, essentially from Facebook,
00:21:23.680 but it was posted by a Cognizant employee.
00:21:26.420 But I was told that they have a back channel to Facebook,
00:21:29.620 and they essentially just copy-paste the instructions from Facebook.
00:21:33.900 So Facebook, you know, emails the guidance to our supervisors, and then our supervisors post.
00:21:40.460 So, and we're using the workplace chat, the client tool.
00:21:45.180 So that's one way that Facebook removes the liability.
00:21:48.020 They can say, oh, well, we didn't give those instructions.
00:21:49.820 Well, yes, you did.
00:21:50.960 You gave instructions to a Cognizant supervisor who then told all of us to take a certain action.
00:21:57.280 And Facebook definitely prioritized, yeah, the 2020 election.
00:22:03.180 They created a new queue for content moderators.
00:22:07.560 And so, basically, more content was flagged.
00:22:12.000 They increased the amount of content that was going to be flagged because, yeah.
00:22:17.760 So, these are a few things that Facebook did.
00:22:21.800 They created a new civic harassment queue just for the 2020 U.S. election.
00:22:26.340 And I also saw that we were moderating content in Canada as well.
00:22:34.360 So, Facebook definitely prioritizes the elections, and we are their eyes and ears.
00:22:41.080 So, yeah, for example, before you, yeah, I guess I'll leave it at that.
00:22:47.360 So, what Candace is talking about there is they create, when there's a fact check,
00:22:51.600 and this has happened to us here, of course.
00:22:53.160 Like you said, Facebook is not responsible for the fact check.
00:22:58.320 Let's say you post an article about X, and then the fact check says, no, it's actually X, Y.
00:23:03.560 They create this article that says, here's why it's X, Y, and here's why you shouldn't believe what this article X says.
00:23:12.480 And a lot of that is completely subjective information, and a lot of it is such nitpicking.
00:23:18.120 And I want to bring up what I think is the original fact check that made it so that CNN has one, NBC has one, Facebook has one.
00:23:25.960 And, Justin, can we bring that up?
00:23:27.240 It's the famous acid wash fact check, and I'll never forget it.
00:23:31.680 Trump claims that Hillary Clinton acid washed her email server.
00:23:35.960 That's, it says, nope.
00:23:37.580 The truth, Clinton's team used an app called BleachBit, and she did not use a corrosive chemical.
00:23:43.140 And that, still to this day, this is what I see a lot of fact checks being on Facebook and Instagram.
00:23:48.140 And the problem with that is, when you say, it's not actually blue, it's a marine color, is that you're messing with people's reach,
00:23:56.980 which affects the amount of money they can make, which affects how many, their audience that they can reach,
00:24:01.540 which in turn will affect how many sales they might get.
00:24:04.260 Views turn into sales, they turn into subscriptions, they turn into merchandising, they turn into touring,
00:24:10.260 they turn into all sorts of things.
00:24:11.780 So, do you think that the fact checkers think, again, just like we talked about with Facebook,
00:24:18.120 do you think they're doing what's right?
00:24:19.640 Because the guy who started this website used to work for CNN.
00:24:23.620 I don't know if you're aware of that, but the guy who runs the main fact checking website.
00:24:28.780 So, do you think this was spawned out of an inherent bias?
00:24:33.180 Or, again, are we doing what must be done, Ryan?
00:24:36.120 Are we saving the universe from these slight differences in factoids?
00:24:41.780 Yeah, I think, I mean, the actual employees themselves are probably just trying to get a paycheck and make a living
00:24:49.420 and trying to make the client happy.
00:24:52.340 But, yeah, I don't think it's a coincidence that the former leaders from news companies are forming these companies.
00:24:59.080 There seems to be a very buddy-buddy relationship, kind of a close relationship with these tech executives
00:25:06.160 and the news organizations.
00:25:08.520 I mean, if you can control the narrative, then you can control so many things throughout the country.
00:25:14.120 So, yeah, with the fact-checking companies, they're definitely being influenced by these organizations, by CNN.
00:25:29.360 And it's, yeah, we should be looking to form a fact-checked jury.
00:25:33.980 Well, there shouldn't even be, I don't think there should be fact-checking in the first place.
00:25:36.680 I mean, it comes down to the idea of, you know, are humans intelligent enough to make decisions?
00:25:44.040 Yeah, there is misinformation.
00:25:45.980 Then if there's misinformation, then we should actually be focusing more money on educating people in the school system
00:25:51.020 so that they can have critical thinking skills.
00:25:54.240 So, it's essentially a slap in the face and offensive to say that we need fact-checkers
00:25:59.440 because it's saying that we're not capable of doing our own critical thinking.
00:26:05.760 So, I mean, yeah, and this comes down to, you know, what is Facebook's actual goal?
00:26:10.980 And I was a content moderator for two years working for Facebook,
00:26:14.700 and I saw them influencing on a foreign level, giving instructions to delete certain things.
00:26:21.820 And there's another whistleblower who came forward, who went public late last year with BuzzFeed News.
00:26:27.760 Her name was Sophie Zhang.
00:26:28.800 She was a Facebook data scientist, and she has said, well, she corroborates the fact that they are influencing elections on a global level
00:26:36.640 by their inaction and allowing political leaders to, you know, basically rig the system
00:26:46.780 and engineer Facebook using bots to manipulate elections.
00:26:52.540 But, yeah, what is Facebook's actual goal?
00:26:54.460 I mean, what is their endgame, right?
00:26:57.120 I mean, a lot of people ask this.
00:27:00.180 And, I mean, the only thing that really makes sense is going to sound kind of cliche,
00:27:04.940 but the only thing that makes sense is world domination.
00:27:08.100 I mean, they're a global company.
00:27:10.680 They say they're trying to unite the world or whatnot, but it kind of reminds me of Blade Runner.
00:27:16.820 You know, you have these global corporations that just take over, and they don't respect the rule of law.
00:27:23.140 They don't respect democratic processes, contrary to what Nick Clegg, the head of global affairs, says.
00:27:29.600 They don't care about the First Amendment in the United States.
00:27:32.320 They couldn't care less.
00:27:33.060 All they care about is their brand and making money and protecting their brand.
00:27:39.140 And right now they're running roughshod over other countries, over conservatives throughout Canada and the United States, throughout the world.
00:27:46.340 They're running roughshod over nationalist movements, just people who are patriotic,
00:27:51.320 who want to prioritize their country and make their country great.
00:27:57.780 And so that's what this comes down to.
00:28:00.080 I mean, Facebook has way too, regardless of whether you're on the right or the left, like Facebook has way too much power right now.
00:28:06.060 We need to rein them in.
00:28:07.880 They are not respecting the rule of law.
00:28:09.880 They're not respecting political leaders.
00:28:11.620 Florida just, well, the governor of Florida is passing legislation to fine the tech companies $100,000 a day when they deplatform political candidates.
00:28:26.420 So the fact that that's where we're at says a lot, where we have to take action against Facebook because they're deplatforming political candidates.
00:28:35.760 So that's part of the, once again, going back to the original statement from Nick Clegg, that's part of the democratic process is people run for office.
00:28:43.820 And in a public forum, you cannot silence someone running for public office.
00:28:49.000 They have the right to speak, to communicate.
00:28:51.340 And so Facebook is not following the spirit of the First Amendment in the United States.
00:28:58.240 Yeah. And a lot of this has to do with a lack of competition.
00:29:01.960 And as you just said earlier, they can just kick off almost anybody who dissents, and they're taking it out of the hands of the regular person, of you and me, to decide what we want to listen to and what we believe and don't believe.
00:29:15.460 One more thing about censorship that came kind of close to home, as if it hasn't come home to roost with you enough, is that you were suspended from Twitter.
00:29:25.060 I checked before we started this interview. You're still suspended. What day did that happen? And do we know why yet?
00:29:30.220 Yeah. So this past Thursday, um, yeah, this past Thursday I got suspended and I don't know why. Like, I posted a link to, um, what was the name of it? Uh, pocketnet.app.
00:29:44.980 So yeah, I got suspended this past Thursday, January 28th. There was a couple of things I shared that week that maybe were questionable.
00:29:52.600 I actually did post or share a screenshot of when I filmed at Facebook last week as well. So maybe they looked at that, but, uh, I mean, this is, yeah, it's, it's crazy.
00:30:04.380 So I had about 35,000 followers. I have about, about half of that's probably in Brazil. Um, but yeah, after January 6th, there were a lot of people purged.
00:30:16.440 So in early January, I was at about 43,000 followers and then I dropped down to 35,000, and then they banned, they suspended me, but I emailed them. I'm going to probably be sending them a legal letter this week to their headquarters.
00:30:29.240 And, uh, yeah, it's just super ironic. It is super ironic, Andrew, that they suspend the, uh, president of the Hartwig Foundation for Free Speech.
00:30:39.580 Uh, you get it, I'm a free speech foundation. My purpose, I'm a nonprofit in Arizona, I'm going to apply for 501(c)(3) status, but my purpose is to advocate for free speech on the internet.
00:30:49.600 And guess what happened to me? They suspended me. So I think that's kind of funny.
00:30:53.660 Yeah. It seems like the only way at this point to move forward with these people is litigation. So, Portuguese, is that what you speak? If you're that big in Brazil. I saw an interview with you, I wasn't sure if it was Portuguese or Spanish.
00:31:07.640 Yeah. So I do speak fluent Spanish. I'm actually more comfortable in Spanish. And I've done some interviews in the last six, seven months in Argentina and Colombia. There's a guy in Argentina named El Presto, who's like the Mark Dice of Argentina, or the Steven Crowder of, uh, Argentina. And he actually got thrown in prison for, like, uh, you know, speaking ill about the president.
00:31:32.680 Well, that's a good sign. He's, he's out now, but, um, yeah, so I speak Spanish fluently, very fluently. I lived in Mexico city for a couple of years and I speak some Portuguese as well. Like a little, maybe not, not, not as much as the Spanish, but I've done a lot of interviews in Brazil.
00:31:49.240 So Brazil is facing a huge amount of censorship, um, and has been over the last couple of years. So President Bolsonaro actually, uh, used social media to his advantage, much like Trump, and won the election. Cause the media in Brazil, of course, is not going to, I don't think they have as much of a conservative media as we do. Like they don't have the Fox News or whatever.
00:32:13.220 Um, or what Fox News used to be. Right. Um, so yeah, support in Brazil has been big. I went down to Brazil about four months ago, in September, and met with a congressman down there and we talked about big tech censorship.
00:32:27.560 And, uh, they wanted me to try to help them, uh, to file a lawsuit or have U.S. senators take action, or an executive order against Brazil's Supreme Court.
00:32:40.860 Cause their Supreme Court was basically sending the police to conservatives' homes, conservative donors' homes and, uh, confiscating their laptops and prosecuting them just for financially supporting, uh, conservative media.
00:32:56.420 What is his, uh, term there? Is, is it five years in Brazil, the term for the, for Bolsonaro or when is his, like, when does the next election happen?
00:33:07.160 Yeah. His next election is next year, 2022. So I believe it is every four years. Um, I may be mistaken, but yeah, he's coming up for election next year, and Brazil just had some local elections where they're actually using the same Dominion software voting machines.
00:33:24.580 Oh, that's good. But yeah, unlike here where we can actually do some, uh, physical counting of the ballots, there are absolutely no physical ballots.
00:33:35.940 It is all done electronically, and there can be no third-party audits from, uh, outside companies.
00:33:42.020 That's insane. So that's one of the biggest problems I have with the voting system in the U.S. Like in Canada here, you go in, you put an X on the piece of paper,
00:33:51.940 you hand it to the senior citizen that's taking your ballot, and that's it. There's no, I think they use a counting machine, uh, but there's no Dominion software.
00:34:03.240 Thank God. But they did use that for the Conservative Party election here. So who knows? And, uh, Dominion has, or had, headquarters here in Toronto, which is always great news.
00:34:15.680 So one thing I wanted to ask you about Parler, you mentioned it earlier. How did this happen? Do you have any inside information on that?
00:34:24.540 Obviously the Amazon web servers, which is a thing probably a lot of people didn't know about. I mean, it's more advertised in Europe.
00:34:32.160 I see soccer fields have AWS on their score displays and stuff. And I don't think a lot of people knew that, uh, Amazon was hosting here.
00:34:42.000 And do you think it was purely political? Did they come out with a reason as to why? I know that some people wanted Parler to agree to certain rules. Do you have information on that?
00:34:53.320 That's a good question. So, uh, yeah, the Amazon AWS servers suspended or removed Parler. Now, Gab was smarter from the get-go.
00:35:02.860 So they had their own servers built out and Andrew Torba, the CEO of Gab is doing a great job of trying to create this parallel infrastructure.
00:35:11.720 So that's what we really need is a parallel infrastructure, kind of a dual culture slash, you know, infrastructure for technology.
00:35:19.000 If they're going to keep on, you know, censoring us, removing people. Um, so, but yeah, with Parler,
00:35:24.780 I know they were getting really, really popular, you know, leading up to January 6th, the event, uh, in Washington, D.C., where Trump gave a speech.
00:35:35.580 And I try to avoid using the word riot, and of course I would never use the word insurrection, because, you know, it is a mischaracterization of the event.
00:35:44.380 I was actually there on January 6th. So I flew in on the 5th and I was there through the 7th.
00:35:48.740 And I mean, there was like probably 2 million people there, and you have like a handful of people, maybe a couple of thousand, who were being violent.
00:35:56.940 So it was, yeah, it was very, very peaceful. Um, so yeah, we had right after that event is when the crackdown began.
00:36:06.300 So they use that as an excuse. They use the mischaracterization of the peaceful protest as an excuse to ban Parler.
00:36:15.400 And so their argument is that Parler was allowing objectionable content, allowing people to incite violence.
00:36:24.520 Now, Parler does approach, I believe they approach their content moderation a different way, but they don't have the billions of dollars to spend on content moderation
00:36:34.240 like Facebook does. Facebook probably has at least 10,000 content moderators in the U.S., if not more.
00:36:41.060 And so that's another reason why we should not repeal Section 230, because if we repealed Section 230, then you would need a whole lot more moderation.
00:36:51.340 The larger businesses with more money would basically be in a better spot to capitalize on any new legislation.
00:37:01.220 So that's why we shouldn't repeal Section 230, because if we started the legislative process again, Facebook would have an advantage. But yeah, Parler, um, yeah.
00:37:12.740 So Parler got targeted because everyone was on Parler.
00:37:16.540 I mean, you got a lot of big conservative voices on there.
00:37:19.500 Dan Bongino, I think, was really big on Parler.
00:37:22.060 So we've never seen Amazon act this openly.
00:37:30.100 Normally it's been the other companies that have been more blatant about their censorship.
00:37:34.100 So I was fairly shocked by it.
00:37:37.100 Um, yeah, Jeff Bezos usually, I mean, it's a service we use all the time.
00:37:42.400 Um, yeah, Parler.
00:37:45.700 Yeah.
00:37:46.100 They got suspended and removed from the AWS servers.
00:37:48.920 So John Matze, I was going to do an interview with him in Brazil, actually, on one of the largest conservative YouTube channels in Brazil, uh, TerƧa Livre.
00:37:57.180 But I think, yeah, John Matze, he had to go into hiding because the left targeted him.
00:38:02.740 So he had, he was very concerned for his physical safety.
00:38:05.680 So we've seen a lot of things since January 6th.
00:38:08.500 We've seen a huge backlash, cancel culture against conservatives.
00:38:12.260 Um, Mike Lindell, the, the founder of mypillow.com, very close with the president of the United States.
00:38:20.840 He, um, Mike Lindell is being, you know, he's had a lot of backlash.
00:38:25.040 They're trying to cancel or different companies are canceling his pillows in their stores.
00:38:29.700 I met with him about a month ago in late December at an event in West Palm Beach, Florida, with Turning Point USA and Charlie Kirk.
00:38:36.900 But, um, but yeah, um, there's just been this huge pushback and, and they're arresting a lot of people who were at the, uh, Capitol event.
00:38:45.740 But yeah, Parler, um, look, like going back to it, like Antifa was on Facebook for years and years.
00:38:52.500 They were organizing and Facebook knew nothing about it.
00:38:55.360 And then you're going to say, that's okay.
00:38:58.700 But the second that a new social media platform pops up, you're going to, you know, take down a few conservatives who, I don't even know if they were even inciting violence.
00:39:07.920 They were just mentioning they were at the event, but that suddenly becomes violence.
00:39:12.220 But, you know, for years and years, the left has always said, well, if you want to, if you don't like our platform, create your own.
00:39:17.800 Well, we did, we created Parler.
00:39:19.560 And then what did you do?
00:39:20.520 You deleted it off the actual internet.
00:39:22.540 So what's taking it so long to come back?
00:39:27.180 I think, well, the word was that they're not coming back at all.
00:39:31.640 I don't know.
00:39:32.920 Hopefully they can find a server that can host them.
00:39:36.220 They were going to be hosted.
00:39:37.500 I think they were going to find a Russian server that would host them.
00:39:41.040 But of course the media is quick to point out that there's other, um, like truly racist and abhorrent groups that are hosted by some of these Russian servers.
00:39:50.620 Um, so it's almost like a crime of, uh, of just, just being in the same room with these people.
00:39:58.460 Oh, you can't be on that server.
00:39:59.940 They, they also host other content that's bad.
00:40:01.760 So guilt by association is basically what the left is doing.
00:40:05.820 That's the left's tactic right now, to guilt Parler into...
00:40:11.100 Yeah.
00:40:11.720 But I know the good news is Parler, I think, did file a lawsuit against Amazon.
00:40:16.880 So hopefully they can, uh, come back at some point.
00:40:21.200 Yeah.
00:40:21.620 I hope so too.
00:40:22.940 And because I liked Parler more than I like Gab and Minds, no offense to them, but I just felt like it was just alt Twitter where I'm not getting censored.
00:40:32.600 It was much easier to grow on there.
00:40:34.180 So the last thing I want to leave you with is a little bit of fun for me, and you don't have to answer these questions just because you're a whistleblower.
00:40:42.460 And I know you want to keep being as honest as you can and not be associated with things that we just talked about.
00:40:50.180 I want to ask you about the conspiracies about Facebook.
00:40:53.280 There's long been this conspiracy about Facebook and the DARPA program LifeLog, which ended the day Facebook started.
00:41:01.800 And, you know, Mark Zuckerberg, they had the movie about how Facebook actually started, you know, they presented it in a certain way.
00:41:09.660 Is there any truth to the, that, you know, to the government having a hand in Facebook?
00:41:14.300 Is it actually just a data mining company?
00:41:16.780 Is that, is that the point that they don't really care about communication with the masses?
00:41:21.020 They just want to be able to sell data.
00:41:23.040 Can you touch on any of those points or would that get you on a, on a target list?
00:41:28.520 No, I think I can talk about that some. I mean, I'm watching this show on Netflix right now called, uh, Spycraft.
00:41:35.280 It's here in the U.S. It's really good.
00:41:36.640 And it talks about like the CIA's tactics throughout the years for spying on people.
00:41:39.820 And like, so, I mean, if it, you know, if it were, if Facebook were a CIA program,
00:41:46.680 I mean maybe, who knows, maybe it is, I mean, it would be very, very useful.
00:41:51.660 And, you know, if, uh, if Mark Zuckerberg were a CIA asset or whatever, well, he is, he's an effective CIA asset.
00:42:01.760 Right.
00:42:01.900 I mean, like, if you're going to go down that road, like, yeah, you know, Mark Zuckerberg does a good job in doing what he does.
00:42:08.360 Uh, he reads, you know, he just reads a script or whatever, um, follows the rules and, uh, his own rules.
00:42:15.780 And it's funny, cause I think their motto at one point was to move fast and break things.
00:42:20.900 Um, and they've definitely broken a lot of things.
00:42:23.840 It's funny how they all changed their slogans further down the line.
00:42:27.920 Wasn't Google's "don't be evil,"
00:42:29.520 I believe, at one point? And then who knows what, uh, Microsoft's was, but I feel like a lot of these companies have started on not so honest terms.
00:42:40.940 So it's hard for me to believe that, oh, Mark Zuckerberg's actually this great guy, even though he shafted the, uh, the Winklevosses. And Bill Gates is an amazing guy, but he screwed over his buddy there.
00:42:53.340 And, uh, I think after they left Xerox, he took the model of the Apple.
00:43:00.840 So, yeah.
00:43:01.940 And then you have the, the Microsoft, uh, lawsuit in the nineties where they were building stuff to fail.
00:43:08.540 It's crazy stuff, Ryan.
00:43:09.860 And I'm glad that we have a person who's willing to speak about this, especially with the rest of Project Veritas,
00:43:15.700 which usually has everything covered.
00:43:16.980 I watched during the election, of course.
00:43:18.680 And then as you mentioned, all that stuff gets thrown out.
00:43:21.080 So it's a great, great system we're living under.
00:43:24.300 So you want the final word here, Ryan?
00:43:26.420 What should we, what should we be looking out for in terms of these Facebook lawsuits and anything with Project Veritas or anything you're working with?
00:43:33.980 Yeah, I think lawsuits are the way to go.
00:43:35.880 I mean, we want to exhaust all of our options before we, you know, before we go take the other route of, uh, you know, like an Arab Spring type route of an insurrection, or an actual insurrection.
00:43:49.620 Not a fake insurrection with someone wearing, uh, Viking horns, right?
00:43:54.380 But, uh, yeah, we want to definitely, uh, exhaust all our legal options.
00:43:58.700 I think that's the way to go.
00:43:59.900 But we've seen that the U.S. Supreme Court has not acted and not taken decisions on the things they should have.
00:44:06.040 So I, we're kind of at a breaking point in the United States.
00:44:08.500 I hope we don't just, uh, you know, lay down dead and, and give up.
00:44:15.460 Um, so lawsuits are the way to fight right now.
00:44:17.620 I know Project Veritas, I'm sure they have lots of, lots up their sleeve right now.
00:44:21.160 Um, but once again, you can, you can present all that evidence and get it to the courts.
00:44:26.320 But, um, if the courts don't do anything about it, then what, then what, what next?
00:44:30.340 But yeah, I'm, I'm optimistic about this year.
00:44:32.640 I think it's going to be a great year. Uh, having, you know, someone to unite against with Joe Biden makes for an interesting, uh, interesting battle.
00:44:44.660 So yeah, I'm, I'm optimistic about this year.
00:44:46.740 I think we're going to do great things, uh, but we really need to fight against big tech.
00:44:50.520 So if you want to learn more about me and what I'm doing, you can go to ryanhartwig.org.
00:44:55.780 And, uh, yeah, we're in this fight for, for the long haul.
00:44:59.020 And I think we're going to do a great job in, in fighting, uh, tech censorship.
00:45:02.980 Yeah.
00:45:03.420 And I'm glad, uh, glad to have had the chance to speak to you, Ryan, and I'd encourage people to go to your YouTube channel.
00:45:09.180 I think it'd be great to hear you do, uh, short bites about just stuff that people wouldn't normally know about these social movements.
00:45:16.740 So thanks once again, and we'll talk to you in the near future and, uh, hopefully things do get better in 2021.
00:45:23.060 Thanks a lot, Ryan.
00:45:24.380 Thanks, Andrew.