00:13:54.440It was a third party vendor named Cognizant Technology Solutions.
00:13:59.500And so Cognizant had a deal with Facebook that they were going to take a part of their content and they were going to work with it offsite.
00:14:11.580And we did have some people from Facebook stop by occasionally to see how everything was going.
00:14:17.100I remember I saw one of Mark Zuckerberg's personal lawyers there once.
00:14:22.920But for the most part, while we had the Facebook tech to do all the work and we had all of Facebook's content, the site was managed directly by Cognizant.
00:14:33.580And so going forward with that, it was in Tampa, Florida, and it was in this place called Woodland Center.
00:14:42.880So it was just this kind of giant parking lot, kind of just in the middle of nowhere.
00:14:48.020And it was just this small building sitting there.
00:15:16.780They would just can you if they saw a phone, because the way that they describe it is, all this is other people's sensitive content.
00:15:23.060So they don't want anybody recording, taking notes, anything, which I understood for like the phones.
00:15:28.800But for like pencil, paper, they didn't even allow that in there.
00:15:32.720So you go in there and basically you have these two incredibly small bathrooms that only have about two toilets and two urinals in the men's bathroom.
00:15:42.620And I believe just two toilets in the women's bathroom.
00:15:46.480And this place housed about a thousand people.
00:15:49.540So it was terrible conditions for the bathroom.
00:15:54.740And so you have these two small bathrooms.
00:16:06.540They did set up this little store there and every week they would give you $20 on a card and you could go eat whatever you wanted to as long as you didn't go over your $20 at this little store.
00:16:20.320But for some reason, all of the food there was junk food.
00:16:26.480And sometimes they would have like fruit cups, salads, but all the people that were higher up got all those.
00:16:32.320So it's like the team leaders, the trainers, they were able to get the good food.
00:16:36.780And so everybody else was really just eating junk food all day.
00:16:39.260And what really kind of hurt me a lot is I would see this graphic content and what my people would tell me is like, hey, go grab something there.
00:17:26.340They say that they wanted it all open, but it was just nothing but computers.
00:17:33.200And there was nothing else there for us.
00:17:36.200They had this one tiny room in the back corner, what was supposed to be like a calm down, relaxing room where it's like, if you need to calm down after seeing this content, you could go in there.
00:17:48.560They had like a little billiard table and all that.
00:17:51.640But no one was ever allowed in there because they would like time you obsessively if you ever got up from your seat.
00:17:58.700So it would be like if you go to the restroom, they would literally put a timer on your computer.
00:18:04.620And if it goes over the amount of time, you're going to get yelled at by the boss for going over the time.
00:18:10.340If you're on a break and you're like a minute late from your break, the computer alarm will go off and it's immediately going to tell your supervisor that you're not there.
00:18:19.540And even if you're just like right around the corner, you're going to get another warning, you're going to get, you know, docked in pay.
00:18:26.600So they were very like obsessive with you being at your computer at all times.
00:18:32.020In fact, what was just really ridiculous with it is how there were people there that wanted to just get up and walk after spending like four hours doing nothing at the computer.
00:18:46.220And like the bosses just would not have it.
00:18:49.340They just wanted to have a body in the seat.
00:18:51.920They didn't really care how that body was doing.
00:18:54.540They just wanted the person in the seat.
00:19:44.820Facebook is not holding them accountable.
00:19:46.460In fact, it was Cognizant that was the one that backed out of the deal with Facebook about being a vendor for their content.
00:19:54.240Facebook was not holding them accountable at all.
00:19:57.360Cognizant was able to do whatever they wanted without really any interference as long as a certain amount of content was being looked at every single day.
00:20:18.200What percentage is successful to them?
00:20:19.880So a percentage that was successful to them is they wanted each and every employee to do about 500 to 1,000 pieces of content every day.
00:20:30.700And I was usually doing about 100 to 200 pieces of content on a day where there was not much going on.
00:20:38.520Because going through this content, when you're in a specialized area, you have to not only just action it, you have to go through the Internet, find out the source of it.
00:20:49.320If you can find it, verify if it's real or not, make sure you have the links, make sure you say how you found it, and then write up a report on it.
00:20:58.740So a great example would be like, I'm finding a lot of these terrible bestiality videos.
00:21:05.600And I was able to find a bestiality website, and I was able to find those same videos from there.
00:21:10.400So I was able to find the source of it there.
00:21:12.440So then I would write up a report saying that this was not an original Facebook piece of content.
00:21:16.920This was something that was taken from another website.
00:21:19.720And then I would write up my report saying why it would be taken down, what the actions would be, what it breaks in the policy.
00:21:27.400So that does take up a good amount of time, especially when you're basically going through the whole Internet looking for a piece of content like that.
00:21:34.540But there were a lot of times when my trainers would tell me, go slow, do it, you're doing great.
00:21:42.960And then I would have the team leader telling me, no, you need more content, more content, more content done.
00:21:47.720And so I was getting two different perspectives on what I needed to do there.
00:21:54.180There was one that was telling me I needed a bigger number of content.
00:21:57.700And then I had another group of people telling me that I needed to just focus on the accuracy of what I was doing because these were real people, real animals in these situations.
00:22:08.540It was really pretty much the ethical side versus the business side.
00:22:12.480I'm looking at this Tampa Bay Times article that says Facebook agrees to pay $52 million settlement with content moderators who suffer trauma on the job.
00:22:20.800Former and current moderators in Florida and across the country will receive $1,000 each and may be eligible for more money to cover medical treatments and damages, et cetera, et cetera.
00:22:30.240Is this, did you see this actually become a reality?
00:22:33.620Like did you guys see payment from Facebook on this $52 million?
00:22:50.420I originally didn't want to, because I didn't want anybody to come away thinking that I'm just coming out and saying all this stuff for money.
00:22:57.660Like there's a message I wanted to bring about this.
00:23:00.720But I decided that it probably would be best, especially after everything that's happened to me.
00:23:05.520And then there's another article that says a total of 556 employees will be laid off early next year from a controversial facility near Carrollwood that monitors Facebook for banned content such as hate speech, bullying, threats, and videos of violence against animals and children.
00:23:21.480Cognizant Technology Solutions, a contractor for Facebook, plans to close its operations at 7725 Woodland Center Boulevard, about two miles north of Tampa International Airport.
00:23:52.700Why did they drop Facebook if it's an account that's paying you good money?
00:23:57.000I believe it was just the bad amount of publicity that they were getting.
00:24:00.100Last I heard about it is their stocks were starting to drop, and there was a lot of negative news going on about how they were treating everybody by being a vendor for Facebook.
00:24:10.240So they decided just to drop the account altogether.
00:24:14.080Is this the same Cognizant company that's a publicly traded company doing like $16 billion a year? Like, doing revenue of, yeah, $16.65 billion in 2020.
00:25:06.320It looks like a reasonable company that does good things, and they decided to cut this relationship.
00:25:12.640So let me go a little deeper with this.
00:25:14.940So, one, you explained what it was like for you guys.
00:25:18.060I can only imagine how challenging it could be.
00:25:21.640And you were explaining the fact that there were feces on the bathroom wall and, you know, in the stalls, where you wash your hands, and you said sex in the bathroom.
00:26:14.100They showed some memes, like, that had to deal with, like, if you had, like, pineapple on your pizza, you would die.
00:26:22.060Or it was just the incredibly benign things like that.
00:26:24.840There was nothing that was, like, overly bad.
00:26:27.160The biggest thing that they showed in that book, and it was a picture that CNN took of this little boy, and I believe he was taken from the Middle East on a plane, and he was all bloodied up, and he was all dusty, and they showed us that.
00:26:42.180They said, this might be the worst it gets.
00:26:48.140And so, that was as far as they showed us, saying, like, this is going to be the type of content you'll see.
00:26:53.600So, of course, I'm thinking, if that's the worst, I can do this, right?
00:26:58.840In retrospect, I do believe that they were just hiring bodies to fill the seats.
00:27:04.520I do not think that they were hiring the best people.
00:27:08.860I think they seemed like they were fortunate that they had someone like me that had a college degree, that had a background in going into public records, that was able to do the stuff they wanted me to do.
00:27:20.720Because a lot of my job at the Graphic Violence Division was to go through content, and if it was in the United States, I would go through the accounts, I would go through their LinkedIn, their Facebook, Twitter, anything, any sort of public accounts.
00:27:37.960I would go look at any information, look at their property records, housing records, go through any sort of utility bills that I could find, and I would basically write a portfolio of them.
00:27:47.780And I did this a lot in college when I took public affairs reporting, so I was very familiar with how to, like, find public records, go through these areas and do this.
00:27:56.760So that was one reason why they wanted me on the Graphic Violence Division, was because I had a good bit of knowledge on how to basically, as they put it, find the bad spot on the apple.
00:28:09.360So you start off as the people at the bottom as general queue, which is meme, stuff like that comes up.
00:28:17.360You get a promotion, you become a social media content analyst, and you start seeing stuff that just makes no sense.
00:28:25.600Walk me through the hierarchy structure.
00:28:35.860It's more just like the general queue is at the bottom, and what you specialize in would just be all a big circle that is above the general queue.
00:28:46.240So it's not so much that there was anybody like another department that was above me.
00:28:53.080It was very much just different departments that focused on different types of content at that point.
00:28:58.400And those departments were very small.
00:29:02.760So in my department, it was me and seven other people.
00:29:06.900And we were the whole Graphic Violence and Hate Speech Division.
00:29:11.500They had one that was for bullying that had just 10 people on it.
00:30:17.100So, okay, so when that happens, it's like that video that took place where they left it on.
00:30:23.000Did they leave it on in specific countries, or did they leave it on across the board, anywhere, America, all of that?
00:30:29.320Or did they filter America out because that's the one place they didn't want to leave on?
00:30:33.000They left it on everywhere, all countries.
00:30:36.800And the specific reason for that is because Facebook has hyper-specific types of policies.
00:30:43.700And for graphic violence, what they say is videos that are not in a medical setting, and you can see visible innards, are not allowed.
00:30:53.980And so according to this, this was a video, this was not in a medical setting, but they claim that the skull fragments and the bit of brain matter that came out was not visible innards.
00:31:08.720So they said it could stay on, and so they allowed it.
00:31:18.000What other videos that obviously made no sense to stay on, they would leave it on?
00:31:22.640What other videos did you see where you're like, okay, there's, like, for example, the pig video you talked about, did that one get taken down?
00:31:29.000So that one did get taken down, but it got taken down specifically because that was not one that was in the general area of where we get our content from.
00:31:40.600That one was in a private Facebook page.
00:31:43.420And this was what I really specialized in, was that there were a lot of private pages.
00:31:49.280And they would be named something like, I love dogs, or let's go bulldogs, or anything like that.
00:31:56.400And so you would just see it, it would be like a very general name, you wouldn't think anything bad about it.
00:32:26.360They were people that were making these videos.
00:32:29.720And they were inviting people into the page to buy and auction the videos.
00:32:35.400So the hardest part about that was that they were using the Facebook payment system.
00:32:43.540They were also using other means of getting currency.
00:32:47.120And they would have a horrific video, such as the video of the pig and the girl.
00:32:53.620And people would actually bet an auction on how much money so they could get the full video.
00:33:00.900And then that person that made the original video would also take requests from others and make content tailored to them.
00:33:15.180And a lot of these, this type of content that was homemade by the parents, by the children, by the people that were making money off of it.
00:33:24.460This was stuff that was just in the private area of these private pages.
00:33:29.740And Facebook would not allow me to do anything about it.
00:33:33.140Because even if I wrote the report, even if I could prove that they were in America, even if I had their driver's license, their face, their license plate, their home address,
they said that it would look bad for them.
00:33:48.440And they said that an internal Facebook team would look at it.
00:33:52.040And yet that same content was still on there.
00:33:56.920So are you saying that if they really wanted to catch the bad guys, they could, because they could figure out where it's coming from?
01:02:08.380So if we had someone that was not him in control, then perhaps we could actually get something done.
01:02:16.480Because I don't see Zuckerberg ever bending the knee, so to speak, to actually make these changes.
01:02:24.980What I think needs to be changed could be Facebook needs to be divided up into different segments.
01:02:32.220And then have these different independent individuals actually be in charge of these segments.
01:02:38.380So maybe we really do need some sort of government influence looking into this.
01:02:44.800I know a lot of people will claim that that would be like an oversight of the American government.
01:02:50.460But when something has grown this big and it has such an importance in not just our society, but world society,
01:02:59.260maybe it needs to not be a private company and maybe it should be some sort of public entity.
01:03:03.860I mean, Facebook is so big, along with other social media sites, that maybe we shouldn't consider it a private company.
01:03:15.620Maybe we should consider it as something as a public square.
01:03:18.820Maybe it should be considered something that would be the equivalent of going out in public and saying something and not something that is controlled by a private entity.
01:03:29.520The only other thing I could possibly think of Facebook to do is maybe Facebook just needs to do a total revamp of their policies.
01:03:40.980Because when I was working there, even though the policies were changing by the day, they were only changing by specific current events and they would only change for like specific people.
01:03:49.940So like a great example would be during the Justice Kavanaugh hearings.
01:03:57.620Normally, you're not allowed to make any sort of disparaging bullying comments about people that claim that they are survivors of rape or molestation or anything like that.
01:04:08.540But Facebook made an exception with Dr. Ford.
01:04:11.580So you were actually allowed to make fun of her for that.
01:04:32.400And for normal cases, you're not because it would get deleted.
01:04:35.140But those were like the examples of like policies changing for specific people or specific events and their policies were so hyper specific.
01:04:46.580I think in a normal ruling body, we could all agree that, you know, killing animals online should be banned.
01:04:56.740I don't think there's any reason why we should have it on there.
01:05:01.460But apparently, there's still like a lot of stipulations like you can kill animals on video as long as it's in like in eating, preparing food setting.
01:05:13.160Or you can also kill animals if it's in a perceived self-defense setting, where there's a lot of people that took advantage of those and they just like mutilated a bear because they claimed that they were hunting a bear in self-defense.
01:05:25.860And so they had a caption that said, like, you know, we're killing the bear in self-defense.
01:05:32.980And they were like ripping its jaw off and like just torturing the thing while it was still alive and drugged up.
01:05:38.580And Facebook allowed that because the caption said, oh, it was in self-defense.
01:05:50.960Do you think there's any chance that maybe a large country like China has additional motives on the inside, where they want to hurt this great country of America and want to figure out a way to internally destroy the younger generation?
01:06:13.040You know, any do you think China has any influence over Facebook?
01:06:28.520You know, sometimes you read articles and it says the hands a country like China has on these major social media companies.
01:06:37.380And the other part is typically when you see one person gets off on a site, everybody else follows suit.
01:06:42.840Do you think there is a coalition amongst all the major social media sites where they work together where if one bans somebody, the other follows as well?
01:06:52.140But only when it's people that are in a higher position.
01:06:55.380So if it's just, like, a regular Joe, they're not going to do that.
01:06:59.440But if it's someone that is more influential, like, as you said, Alex Jones before, of course, they're going to follow suit on that.
01:07:05.960Of course, they did the same thing with Donald Trump.
01:07:08.380If it was a different sort of celebrity, I'm sure that they would have all gotten together and gone like, yeah, let's all do a joint ban on him.
01:07:16.520Do you think Trump should have been banned or Alex Jones should have been banned?
01:07:21.280So that one's a difficult question, because, if I understand correctly, Alex Jones was the man who claimed that the Sandy Hook shooting was a false flag, and that the kids that died in it were actually not killed and are still alive somewhere, and the parents are lying.
01:07:40.380So Alex Jones, I don't know his specifics of everything he said.
01:07:49.360Has he claimed any sort of real-world harm against anybody?
01:08:40.340So what happened at the Capitol was incredibly weird.
01:08:45.000I personally do think that he riled up his fans to go do that.
01:08:51.420I don't really know if there was an ulterior motive to it, but I would consider him to be the one that, you know, lit the match, so to speak.
01:09:39.440And I understand he had broken other policies before.
01:09:46.500But as a public figure, especially the president of the United States, it's really difficult to say, like, if you're allowed to break it.
01:09:55.300If we treat him like a normal person, if he did not have the checkmark next to his Twitter account, yes, he should have been banned.
01:10:02.000Because as a normal person, he was breaking the rules and the policies.
01:10:07.420But since he was the president of the United States, I feel like the social media companies should have had more of a sit down with him and say, if you're going to do diplomacy through social media, we need to set something up that can actually make that happen.
01:10:24.920Maybe social media needs to grow and figure out how to do diplomacy through Twitter.
01:10:33.400Maybe that is an idea that we think is really stupid, but it also should be something that maybe we should actually think about.
01:10:42.980Maybe it could be realistic in the future.
01:10:45.500Because if you're doing it through social media and Twitter, at least you're, like, speaking to the public and actually getting your point across so everybody knows where you stand.
01:11:58.320So I heard it, and there was constantly bickering and fighting in between there.
01:12:04.840And I can say for a fact that some of the team leads I worked under, they were politically motivated because they would flat out tell us their thoughts.
01:12:13.640So, yes, the moderators I did work with were politically motivated.
01:12:18.340Sean, I got to tell you, I've really enjoyed talking to you.
01:12:23.500I haven't enjoyed the stories, like, visually for my mind to go there, and, you know, it's extremely disturbing.
01:12:30.260But your approach on you having the courage to go out there and talk about this, where you're getting the audience to be thinking about these issues that are day-to-day on a platform that we all use on a daily basis.
01:12:43.860All of us are on Facebook, Instagram, Twitter, YouTube.
01:13:18.880For all those that are watching, thank you very much for tuning in and hearing my story out.
01:13:25.660If I could get any sort of message across, it would be that going forward with social media, we need to be more careful with how we go about it.
01:13:37.300We need to be more careful on how we interact through social media.
01:13:41.460We need to be more careful about how we treat others through social media.
01:13:46.580And finally, the biggest point I wanted to get across about this is Facebook has this horrible content that I have spoken about with animals, people, babies, toddlers, women, children.
01:14:00.520And Facebook refuses to take action on it.
01:14:04.300And for anyone that's listening that has any sort of anger from this like I do, please, let's try to stop this.
01:14:16.160Because the last thing that we want is for more of this violence to happen on there and more people to make money off of animals and people suffering.
01:14:39.460Like, yeah, we can check your messages.
01:14:41.800And then there's the one-sided part with the moderators: their job, how it looks from the outside, how they report it, what stays up, just because it's a private group.