Valuetainment - October 05, 2021


Facebook Whistleblower Reveals Censorship Guidelines for Moderators - WARNING!


Episode Stats

Length: 1 hour and 15 minutes

Words per Minute: 172.42218

Word Count: 12,990

Sentence Count: 787

Misogynist Sentences: 20

Hate Speech Sentences: 14


Transcript

00:00:00.120 So I want to give you a warning before you watch this.
00:00:02.040 This is probably the most uncomfortable interview I've ever done in my life.
00:00:05.840 And I've interviewed a lot of different people.
00:00:08.040 I want to warn you, you may not want to watch this video, because the stories this moderator
00:00:13.940 tells of what his job was at Facebook, the videos he was watching every day, how
00:00:18.880 his job was to report to Facebook to take these videos down, and how many of these videos
00:00:23.200 were left up, are going to leave an image in your head.
00:00:29.440 So if you don't want that kind of a thing, just don't watch this.
00:00:31.860 Now on the flip side of it, this was scheduled to be released a couple weeks from now, and
00:00:36.880 it's being released today because, as you're watching the video, today
00:00:41.820 is October 4th, and Facebook's been down for the last few hours, all day.
00:00:49.160 Facebook and Instagram have been down.
00:00:51.160 And I don't know what this has to do with that, if anything.
00:00:55.580 I just think it's something you need to watch and know about, because this stuff bothers me a lot,
00:01:01.200 because this kind of stuff is affecting innocent kids.
00:01:05.340 And I think the people at Facebook can do something about this.
00:01:08.580 So if you watch this and you feel compelled to share it with others who have influence,
00:01:15.620 I recommend you do that.
00:01:16.880 But if you're somebody that doesn't want to get the visual of what this gentleman who worked
00:01:23.600 as a moderator at Facebook reveals and talks about, and is uncomfortable at the end himself
00:01:29.440 talking about this, then if there's any video I recommend skipping, this is
00:01:34.480 the one for you to skip.
00:01:35.480 With that being said, I've given you the warning, and I'm letting you know it's going to be uncomfortable.
00:01:40.820 Having said that, if you do stick around to watch it, here's the interview with Sean Spiegel.
00:01:46.880 My guest today is Sean Spiegel, who is a former Facebook moderator, and we're going
00:01:55.300 to get down to the bottom of what is going on at Facebook, and the life of a moderator
00:02:00.440 at Facebook.
00:02:01.440 He was there for about six or seven months.
00:02:03.580 It's going to give you hopefully a lot more perspective on what happens on the back end.
00:02:06.840 With that being said, Sean, thank you so much for being a guest on Valuetainment.
00:02:11.000 Thank you very much for having me today.
00:02:12.920 So Sean, if you don't mind, let's just start off with the basics.
00:02:16.500 For the audience who doesn't know what is a Facebook moderator, what is the job of
00:02:21.580 a Facebook moderator?
00:02:23.860 So the general job of a Facebook moderator is you are put into specific queues, as in specific
00:02:31.320 types of content.
00:02:33.600 And for a Facebook moderator, what you are to do is you're to go through this type of content
00:02:38.400 that's given to you.
00:02:40.640 And you're just supposed to go through the policies, you're supposed to action it.
00:02:45.120 And if there's anything out of the ordinary, you would add some notes to it, and you would
00:02:49.860 move on.
00:02:51.540 So that would be what a general Facebook moderator does.
00:02:54.620 So is there levels to moderators?
00:02:56.620 Or no?
00:02:57.620 Is there like, you know how, level one clearance, level two clearance, level three, is there anything
00:03:02.100 like that?
00:03:03.100 Or no?
00:03:04.100 Every moderator, same level?
00:03:05.100 No, no.
00:03:06.100 In fact, there are different levels.
00:03:08.200 It's not specifically levels like one, two, three.
00:03:10.620 It's actually you're broken down into different types of departments.
00:03:15.580 For most people, you would be in the general queue.
00:03:18.800 That's just people that have the basic knowledge of Facebook, basic area of content.
00:03:23.620 What you're mostly going to see there is just general photos, memes, pictures, nothing that
00:03:30.800 would be out of the ordinary, text messages, instant messages, just everything that would
00:03:36.100 be very benign.
00:03:38.840 For me, as an example, I was actually promoted to the graphic violence and hate speech division.
00:03:45.300 That is a division that actually requires a whole different set of skills.
00:03:49.740 And my title actually went from moderator to social media content analyst as I went through
00:03:55.340 there.
00:03:56.340 Got it.
00:03:57.340 So my title there was social media content analyst.
00:04:00.460 And what I did is I specifically dealt with content that was in the graphic violence.
00:04:06.460 So I dealt with pictures, photos, videos of basically the worst things that you could
00:04:13.120 do to a human or an animal.
00:04:16.220 And I also dealt with a lot of hate speech as well.
00:04:18.780 A lot that had to do in and out of the United States.
00:04:23.180 But to answer your question.
00:04:25.520 The queues are generally regulated depending on what your expertise is, if you have a degree
00:04:31.400 in specific areas you're able to focus on.
00:04:35.340 But the largest queue, the one that most people are in, would be the general queue. And that's where
00:04:39.460 you'll just see the most benign of content.
00:04:41.760 That's where most content goes to.
00:04:43.920 If you have a specialty in something, if you specialized in an area, you would go to some
00:04:49.880 of these different ones.
00:04:51.160 Sexual exploitation, sexual solicitation, drugs and firearms, graphic violence, hate speech.
00:04:58.160 I could go on and on.
00:05:00.200 Got it.
00:05:01.200 So can you unpack some of the stuff you saw?
00:05:05.000 I mean, you know, sometimes I sit there, let's say I'm going through my newsfeed and
00:05:08.700 I'll see a mother or a father beating a six-month-old kid, I'm like, why is this thing
00:05:14.600 here?
00:05:15.600 It's got 52 million views.
00:05:16.600 I'm like, what's the purpose of leaving this on?
00:05:18.800 Or you'll see some bizarre videos that make zero sense for them to be on, but they have
00:05:24.700 those things up there.
00:05:26.420 What did you see?
00:05:27.420 And I can only imagine as a user what that makes me feel when I walk away.
00:05:30.780 What are some of the things you saw?
00:05:33.320 I saw a lot, and the best way to explain it is if you've been to many of these sites, I know
00:05:40.820 some of them are not around anymore, but there were sites such as Best Gore, Kaotic with
00:05:48.160 a K, there was LiveLeak.
00:05:52.340 There were many other sites that I can think of, like Hard Candy.
00:05:56.800 And you would have these videos mainly of just these horrific acts that could range from
00:06:03.300 child pedophilia to putting fireworks in a dog's mouth and then setting it up on fire
00:06:10.220 to fathers making their daughters have sex with pigs and then stabbing the pig while
00:06:18.840 the daughter was having sex with the pig.
00:06:23.320 I saw those types of videos day by day.
00:06:27.040 A lot of these had to deal with bestiality, abuse against animals.
00:06:32.920 I also specialized, during the summer, in the dog eating festival in China.
00:06:39.560 So Facebook allowed these different types of videos and photos up there of people that
00:06:45.760 were cooking, skinning, eating dogs alive.
00:06:49.720 Because Facebook said that due to the cultural differences, it would be the equivalent of
00:06:53.660 us having pictures of cows or chickens.
00:06:56.720 So they allowed people to basically just mutilate dogs on live stream, through video, through photos,
00:07:05.260 for the sake of this dog eating festival that they have in China every summer.
00:07:09.440 So you see that video, you see the dog eating festival in China.
00:07:14.040 What else was there?
00:07:15.040 So, so far it's sexual exploitation, animal, any other, are you seeing also killing, are
00:07:21.440 you seeing like live shooting?
00:07:23.480 What else are some of the things you guys are seeing?
00:07:25.160 There were plenty of live shootings.
00:07:27.160 In fact, a lot of them came from the United States having to do between gang wars, people
00:07:31.680 that would just have their cell phones out in the middle of a shootout.
00:07:35.200 There would also be a lot of content from the Middle East of people that were stoning pedophiles
00:07:40.980 to death or people that were basically just beating their kids until they were black and
00:07:45.740 blue on the face and their eyes were popping out of their head.
00:07:48.400 A lot of the content that had to do with the Middle East usually dealt with beating
00:07:53.480 women and children or stoning people that they perceived as doing a wrongdoing.
00:07:58.800 There was, in fact, this one video that I was working with content wise, and it was a person
00:08:04.660 putting a gun in another man's mouth and then pulling the trigger, and you could visibly
00:08:10.680 see the innards going through the back of his skull, and I actioned it to delete it.
00:08:17.520 And there was another person that, like, checks the moderators, checks the checkers, so to
00:08:22.960 speak, and this person made the claim that what we saw was not actually the visible innards
00:08:29.860 going out, but bullet fragments, and I made the argument that it's clearly white.
00:08:35.320 These are skull fragments coming out of the back of their head.
00:08:37.700 They're not silver.
00:08:38.440 They're not bullet fragments, and this person actually said that I was wrong, and that video
00:08:45.460 was allowed to stay on there because of that.
00:08:48.640 So a lot more of the content had to deal with sexual exploitation.
00:08:54.900 A lot of it dealt with children that were online, and they were basically being groomed
00:08:59.700 by pedophiles to come meet them in person.
00:09:04.560 There was also an incident with organ harvesting videos.
00:09:07.980 So a lot of these children in other countries were basically being ripped apart alive, and
00:09:16.420 they were having their organs taken out while they were screaming, and I remember how bad
00:09:20.800 it was that day because everybody had to come into the main area and calm everybody down.
00:09:26.060 There were, like, people that were throwing up on the floor, like, people that were just screaming
00:09:31.340 and crying.
00:09:32.200 Wait, these are your coworkers who are throwing up on the floor?
00:09:34.340 Yes, and I remember one of our team leads was telling us how we were going to be getting
00:09:42.340 more graphic content, specifically because Facebook couldn't keep up with the amount that
00:09:47.860 they were getting.
00:09:48.960 So they were telling us that, like, the floodgates were opening.
00:09:53.240 So we were just going to be dealing with graphic content all the time in my department.
00:09:57.740 So I got two questions.
00:09:59.140 Let me go to the one first.
00:10:00.680 First, you know, when cops are known for having a very high divorce rate because the life of
00:10:10.440 a cop is not the most glamorous life.
00:10:12.760 You know, all we see is, you know, what they're doing.
00:10:14.120 We see it on TV, good cops, bad cops, all this stuff, but they have a very high divorce rate.
00:10:18.160 The life sucks from what many of my friends talk to me about.
00:10:21.400 A person who's a divorce attorney, you're constantly
00:10:28.980 talking about divorces.
00:10:30.040 And if you're married, you're going to come home and all you're thinking about is you're
00:10:32.340 probably next because your entire life is about divorces.
00:10:34.540 I was being recruited to be Special Forces, an 18 Delta with 5th Group at Fort Campbell,
00:10:39.700 Kentucky.
00:10:40.020 And they asked me to go interview with other 18 Deltas.
00:10:43.100 And the guy was telling me, he says, you ever planning on getting married and falling in love?
00:10:46.180 I said, of course.
00:10:47.220 He says, don't ever become special forces because you can't.
00:10:49.980 You're psychologically going to be off.
00:10:51.540 I'm like, I'm psych, he says, you're psychologically going to be off.
00:10:53.820 That's going to mess with you.
00:10:54.880 That was one interview.
00:10:55.740 Then I went to the next guy, then the next guy, then the next guy.
00:10:58.840 Some jobs you take, they're going to mess with you.
00:11:02.100 But this is different.
00:11:03.800 There's a difference between you watching, you know, certain videos and certain articles versus
00:11:09.060 seeing this.
00:11:09.860 How are you guys psychologically affected by this when you're doing this eight hours a day and
00:11:14.160 you're going home?
00:11:14.800 How are you handling that?
00:11:15.740 How is it affecting you?
00:11:17.620 I was actually doing this 10 hours a day, specifically because they needed extra hands
00:11:22.600 on deck for the extra two hours to do this.
00:11:25.440 So I was there always a bit longer than most other people were, specifically because of the
00:11:30.760 department I worked in.
00:11:32.880 Just talking about the mental health of the people that I worked with, it was not good within
00:11:39.500 the building.
00:11:40.060 So there were people that were smearing feces on the bathroom walls.
00:11:45.920 There were people that were defecating in the sinks and the urinals.
00:11:50.760 There were people that would just be like smoking, drinking in the middle of the day in their cars
00:11:59.360 during their break period.
00:12:00.500 There was always talk about people using the break room for sexual activities.
00:12:05.860 There was always people that were bringing in all sorts of illegal substances to take while they were doing this stuff.
00:12:14.180 Just the mental health of my co-workers was not well because of all of this.
00:12:20.340 Specifically going through, we were supposed to have a counselor there, some mental health therapist to help.
00:12:28.500 I was only able to see this person once, and it wasn't even the actual one that worked there.
00:12:34.900 It was someone that was just filling in.
00:12:36.940 And I remember he flat out told me, I don't know how I can help you guys.
00:12:41.520 Because he did not know what to do.
00:12:44.980 He did not see the type of content we were seeing.
00:12:48.100 And he just flat out said, I'm filling in, but I don't know how to help you guys.
00:12:53.920 Because all of this was new to him as well.
00:12:56.440 And I never met the actual person that was supposed to be on board with us every day.
00:13:00.620 And a lot of my co-workers I spoke with never met this person either.
00:13:07.680 I'm not even sure if this person really existed, or maybe this person was only there like a few hours, a few days.
00:13:14.140 So I have no real confirmation on this person ever actually helping somebody through a tough time they've had there.
00:13:21.600 Sean, are you guys talking to each other?
00:13:27.360 Are you guys on break?
00:13:29.620 You're like, dude, you won't even believe what I just saw today.
00:13:32.000 Is it constant, hearing about that? Like visually, is this a location near Facebook headquarters?
00:13:39.380 Are you offsite?
00:13:40.600 Give me a visual of what this looks like and the relationship amongst each other.
00:13:46.100 So give me a little bit of the optics there.
00:13:48.560 All right.
00:13:49.000 So this was not, so this was third party.
00:13:53.440 It was not Facebook.
00:13:54.440 It was a third party vendor named Cognizant Technology Solutions.
00:13:59.500 And so Cognizant had a deal with Facebook that they were going to take a part of their content and they were going to work with it offsite.
00:14:09.580 So they have their own sites.
00:14:11.580 And we did have some people from Facebook stop by occasionally to see how everything was going.
00:14:17.100 I remember I saw one of Mark Zuckerberg's personal lawyers there once.
00:14:22.920 But for the most part, while we had the Facebook tech to do all the work and we had all of Facebook's content, it was managed directly by Cognizant.
00:14:33.580 And so going forward with that, it was in Tampa, Florida, and it was in this place called Woodland Center.
00:14:42.880 So it was just this kind of giant parking lot, kind of just in the middle of nowhere.
00:14:48.020 And it was just this small building that had it there.
00:14:52.940 And you walk into the building.
00:14:55.360 They have a security guard in front there at all times.
00:14:59.760 You're not allowed to bring any pen, paper, pencils, anything you could record or like take any notes on.
00:15:07.400 Not even your phone.
00:15:08.100 So you can't have a phone on you while you're working?
00:15:09.660 No, you cannot have your phone on you.
00:15:11.700 If they see you with your phone, they just immediately fire you.
00:15:14.500 And I'm dead serious.
00:15:16.780 They just can you if they see the phone because the way that they describe it is all this is other people's sensitive content.
00:15:23.060 So they don't want anybody recording, taking notes, anything, which I understood for like the phones.
00:15:28.800 But for like pencil, paper, they didn't even allow that in there.
00:15:32.720 So you go in there and basically you have these two incredibly small bathrooms that only have about two toilets, two urinals in the male bathroom.
00:15:42.620 And I believe just two toilets in the girl's bathroom.
00:15:46.480 And this place housed about a thousand people.
00:15:49.540 So it was terrible conditions for the bathroom.
00:15:54.740 And so you have these two small bathrooms.
00:15:57.340 Then you have this kitchen area.
00:15:59.400 And it was just a very small cafeteria.
00:16:03.180 It was just very banal.
00:16:05.180 There was nothing really there.
00:16:06.540 They did set up this little store there and every week they would give you $20 on a card and you could go eat whatever you wanted to as long as you didn't go over your $20 at this little store.
00:16:20.320 But for some reason, all of the food there was junk food.
00:16:26.480 And sometimes they would have like fruit cups, salads, but all the people that were higher up got all those.
00:16:32.320 So it's like the team leaders, the trainers, they were able to get the good food.
00:16:36.780 And so everybody else was really just eating junk food all day.
00:16:39.260 And what really kind of hurt me a lot is I would see this graphic content and what my people would tell me is like, hey, go grab something there.
00:16:49.380 Go cheer yourself up.
00:16:50.640 So I grabbed like, you know, a chocolate milk, some donuts.
00:16:53.280 And I put on a lot of weight while working there.
00:16:56.180 I actually weighed over 300 pounds when I left.
00:16:59.920 I have lost about 41 pounds now since then.
00:17:05.560 And I put on some good muscle.
00:17:07.660 I'm very happy about that.
00:17:09.320 But it was a very unhealthy lifestyle, just not for me, but for everybody that was working there.
00:17:15.500 So after that, you just have these two main wings and they are just filled with computers.
00:17:24.620 There's no cubicles, no nothing.
00:17:26.340 They say that they wanted it all open, but it was just nothing but computers.
00:17:33.200 And there was nothing else there for us.
00:17:36.200 They had this one tiny room in the back corner, what was supposed to be like a calm down, relaxing room where it's like, if you need to calm down after seeing this content, you could go in there.
00:17:47.200 They had like Legos.
00:17:48.560 They had like a little billiard board and all that.
00:17:51.640 But no one was ever allowed in there because they would like time you obsessively if you ever got up from your seat.
00:17:58.700 So it would be like if you go to the restroom, they would literally put a timer on your computer.
00:18:04.620 And if it goes over the amount of time, you're going to get yelled at by the boss for going over the time.
00:18:10.340 If you're on a break and you're like a minute late from your break, the computer alarm will go off and it's immediately going to tell your supervisor that you're not there.
00:18:19.540 And even if you're just like right around the corner, you're going to get another warning, you're going to get, you know, docked pay.
00:18:26.600 So they were very like obsessive with you being at your computer at all times.
00:18:32.020 In fact, what was just really ridiculous with it is how there were people there that wanted to just get up and walk after spending like four hours doing nothing at the computer.
00:18:46.220 And like the bosses just would not have it.
00:18:49.340 They just wanted to have a body in the seat.
00:18:51.920 They didn't really care how that body was doing.
00:18:54.540 They just wanted the person in the seat.
00:18:56.320 And that was it.
00:18:57.420 What were they paying you guys?
00:18:58.520 So they were paying us about $30,000.
00:19:04.200 It was originally $28,000 working in the graphic violence division.
00:19:08.340 Per month?
00:19:09.200 Per month?
00:19:10.580 I wish. It was per year.
00:19:13.060 They're paying you $30,000 a year?
00:19:15.200 Yes.
00:19:16.260 To watch all, okay.
00:19:18.300 Okay, they're paying you $30,000 a year.
00:19:20.180 And that's like $15 an hour is what they're paying you.
00:19:22.880 Okay.
00:19:24.660 So these are two separate companies.
00:19:26.940 Now, let me ask you, is the liability, is the liability on Facebook or is the liability on Cognizant Technology Solutions?
00:19:36.680 And how was Facebook holding these guys accountable for doing their job right?
00:19:42.200 So that's an easy question to answer.
00:19:44.820 Facebook is not holding them accountable.
00:19:46.460 In fact, it was Cognizant that was the one that backed out of the deal with Facebook about being a vendor for their content.
00:19:54.240 Facebook was not holding them accountable at all.
00:19:57.360 Cognizant was able to do whatever they wanted without really any interference as long as a certain amount of content was being looked at every single day.
00:20:06.000 That's all that they cared about.
00:20:07.060 So they really didn't even care much about the accuracy ratings.
00:20:10.620 They just wanted the number to go up to say, we looked at this much content.
00:20:14.740 That was really what they were focusing on.
00:20:17.460 What's successful?
00:20:18.200 What percentage is successful to them?
00:20:19.880 So a percentage that was successful to them is they wanted each and every employee to do about 500 to 1,000 pieces of content every day.
00:20:30.700 And I was usually doing about 100 to 200 pieces of content on a day where there was not much going on.
00:20:38.520 Because going through this content, when you're in a specialized area, you have to not only just action it, you have to go through the Internet, find out the source of it.
00:20:49.320 If you can find it, verify if it's real or not, make sure you have the links, make sure you say how you found it, and then write up a report on it.
00:20:58.740 So a great example would be like, I'm finding a lot of these terrible bestiality videos.
00:21:05.600 And I was able to find a bestiality website, and I was able to find those same videos from there.
00:21:10.400 So I was able to find the source of it there.
00:21:12.440 So then I would write up a report saying that this was not an original Facebook piece of content.
00:21:16.920 This was something that was taken from another website.
00:21:19.720 And then I would write up my report saying why it would be taken down, what the actions would be, what it breaks in the policy.
00:21:27.400 So that does take up a good amount of time, especially when you're basically going through the whole Internet looking for a piece of content like that.
00:21:34.540 But there were a lot of times when my trainers would tell me, go slow, do it, you're doing great.
00:21:41.540 That's why we promoted you.
00:21:42.960 And then I would have the team leader telling me, no, you need more content, more content, more content done.
00:21:47.720 And so I was getting two different perspectives on what I needed to do there.
00:21:54.180 There was one that was telling me I needed a bigger number of content.
00:21:57.700 And then I had another group of people telling me that I needed to just focus on the accuracy of what I was doing because these were real people, real animals in these situations.
00:22:08.540 It was really pretty much the ethical side versus the business side.
00:22:12.480 I'm looking at this Tampa Bay Times article that says Facebook agrees to pay $52 million settlement with content moderators who suffer trauma on the job.
00:22:20.800 Former and current moderators in Florida and across the country will receive $1,000 each and may be eligible for more money to cover medical treatments and damages, et cetera, et cetera.
00:22:30.240 Is this, did you see this actually become a reality?
00:22:33.620 Like did you guys see payment from Facebook on this $52 million?
00:22:37.000 No.
00:22:37.480 I believe it's still in court right now, still actually being worked on.
00:22:42.480 Casey Newton from The Verge told me that he wanted me to be a part of it.
00:22:48.260 And so I decided I will be.
00:22:50.420 I originally didn't want to because I didn't want anybody to come across thinking that I'm just doing, like coming out and saying all this stuff for money.
00:22:57.660 Like there's a message I wanted to bring about this.
00:23:00.720 But I decided that it probably would be best, especially after everything that's happened to me.
00:23:05.520 And then there's another article that says a total of 556 employees will be laid off early next year from a controversial facility near Carrollwood that monitors Facebook for banned content such as hate speech, bullying, threats, and videos of violence against animals and children.
00:23:21.480 Cognizant Technology Solutions, a contractor for Facebook, plans to close its operations at 7725 Woodland Center Boulevard, about two miles north of Tampa International Airport.
00:23:32.340 Is that the facility you were at?
00:23:33.700 That is the facility, yes, sir.
00:23:35.260 It's crazy.
00:23:35.520 So you remember when this story came out, because this is two years ago, a year and a half ago, give or take.
00:23:41.580 I do remember that story coming out.
00:23:45.720 So you said Cognizant Technology Solutions dropped Facebook.
00:23:52.360 Yes.
00:23:52.700 Why did they drop Facebook if it's an account that's paying you good money?
00:23:57.000 I believe it was just the bad amount of publicity that they were getting.
00:24:00.100 Last I heard about it is their stocks were starting to drop, and there was a lot of negative news going on about how they were treating everybody by being a vendor for Facebook.
00:24:10.240 So they decided just to drop the account altogether.
00:24:14.080 Is this the same Cognizant company that's a publicly traded company doing like $16 billion a year, like their revenue was, yeah, $16.65 billion in 2020?
00:24:24.140 Is this the company?
00:24:25.640 That's the company, yes.
00:24:27.440 Got it.
00:24:27.880 Interesting.
00:24:28.200 So I'm sure they don't want that kind of publicity to be tied to that.
00:24:31.480 No.
00:24:31.880 Is that their business model, though?
00:24:33.800 Like, do they do this also for Twitter, YouTube, Facebook, LinkedIn, other sites as well?
00:24:38.440 Or was this like a singular project that they took with Facebook?
00:24:42.900 From what I was specifically told, it was not in Florida, but it was in Texas.
00:24:49.120 There was another branch of Cognizant that was doing the same thing we were doing, but instead of Facebook, Instagram, it was Twitter.
00:24:56.460 So they were doing this for Twitter as well?
00:24:58.200 Okay.
00:24:58.600 Right.
00:24:59.260 Right.
00:24:59.600 Because it looks like I'm looking at the reputation.
00:25:01.560 This company doesn't have a bad reputation.
00:25:03.320 It just seems like this one account they were taking.
00:25:05.260 I don't know.
00:25:05.660 Maybe there is.
00:25:06.320 It looks like a reasonable company that does good things, and they decided to cut this relationship.
00:25:12.640 So let me go a little deeper with this.
00:25:14.940 So, one, you explained what it was like for you guys.
00:25:18.060 I can only imagine how challenging it could be.
00:25:21.640 And you were explaining the feces on the bathroom walls and, you know, in the sinks where you wash your hands, and you said sex in the bathroom.
00:25:31.820 What's that all about?
00:25:32.980 What has that got to do with anything?
00:25:34.440 Is it who they were hiring?
00:25:36.040 Were they pretty much hiring anybody and everybody off the street, or was there a requirement to meet to have the job to work here?
00:25:42.360 So, there was an initial test that you take for the interview where it goes through how well you're familiar with Facebook and Instagram.
00:25:52.960 And then after that, they show you a book, and they say, this is going to be some of the content you might see.
00:25:59.980 And they say, you might see it.
00:26:02.120 And the content was very benign compared to what was actually there.
00:26:06.500 They showed hentai, which is like the animation pornography drawings.
00:26:12.200 So, they showed some of that.
00:26:14.100 They showed some memes, like, that had to deal with, like, if you had, like, pineapple on your pizza, you would die.
00:26:22.060 Or it was just the incredibly benign things like that.
00:26:24.840 There was nothing that was, like, overly bad.
00:26:27.160 The biggest thing that they showed in that book, and it was a picture that CNN took of this little boy, and I believe he was taken from the Middle East on a plane, and he was all bloodied up, and he was all dusty, and they showed us that.
00:26:42.180 They said, this might be the worst it gets.
00:26:44.840 And I was like...
00:26:45.720 I remember that picture, by the way.
00:26:47.720 Right.
00:26:48.140 And so, that was as far as they showed us, saying, like, this is going to be the type of content you'll see.
00:26:53.600 So, of course, I'm thinking, if that's the worst, I can do this, right?
00:26:58.840 In retrospect, I do believe that they were just hiring bodies to fill the seats.
00:27:04.520 I do not think that they were hiring the best people.
00:27:08.860 I think they seemed like they were fortunate that they had someone like me that had a college degree, that had a background in going into public records, that was able to do the stuff they wanted me to do.
00:27:20.720 Because a lot of my job at the Graphic Violence Division was to go through content, and if it was in the United States, I would go through the accounts, I would go through their LinkedIn, their Facebook, Twitter, anything, any sort of public accounts.
00:27:37.960 I would go look at any information, look at their property records, housing records, go through any sort of utility bills that I could find, and I would basically write a portfolio of them.
00:27:47.780 And I did this a lot in college when I took public affairs reporting, so I was very familiar how to, like, find public records, go through these areas and do this.
00:27:56.760 So that was one reason why they wanted me on the Graphic Violence Division, was because I had a good bit of knowledge on how to basically find out how they put it, find the bad spot on the apple.
00:28:09.360 So you start off at the bottom, in the general queue, which is memes, stuff like that.
00:28:17.360 You get a promotion, you become a social media content analyst, and you start seeing stuff that just makes no sense.
00:28:25.600 Walk me through the hierarchy structure.
00:28:27.580 Who's after you?
00:28:28.500 Who's after that person?
00:28:29.640 Who's after that person?
00:28:30.960 What does that tree look like?
00:28:32.260 So it's not really like a tree.
00:28:35.860 It's more just like the general queue is at the bottom, and what you specialize in would just be all a big circle that is above the general queue.
00:28:46.240 So it's not so much that there was anybody like another department that was above me.
00:28:53.080 It was very much just different departments that focused on different types of content at that point.
00:28:58.400 And those departments were very small.
00:29:02.760 So in my department, it was me and seven other people.
00:29:06.900 And we were the whole Graphic Violence and Hate Speech Division.
00:29:11.500 They had one that was for bullying that had just 10 people on it.
00:29:15.900 They had one for drugs and firearms.
00:29:19.580 And that one just had about five people on it.
00:29:22.460 So it wasn't like a lot of people were being put into these different areas.
00:29:27.380 The vast majority were just put in general queue.
00:29:31.060 And if you showed any sort of promise, they would put you on a different desk.
00:29:35.060 Got it.
00:29:35.740 Got it.
00:29:36.340 So you said something earlier.
00:29:37.800 You said when you saw this video with the guy shooting the guy's brains out, and you said it was obviously white.
00:29:45.680 And they're like, no, that's not.
00:29:46.740 That's the gun.
00:29:47.440 It's not real.
00:29:48.140 And it was left to stay on.
00:29:51.040 Who allowed it to?
00:29:52.200 Who was approving it above you?
00:29:54.100 Was it a Facebook person, or was it a Cognizant Technology Solution person?
00:29:58.160 It was a Facebook person.
00:29:59.560 Oh, that's a Facebook person.
00:30:01.260 So it goes through you.
00:30:03.360 You kind of mark it up.
00:30:04.860 And then it leaves Cognizant Technology Solution to Facebook.
00:30:08.320 And a Facebook person dictates whether they're going to leave that alone or not.
00:30:12.100 Correct.
00:30:17.100 So, okay, so when that happens, it's like that video that took place where they left it on.
00:30:23.000 Did they leave it on in specific countries, or did they leave it on across the board, anywhere, America, all of that?
00:30:29.320 Or did they filter America out because that's the one place they didn't want to leave on?
00:30:33.000 They left it on everywhere, all countries.
00:30:36.800 And the specific reason for that is because Facebook has hyper-specific types of policies.
00:30:43.700 And for graphic violence, what they say is videos that are not in a medical setting, and you can see visible innards, are not allowed.
00:30:53.980 And so according to this, this was a video, this was not in a medical setting, but they claim that the skull fragments and the bit of brain matter that came out was not visible innards.
00:31:08.720 So they said it could stay on, and so they allowed it.
00:31:12.620 That was their reasoning behind it.
00:31:16.240 Okay.
00:31:18.000 What other videos that obviously made no sense to stay on, they would leave it on?
00:31:22.640 What other videos did you see where you're like, okay, there's, like, for example, the pig video you talked about, did that one get taken down?
00:31:29.000 So that one did get taken down, but it got taken down specifically because that was not one that was in the general area of where we get our content from.
00:31:40.600 That one was in a private Facebook page.
00:31:43.420 And this was what I really specialized in, was that there were a lot of private pages.
00:31:49.280 And they would be named something like, I love dogs, or let's go bulldogs, or anything like that.
00:31:56.400 And so you would just see it, it would be like a very general name, you wouldn't think anything bad about it.
00:32:01.760 And it was a private only page.
00:32:03.300 So you can only go to that page if you were invited.
00:32:06.120 So if you were actually invited to that page, you then realize that it was actually just a cesspool of specialized content.
00:32:14.500 That was horrific and disgusting.
00:32:17.820 And what made it the worst is that these videos and these private pages, they were homemade.
00:32:24.380 They were not from other websites.
00:32:26.360 They were people that were making these videos.
00:32:29.720 And they were inviting people into the page to buy and auction the videos.
00:32:35.400 So what the hardest part about that was, is that they were using the Facebook payment system.
00:32:43.540 They were using other means of getting currency.
00:32:47.120 And they would have a horrific video, such as the video of the pig and the girl.
00:32:53.620 And people would actually bet an auction on how much money so they could get the full video.
00:33:00.900 And then that person that made the original video would also take requests from others and make specialized content tailored to them.
00:33:12.240 So that's, that's what I dealt with.
00:33:15.180 And a lot of these, this type of content that was homemade by the parents, by the children, by the people that were making money off of it.
00:33:24.460 This was stuff that was just in the private area of these private pages.
00:33:29.740 And Facebook would not allow me to do anything about it.
00:33:33.140 Because even if I wrote the report, even if I could prove that they were in America, even if I had their driver's license, their face, their license plate, their home address,
00:33:43.220 they said that it would look bad for them.
00:33:48.440 And they said that an internal Facebook team would look at it.
00:33:52.040 And yet that same content was still on there.
00:33:56.920 So are you saying if they really wanted to catch the bad guy, they could, but because they could figure out where it's coming from?
00:34:02.320 Is that kind of what you're saying?
00:34:04.060 Yes.
00:34:04.660 And I figured out where a lot of the content was coming from.
00:34:08.300 So did my coworkers that worked with me on the desk.
00:34:10.260 So it wasn't hard for you to figure that out?
00:34:12.280 It was not because people put their names on their content.
00:34:15.940 They will actually like, like make it a brand.
00:34:18.840 They will be stupid enough to show their house, their home address.
00:34:22.880 They are stupid enough to have their car license plate show up in the video.
00:34:27.820 They are dumb enough to say their first and last name.
00:34:30.940 Even though this is a homemade video, you can easily send this to the FBI.
00:34:34.660 But Facebook didn't want to do that.
00:34:37.080 Why?
00:34:37.460 Is it because they don't want the users to feel uncomfortable that the information is being shared with a third party?
00:34:44.380 Is that kind of what their alibi was?
00:34:46.720 No, in fact, their alibi was that they were saying that this content going public would be bad for Facebook's optics.
00:34:55.100 So they would just have an internal investigation through it, through Facebook's own team.
00:35:00.520 And obviously that went nowhere when you would see the same profiles, the same pictures, making the same content.
00:35:08.140 Back the next day, nothing ever happened to them.
00:35:11.160 And they're making new content to go with it.
00:35:13.700 Okay, so let's go a little bit deeper with this.
00:35:20.380 So, simple.
00:35:21.840 Why, David, why do they want this stuff to stay on?
00:35:27.620 Like, what's the purpose of staying on?
00:35:28.980 So let me kind of give you my skeptical side and you tell me if I'm on or if I'm off.
00:35:33.380 So on one end, I sit there and I give them the benefit of the doubt.
00:35:37.080 They have so much content that's coming in, they just can't track all of it.
00:35:40.800 It's like they can't even track all of it for the billions of postings that's taking place.
00:35:45.960 So eventually they're just kind of like, you know what, man, we can't get to everything.
00:35:50.260 So you know the whole 98% ratio or the 80% ratio, whatever they got.
00:35:53.640 Okay, so that's that.
00:35:54.340 The next part, they put it in there and they don't take it down because they don't think fully it's inappropriate
00:36:00.740 and they have an excuse or reason to say this makes sense to stay up.
00:36:04.180 Or the next one is it still drives so much traffic and eyeballs that indirectly it's helping the algorithms
00:36:11.240 to help them stay number two website in the world and that helps with business.
00:36:15.940 I don't know.
00:36:16.480 I'm trying to figure out what the reason would be to leave this stuff up.
00:36:19.160 The reason that I can say for leaving the stuff up is that Facebook in general has been staying stagnant
00:36:26.000 in the amount of followers and the amount of people that sign up for it.
00:36:29.520 And it's also been declining in the amount of new users that have been going on.
00:36:33.920 So I think maybe part one would be Facebook is incredibly desperate to keep its own demographic
00:36:40.640 that it has no matter how obscene and disgusting and perverted it is.
00:36:44.220 Number two, I think Facebook would rather very much just put all this under the rug, not even deal
00:36:52.500 with it, not do anything with it, just say we're going to do it, categorize it, file it,
00:36:58.700 but never actually act on it.
00:37:01.340 And I completely believe that Facebook is too much of a coward to actually go after these people
00:37:08.520 that are posting this content on their own website.
00:37:11.820 In fact, I think Facebook is more concerned with hiding it as best as they can and just
00:37:18.660 pretend it never happened.
00:37:22.500 Okay.
00:37:23.100 So we didn't see it.
00:37:24.280 Oh, we didn't see it.
00:37:25.660 But you get a million people.
00:37:27.120 You know how you go online and you see a video and you say report, report, report, report,
00:37:32.540 report.
00:37:33.440 Does that go to an organization like the one you guys were a part of?
00:37:36.520 Like what does a report action do to content that one sees?
00:37:42.440 I'm sure plenty of videos that you see that's like, this is just not appropriate to be on
00:37:46.360 here.
00:37:47.080 When one clicks report, where does that go?
00:37:49.860 So normally when you click report, it would go into the general queue.
00:37:53.440 Once it's in the general queue, it will either be categorized based on what you see in it,
00:37:58.620 and then it would be taken off into the different other queues.
00:38:01.340 So that's basically how it works for the most part, just with that.
00:38:07.200 The report button, in my opinion, only really works for the general queue because when you
00:38:13.500 deal with these higher up, more dangerous things, types of content that you're dealing
00:38:18.620 with, the report button really becomes very banal in a sense that it's just not useful.
00:38:26.120 And the main part of what's happening is it's really more of an AI that's then scanning things
00:38:32.060 to see if it looks similar to what you're looking at, and then it will send it forward.
00:38:37.080 But the report button is really just there, if I had to be honest, for things that are
00:38:42.100 such as like bullying or like something that you just don't agree with, anything that's
00:38:46.580 speech-wise.
00:38:47.400 The way I got the vast majority of my content was specifically through digging through these
00:38:54.000 private pages and what the AI was sending me because I was working with an AI that was
00:39:00.380 trying to help identify graphic content.
00:39:03.760 So a lot of what I got was not based on the report button because the general public was
00:39:11.420 not seeing what I was seeing because these were all private pages.
00:39:14.740 So these are not things that would go viral.
00:39:18.040 And the only time the FBI, the police, or like Facebook really got involved is when one
00:39:23.260 of those videos broke out of those private pages, and then it got mainstream.
00:39:28.380 There was an instance, I remember specifically, of these two girls, and they were babysitting
00:39:35.480 these kids, and they were forcing the toddlers and babies to smoke marijuana.
00:39:39.160 And that one went viral.
00:39:42.080 And I remember Facebook came in with their team.
00:39:45.200 We actually had the local police from that area work with us, and we were able to identify
00:39:50.500 them and apprehend those two.
00:39:53.180 So that would be one of those rare instances where we actually did do something, but it
00:39:59.140 only really happened because it went mainstream, because it went viral.
00:40:03.460 And a lot of the content I dealt with just never became viral because it was in these isolated
00:40:09.720 pockets.
00:40:11.620 Just to go on a little bit more about that, with a lot of this content, especially the graphic
00:40:20.660 violence stuff, Facebook had this incredibly weird policy where a lot of abuse, whether
00:40:27.800 towards children, animals, adults, they would leave it on there, and they would tell us they
00:40:34.960 would leave it on there because they would say that either a good Samaritan or a police
00:40:39.400 officer would see that content, and they would give a lead to who that person was.
00:40:46.120 And so their idea was, we're just going to leave it on there.
00:40:50.120 So hopefully a good Samaritan will give us some info, and then they can get justice to them.
00:40:55.840 What do you think about that?
00:40:56.620 It's a lie, because the police don't have our systems.
00:41:01.220 The police aren't looking at what we're looking at.
00:41:03.520 They don't have the software that Facebook has to actually look at the type of content
00:41:08.420 that we're looking at.
00:41:09.640 And I wish I realized that when I was in training instead of believing it and thinking that I
00:41:15.420 was making a change and trying to help these people and animals, even if they're already
00:41:23.320 dead, as long as their final moments aren't being desecrated in these videos and pictures.
00:41:29.200 And I really did believe that somehow good Samaritans or police were actually going to help see
00:41:37.520 this content, report on it, say who it was in there.
00:41:41.100 But those people aren't seeing it.
00:41:44.580 Only I'm seeing it.
00:41:46.240 So let me ask you, so sometimes when you run a big company, you don't really know what's
00:41:55.160 going on everywhere, right?
00:41:56.740 I mean, so, you know, Facebook's got, give or take, 60,000 employees.
00:42:02.220 That's a pretty big organization they got.
00:42:04.260 Some sites say they get 1.87 billion unique visitors per month.
00:42:07.980 And some sites say they get 2.5 to 2.8 billion unique visitors per month, users per month.
00:42:14.180 Okay, whatever the number is, let's say 2 billion.
00:42:15.880 That's a lot of people that come to their website on a monthly basis.
00:42:18.800 So you've got 60,000 plus employees.
00:42:21.260 You've got 2 billion users actively that are logging on to your website.
00:42:24.640 You're the largest country in the world, essentially.
00:42:27.380 China's got 1.5 billion.
00:42:28.600 India's got 1.4, 1.5 billion.
00:42:30.720 You've got 2 billion.
00:42:31.740 You're a country.
00:42:32.660 You're a virtual government.
00:42:33.840 Do you think this stuff is stuff that maybe it's so far away from Zuck's hands that he
00:42:40.920 doesn't even know this stuff is taking place?
00:42:44.280 No.
00:42:45.240 He knows what's taking place because his hands are all in it.
00:42:48.640 Like, this is his creation.
00:42:50.860 And as we've seen through his multiple appearances with Congress and the way he speaks in any sort
00:42:56.420 of interviews, he obviously presents himself as someone that has his hands in all the pies,
00:43:04.140 so to speak.
00:43:05.900 What I think is the problem is, one, Facebook has grown too big and needs to be cut down
00:43:13.760 into sizable chunks.
00:43:15.040 Maybe they need to be operated by different individuals that are not affiliated with him.
00:43:19.460 But the biggest problem I had was just the reallocation of resources was terrible.
00:43:25.200 There was, so I talked a little bit about this AI that we were using that was trying
00:43:29.700 to help with graphic content.
00:43:31.480 What the AI was primarily being used for was looking at sexually suggestive photos, not
00:43:38.700 even photos that were sexual, not even photos that showed anything.
00:43:43.220 It wasn't any pornographic images.
00:43:45.400 So, examples, like if there's a picture of a girl and she has her butt sticking out in
00:43:51.640 the photo, she's wearing all clothes, the AI needed to know if that was a butt-focused
00:43:57.440 picture or not.
00:43:58.560 Or a lot of the times we would just look at the pictures and we would just identify if
00:44:04.220 something was cleavage, as in the indentation between the breasts.
00:44:08.200 That was cleavage.
00:44:09.640 There was another one of identifying women in swimsuits.
00:44:13.540 There was another one that identified nipple or areola.
00:44:20.380 So, we actually had to train the AI to look at something and see if they could just identify
00:44:24.860 the areola around the nipple.
00:44:27.420 So, this is what a lot of the AI resources were going to.
00:44:31.240 Not graphic violence, not child pornography, not any of these terrible things.
00:44:37.720 It was going towards these sexually suggestive photos and videos and it was just a terrible
00:44:46.420 use of the AI.
00:44:47.720 There was absolutely no need to teach an AI how to categorize that content because none
00:44:52.860 of that content was breaking any policy.
00:44:54.760 Because even though we were working with it, we were never actioning that content because
00:45:00.740 there was nothing ever wrong with the content.
00:45:02.820 For some reason, they just wanted that categorized.
00:45:06.080 It was ridiculous that they decided to put the AI to use that way.
00:45:09.440 So, let me go a different direction here.
00:45:12.680 So, throw Alex Jones in there.
00:45:16.260 Throw whoever, any of these other guys that are out there.
00:45:22.300 Throw Trump in there.
00:45:23.740 Throw some of these guys that are banned from Facebook and they're not on Facebook.
00:45:28.180 First, they took Alex Jones' company down.
00:45:30.180 They took him down, right?
00:45:31.220 How different do they treat threats of people that share ideas that they may not agree with?
00:45:38.740 Whether as outlandish of an idea as it is as communism or, you know, anarchy or whatever
00:45:44.360 it is.
00:45:44.740 As outlandish as it may be, how do they process that versus the child porn and all the other
00:45:52.580 stuff that they leave on there?
00:45:53.720 In their mind, why do they think this is more threatening than the child porn content?
00:45:59.220 So, they deal with the politics more than any of the other content I discussed about, specifically
00:46:07.420 because politics are a major part of U.S. life and Facebook takes place in the United
00:46:14.320 States.
00:46:15.060 So, Facebook feels like it has this priority to take a look at politics over anything else
00:46:21.440 first.
00:46:21.880 And when I worked there, there were some instances with Justice Kavanaugh when he was having his
00:46:30.120 confirmation and then there was Dr. Ford who claimed sexual molestation at a younger age.
00:46:38.460 So, that was one of those rare times that all of Facebook, all of the FBI got really involved
00:46:44.120 because throughout that entire hearing process, there were people that were on the Trump side
00:46:49.720 that were saying that they wanted to put, like, pipe bombs underneath Dr. Ford's car and then
00:46:55.860 there was this other side that they wanted to ram cars into Justice Kavanaugh if they saw
00:47:02.820 him and they were each, like, creating their own little terrorist cell deciding on who to
00:47:10.200 kill, who to take out for their political ambition.
00:47:13.040 And I guess if I just had to go more straightforward with it, while the stuff going on is horrendous
00:47:20.440 and terrible and it should be given absolute priority, a lot of the content I dealt with
00:47:25.520 should be given that priority.
00:47:27.840 Unfortunately, it seems like the politics and the extremism on both sides, and yes, I'm going
00:47:33.500 to say both sides because what I witnessed on Facebook, the people on the left side were
00:47:39.180 just as crazy as the people on the right side, a lot of that stuff has been bleeding into
00:47:45.920 real world life, and it hasn't just been staying on Facebook or on the internet, and it's been
00:47:53.220 actually being brought into real life actions and consequences.
00:47:57.720 And so, I think Facebook has maybe thought that that's a bigger deal of having these things
00:48:03.320 come, these things be more straightforward, especially when a lot of these people are
00:48:10.200 frequent users on their sites, and when they get public headlines, that looks bad for them.
00:48:15.900 So, you said on the right side, you said the bombs, Ford, all that, and you said it's crazy
00:48:21.820 on the left side and the right side.
00:48:23.520 What crazy things did you see on the left side?
00:48:25.800 So, for the left side, there was much talk about how a lot of conservative women should
00:48:33.000 have their genitals mutilated because they were claiming that these conservative women
00:48:37.760 didn't deserve to be women for that reason because they're not going with the feminist
00:48:41.160 side.
00:48:42.660 There was also a lot going on towards the gay rights where they were saying that they wanted
00:48:47.740 to indoctrinate kids as early as two or three into gay rights activist group, which I
00:48:55.640 am a person that stands for gay rights, but I'm not a person that would ever, like, go that
00:49:00.840 forward to someone that's just a young child.
00:49:03.700 There was also much talk of grooming young children into being drag queens, and there was also much
00:49:11.620 talk going forward about how, basically, they wanted to start segregating Democrats and Republicans
00:49:19.060 or primarily Democrats and Trump supporters and wanting to, like, slash the tires of their cars
00:49:24.640 or, like, deface their housing or threaten them in public, and there was a lot of these
00:49:30.380 actions that were starting to become real-world consequences.
00:49:34.100 And so, as I said, there are plenty of crazies on the left side that were creating real-world
00:49:41.740 harm.
00:49:42.780 Now, what access did you have?
00:49:44.920 And earlier on in the messaging, you said the general queue was photos, memes, text messages,
00:49:53.240 instant messenger.
00:49:54.500 So you guys were able to see what's being said in the instant messenger and text messages
00:49:59.320 or no?
00:50:00.260 Yes.
00:50:01.060 Also, whatever we post there and we're communicating, you see all of it?
00:50:04.660 Yes.
00:50:05.380 Absolutely.
00:50:05.900 Everything that's communicated here, if you wanted to see all of it, you can easily see
00:50:13.520 it.
00:50:14.100 Yes.
00:50:14.720 And you have access to who?
00:50:17.160 Like, is there levels of clearance on who you have access to, want to be able to go see?
00:50:22.140 Let me see if this one guy I don't like, let me see what he stands for.
00:50:25.280 I'm going to go see what this guy's saying to this person here.
00:50:27.620 Could you do that, or was there limitations to what they sent to you?
00:50:30.820 No.
00:50:31.680 In the general queue, there were limitations.
00:50:33.940 Where I was, there were not.
00:50:35.020 Were you ever tempted to see some conversations of people that were higher-ups or no?
00:50:40.680 I was tempted, yes.
00:50:42.100 But not specifically higher-ups.
00:50:44.320 I have to be honest, politics isn't my thing.
00:50:46.780 I was more curious about seeing what my cousin was saying up in Georgia.
00:50:50.680 But if I have to be honest, I'm just-
00:50:54.140 Imagine you're like, my cousin doesn't like me.
00:50:57.180 I'm not a political person, even though when I was at Cognizant, it was very political in
00:51:03.180 there.
00:51:03.420 But I'm just not much of a politics person to begin with.
00:51:08.300 And you know what's the unique thing about talking to you?
00:51:11.400 You're very sincere.
00:51:12.260 You're just like, this is who I am.
00:51:14.000 This is what I'm doing.
00:51:15.820 Were you, again, I don't even know, like, what do you know about shadow banning that we
00:51:21.040 don't know about?
00:51:21.760 You know, like, you know, when you see sometimes you create content and all of a sudden you're
00:51:26.640 like, wait a minute.
00:51:27.700 Like, you'll see videos that are momentum, momentum, momentum.
00:51:31.880 And typically when a video's momentum goes away, it's a drop-off like this, right?
00:51:37.860 And it goes away.
00:51:38.680 It's just kind of how it works.
00:51:39.720 I mean, I've been doing this for a long time.
00:51:41.000 We got a few billion views total online.
00:51:43.060 But sometimes you would see video goes like this, like this, like this.
00:51:45.980 And then sometimes it just goes, boom.
00:51:48.180 That's not possible for a video to do that.
00:51:51.460 There's got to be something that automatically brings it down 90% within a second.
00:51:56.320 Is shadow banning something that's a real thing that's taking place?
00:51:59.820 And who behind closed doors is moderating who they don't want to get exposure and who
00:52:05.220 they want to get exposure?
00:52:07.300 So the whole process of shadow banning is, what they're trying to do is take content that
00:52:12.360 they feel is either offensive, controversial, or any sort of content that they feel
00:52:18.860 like will cause real-world harm.
00:52:21.020 They will tend to try to stop it before it becomes too viral.
00:52:25.460 So they don't want it to become a viral message.
00:52:27.540 They don't want it to go mainstream.
00:52:29.020 So they usually just try to cut it off right at that tipping point, right before it would
00:52:34.100 become viral or mainstream.
00:52:36.240 The people that focused on that type of content were not at Cognizant.
00:52:41.260 They were with the Facebook queue in general.
00:52:44.240 They were the ones that actually worked with Facebook itself.
00:52:48.100 But for shadow banning, as I said before, it was very much that they were just trying to
00:52:52.760 cut off things that were almost on the verge of being viral.
00:52:57.160 There were things that could have, that could range from hate speech.
00:53:01.500 There were things that could range from political ideology.
00:53:05.060 They could range from things that could be misinformation about medical uses, medical
00:53:10.960 diagnostics.
00:53:12.100 I mean, I'm sure a lot of people have seen everything go along with the COVID-19 response
00:53:18.700 and how Facebook has been working with that.
00:53:22.200 So it's just things that are going and they're on the verge of being viral, but they have not
00:53:29.020 quite made it yet.
00:53:30.040 They just kind of cut it off right then and there before it gets to that point.
00:53:34.080 Did you see the article that came out a couple weeks ago saying that Facebook has
00:53:39.380 an elite group of users, I think it's like five to six million people, that no matter
00:53:44.000 what they say, no matter what they put up. The article talked about how Neymar had a picture
00:53:48.800 of some girl that was trying to sue him, and he posted a picture of her naked or something like
00:53:54.360 that.
00:53:54.580 I don't know the exact story.
00:53:55.660 So here, I'm going to put my little plug in here for another thing.
00:53:59.460 The Wall Street Journal has a podcast and they're doing the Facebook files and so that's
00:54:05.800 where I heard that story from.
00:54:07.740 Yes, it was like some Brazilian soccer player.
00:54:09.960 He was very popular in that country.
00:54:12.440 But yes, so apparently, according to these Facebook documents, there is this large group
00:54:17.500 of people and it's not even so much that they're big shot celebrities.
00:54:21.860 They can just be regular Joe Blows, as long as the people at Facebook add them to the list.
00:54:27.340 And these people just get immunity from almost everything.
00:54:31.680 And I thought what was very interesting is that Donald Trump was included into that elite
00:54:37.440 group of people that nothing bad could happen to him.
00:54:41.580 But apparently...
00:54:42.760 Trump was in that group.
00:54:43.360 Trump was in that five to six million.
00:54:45.220 Yes.
00:54:45.760 Huh.
00:54:46.240 Interesting.
00:54:48.780 Go ahead.
00:54:49.440 You were saying something, but apparently...
00:54:50.760 I was just saying I just found that very interesting that he was part of that group.
00:54:54.180 And then everything after January 6th, he was taken out of that group out of fear of real
00:55:00.300 world retaliation after what happened at the Capitol.
00:55:04.680 Sean, let me ask you, how much, since you were in this for like seven months, you know,
00:55:08.920 sometimes you have a bad experience with a company or something personal that happens
00:55:12.180 to you, whether it's, you know, testing with a drug.
00:55:16.220 You take a drug and next thing you know, your family member gets addicted to Vicodin.
00:55:21.020 They can't get off of Vicodin and they harm themselves.
00:55:23.700 So for the rest of your life, all you're doing is studying Vicodin, right?
00:55:27.200 And doing whatever you can, because you're just like obsessed to get to the bottom of
00:55:31.300 it to see what really happens.
00:55:32.460 Sometimes in the military there's an accident, somebody dies or there's a suicide, and all these other
00:55:38.480 things that happen.
00:55:39.180 You commit your life to wanting to know more about it.
00:55:42.060 Did this kind of get you to say, you know what?
00:55:44.180 I want to kind of get to the bottom and see what the hell is going on.
00:55:46.620 Or were you kind of like, I don't want to have anything to do with this.
00:55:48.500 I'm moving on in my life.
00:55:50.420 So I just want to say that there's no sort of revenge scheme I have against Facebook.
00:55:55.680 I'm just one little person.
00:55:57.660 I'm not here for that. Like, if I was trying to go after Facebook, I would definitely have
00:56:02.740 tried to go on, like, Oprah or many other different shows and tried to tell my story.
00:56:08.620 For one, I don't really want to be famous known as, hey, you're the guy that watched
00:56:13.180 fetuses be smashed with hammers for 10 hours every day, six days a week.
00:56:17.320 Like, that's not really what I would want to be known for.
00:56:20.840 The reason that I'm doing this and I speak out is because when I started working with
00:56:28.240 Facebook, I thought that the least I could do is try to help the people and animals whose
00:56:35.500 last moments were being desecrated.
00:56:37.580 Even if it's on video or picture, people that were mocking it, making fun of it, getting
00:56:43.120 off to it with sexual pleasure, I wanted to at least help these animals and people in
00:56:50.580 their dying moments and at least give them closure.
00:56:55.360 That's all I wanted to do is try to help.
00:56:58.160 And they even kind of sold it to us when we first started with training is that you're
00:57:02.080 going to be the police of Facebook.
00:57:03.980 You're going to be policing it, monitoring it.
00:57:06.140 You're going to make it a safer place.
00:57:08.240 And I really bought into that.
00:57:09.700 I wish I could, you know, wash my hands of it all and just go, I don't
00:57:20.680 care anymore.
00:57:21.580 I don't want to hear it anymore.
00:57:23.880 But there was, there was too much that happened on Facebook that should not have ever been allowed.
00:57:31.880 And I would probably put money on it.
00:57:35.720 Those same people are still making that same content.
00:57:39.580 One thing that really got to me, and this happened last year.
00:57:46.920 And there was this woman who was an activist.
00:57:51.820 And she was telling me that she was going to Congress to talk about these specific things.
00:57:57.060 So we talked a couple of times and last year she called me and she said, did you work on
00:58:04.120 these specific types of things in Texas?
00:58:07.240 And it was about these girls that were being held in their parents' basement and being used
00:58:13.780 as like sex toys and prostitutes.
00:58:16.880 And I said, yes, I did work on that.
00:58:20.160 Apparently the authorities just found out at that time and arrested them.
00:58:27.060 And that was two years after I left Facebook.
00:58:30.920 And I wrote the report on that.
00:58:34.500 And I did my best to make sure the authorities would try to get it, make sure Facebook would
00:58:39.940 take it seriously, but Facebook wouldn't let it get out.
00:58:43.860 And so those girls suffered for two extra years.
00:58:49.340 And there were many other cases similar to that.
00:58:53.620 And they wanted me to confirm that I actually sent a report in and discussed
00:59:00.380 that stuff.
00:59:03.220 And it just makes me mad that they suffered two extra years because Facebook couldn't even
00:59:08.860 do the right thing and let the authorities know what it was.
00:59:13.220 There was another instance.
00:59:17.380 And it was one of the first things I saw.
00:59:20.720 And it was these two boys.
00:59:23.280 And they grabbed an iguana by the tail.
00:59:25.640 And they smashed the iguana onto the pavement.
00:59:30.520 And the iguana was just screaming.
00:59:33.320 I've never heard an iguana scream.
00:59:35.420 And I don't think most people have.
00:59:38.060 And they kept doing it and doing it and smashing the iguana onto the pavement until it was just
00:59:43.220 a bloody pulp with a tail.
00:59:45.720 And apparently those boys had their own private Facebook page where they were bashing other
00:59:53.420 animals to death as well.
00:59:55.180 And they never took it down.
00:59:57.080 That iguana video was still up.
00:59:59.460 And they were making more videos of more animals they were killing.
01:00:04.040 And they were getting paid for it.
01:00:05.520 So that's why I want to get this message out is because people are suffering and nobody's
01:00:14.840 doing anything about it.
01:00:18.340 And you may think that it's all just on a different screen.
01:00:22.240 It's not real.
01:00:23.680 They want you to be desensitized to it.
01:00:27.040 But people are actually making money off this thing.
01:00:29.840 And I think that's the most disgusting thing in the world is that people are getting paid.
01:00:34.560 And there are people that want to pay you to make this content.
01:00:39.720 And that's why I do this.
01:00:41.620 And yes, I would like to get to the bottom of this.
01:00:43.640 I would like to know more about what the heck they're doing in there.
01:00:49.560 For somebody, you know, for somebody who's on the inside, Sean, whose world this is,
01:00:58.740 what can Facebook do?
01:01:00.780 Meaning, is it in their control to be able to address this and stop it?
01:01:05.100 That's one.
01:01:05.780 So let's just say the answer is yes.
01:01:07.080 If the answer is yes, what can they do?
01:01:09.380 They've been talking about AI for a long time.
01:01:13.120 You know, from your perspective.
01:01:15.740 And again, Sean, I appreciate you for sharing that.
01:01:18.620 And it's 100% felt from speaking to you.
01:01:21.460 It's very obvious that you're not doing this for a dollar.
01:01:25.560 You're doing this because, you know, animals and people matter to you.
01:01:29.420 Even when you said earlier, I'm not a politics guy.
01:01:31.220 I don't follow politics.
01:01:32.180 I don't know what's going on with politics, but it's not my world.
01:01:34.860 But what would you say?
01:01:36.520 What would you say, you know, on what they can do to eliminate all of this?
01:01:40.940 Is it AI?
01:01:41.520 Is it internal team?
01:01:43.540 Is it, what do you think that is?
01:01:46.260 So the first thing that they truly need to do is if Facebook really wants to make this change,
01:01:54.460 is Zuckerberg has to step down.
01:01:56.640 Because Zuckerberg is kind of, he's the one in the way of all this.
01:02:00.900 He's the one that makes all decisions.
01:02:02.900 He's the one that signs off on everything.
01:02:05.800 And this is Zuckerberg's creation.
01:02:08.380 So if we had someone that was not him in control, then perhaps we could actually get something done.
01:02:16.480 Because I don't see Zuckerberg ever bending the knee, so to speak, to actually make these changes.
01:02:24.980 What I think needs to change is that Facebook needs to be divided up into different segments.
01:02:32.220 And then have these different independent individuals actually be in charge of these segments.
01:02:38.380 So maybe we really do need some sort of government influence looking into this.
01:02:44.800 I know a lot of people will claim that that would be, like, overreach by the American government.
01:02:50.460 But when something has grown this big and it has such an importance in not just our society, but world society,
01:02:59.260 maybe it needs to not be a private company and maybe it should be some sort of public entity.
01:03:03.860 I mean, Facebook is so big, along with other social media sites, that maybe we shouldn't consider it a private company.
01:03:15.620 Maybe we should consider it as something as a public square.
01:03:18.820 Maybe it should be considered something that would be the equivalent of going out in public and saying something and not something that is controlled by a private entity.
01:03:29.520 The only other thing I could possibly think of Facebook to do is maybe Facebook just needs to do a total revamp of their policies.
01:03:40.980 Because when I was working there, even though the policies were changing by the day, they were only changing by specific current events and they would only change for like specific people.
01:03:49.940 So like a great example would be during the Justice Kavanaugh hearings.
01:03:55.640 I'll use that example again.
01:03:57.620 Normally, you're not allowed to make any sort of disparaging bullying comments about people that claim that they are survivors of rape or molestation or anything like that.
01:04:08.540 But Facebook made an exception with Dr. Ford.
01:04:11.580 So you were actually allowed to make fun of her for that.
01:04:14.880 Oh, I apologize.
01:04:16.600 It's okay.
01:04:16.860 They made an exception.
01:04:19.960 You were saying they made an exception for Dr. Ford.
01:04:22.440 Yes.
01:04:23.000 So everybody could make fun of her.
01:04:25.180 Everybody could like claim that she was asking for it.
01:04:28.960 She was a slut.
01:04:29.800 She was a whore.
01:04:30.620 They could claim that.
01:04:32.400 And in normal cases, you're not allowed to, because it would get deleted.
01:04:35.140 But those were like the examples of like policies changing for specific people or specific events and their policies were so hyper specific.
01:04:46.580 I think in a normal ruling body, we could all agree that, you know, killing, killing animals online should be banned.
01:04:56.740 I don't think there's any reason why we should have it on there.
01:05:01.460 But apparently, there are still, like, a lot of stipulations, like you can kill animals on video as long as it's in, like, an eating, preparing-food setting.
01:05:13.160 Or you can also kill animals if it's in a perceived self-defense setting, and there were a lot of people that took advantage of those, and they just, like, mutilated a bear because they claimed that they were hunting the bear in self-defense.
01:05:25.860 And so they had a caption that said, like, you know, we're killing the bear in self-defense.
01:05:32.980 And they were like ripping its jaw off and like just torturing the thing while it was still alive and drugged up.
01:05:38.580 And Facebook allowed that because the caption said, oh, it was in self-defense.
01:05:42.100 So, of course, we can allow that.
01:05:44.320 So it's just like these policies are just they're so easy to find loopholes on.
01:05:48.880 Crazy thing I'm going to throw out there.
01:05:50.020 Tell me what you think about this.
01:05:50.960 Do you think there's any chance that maybe a large, you know, a country like China has additional motives on the inside, where they want to hurt this great country of America and they want to figure out a way to internally destroy the younger generation?
01:06:13.040 You know, any do you think China has any influence over Facebook?
01:06:18.540 It's hard to say.
01:06:19.840 If I had to be honest, I can't give a yes or no answer on that.
01:06:25.360 I respect that.
01:06:26.860 I respect that.
01:06:28.520 You know, sometimes you read articles about the hand a country like China has in these major social media companies.
01:06:37.380 And the other part is typically when you see one person gets off on a site, everybody else follows suit.
01:06:42.840 Do you think there is a coalition amongst all the major social media sites where they work together where if one bans somebody, the other follows as well?
01:06:50.340 Yes. Yes.
01:06:51.900 Yes.
01:06:52.140 But only when it's people that are in a higher position.
01:06:55.380 So if it's just, like, a regular Joe guy, they're not going to do that.
01:06:59.440 But if it's someone that is more influential, like, as you said, Alex Jones before, of course, they're going to follow suit on that.
01:07:05.960 Of course, they did the same thing with Donald Trump.
01:07:08.380 If it was a different sort of celebrity, I'm sure that they would have all gotten together and gone like, yeah, let's all do a joint ban on him.
01:07:16.040 That makes sense.
01:07:16.520 Do you think Trump should have been banned or Alex Jones should have been banned?
01:07:21.280 So that one's a difficult question, because, if I understand correctly, Alex Jones was the man who claimed that the Sandy Hook shooting was a false flag and that the kids that died in that were actually not killed and they're still alive somewhere and the parents are lying.
01:07:40.380 So Alex Jones, I don't know his specifics of everything he said.
01:07:49.360 Has he claimed has he claimed any sort of real world harm against anybody?
01:07:55.620 Has he personally?
01:07:57.880 Have any of his posts, anything, have they claimed any real-world harm?
01:08:03.600 Like, has he like claimed that he would like do something to someone?
01:08:06.780 Any specific groups? Has he been antagonizing like a specific demographic of people?
01:08:13.080 I couldn't verify that.
01:08:16.100 So that one I would honestly have to look up because guess what?
01:08:21.480 The Internet is made for conspiracy theories.
01:08:23.400 It's made for the crazies.
01:08:24.540 I mean, that's what the Internet is.
01:08:25.940 Go to any Web site.
01:08:26.920 You'll find that anywhere, even Wikipedia.
01:08:29.300 I mean, I'm not dissing Wikipedia.
01:08:30.960 I just mean it's that's the Internet.
01:08:33.420 The Internet was made for insane theories.
01:08:35.100 That's just what it is.
01:08:36.780 But regarding Donald Trump.
01:08:40.340 So what happened at the Capitol was incredibly weird.
01:08:45.000 I personally do think that he riled up his fans to go do that.
01:08:51.420 I don't really know if there was an ulterior motive to it, but I do know that I would consider him to be the one that, you know, lit the match, so to speak.
01:09:02.420 But I.
01:09:05.000 You see, these are tough questions, because I don't want to come on here and have people thinking that I'm, like, politically motivated one way or another.
01:09:11.720 So I am trying not to come off that way.
01:09:13.820 But I guess just for the most part, it's really tough.
01:09:19.280 Because from an ethical standpoint, you could make that argument that it's a public forum.
01:09:25.420 He did say everybody to go home afterwards.
01:09:29.560 He did.
01:09:30.900 He actually did make that post.
01:09:32.680 I remember seeing that where he said, like, everybody go home.
01:09:35.560 We love you.
01:09:36.380 I understand that.
01:09:39.440 And I understand that he had broken other policies before.
01:09:46.500 But as a public figure, especially the president of the United States, it's really difficult to say, like, if you're allowed to break it.
01:09:55.300 If we treat him like a normal person, if he did not have the checkmark next to his Twitter account, yes, he should have been banned.
01:10:02.000 Because as a normal person, he was breaking the rules and the policies.
01:10:07.420 But since he was the president of the United States, I feel like the social media companies should have had more of a sit down with him and say, if you're going to do diplomacy through social media, we need to set something up that can actually make that happen.
01:10:24.920 Maybe social media needs to grow and figure out how to do diplomacy through Twitter.
01:10:33.400 Maybe that is an idea that we think is really stupid, but it also should be something that maybe we should actually think about.
01:10:42.980 Maybe it could be realistic in the future.
01:10:45.500 Because if you're doing it through social media and Twitter, at least you're, like, speaking to the public and actually getting your point across so everybody knows where you stand.
01:10:57.420 It's a very difficult subject.
01:10:59.980 I mean, if I said, yes, he should be banned, there's more to it than that.
01:11:04.320 And if I said, no, he shouldn't be banned, there's more to it.
01:11:06.900 These are subjects that require more than just a yes or no answer.
01:11:11.680 Is your birthday, are you a September or October baby?
01:11:14.460 What month are you?
01:11:16.460 June.
01:11:17.340 Believe it or not, I'm actually June 13th.
01:11:19.740 So I'm actually, like, I think a day before or after Trump's, the president's, birthday.
01:11:25.320 Yeah.
01:11:26.040 But the way you're reasoning and the way you're going, very, very interesting.
01:11:31.740 Do you think most of these moderators are politically motivated or no?
01:11:34.960 Like, are some of them?
01:11:36.000 Yes.
01:11:36.660 I'll just say yes, because that's all I heard.
01:11:38.960 That's all I heard when I worked there.
01:11:40.320 There were so many people that were of different political backgrounds.
01:11:45.060 I had many people that were hard Democrats that hated anything a Trump supporter said, and they'd delete it.
01:11:51.180 And then on the other end, I heard people that were hardcore Trump supporters, and they were just like, I hate libtards.
01:11:57.020 I want to get rid of this stuff.
01:11:58.320 So I heard it, and there was constantly bickering and fighting in between there.
01:12:04.840 And I can say for a fact that some of the team leads I worked under, they were politically motivated because they would flat out tell us their thoughts.
01:12:13.640 So, yes, the moderators I did work with were politically motivated.
01:12:18.340 Sean, I got to tell you, I've really enjoyed talking to you.
01:12:23.500 I haven't enjoyed the stories, like, visually for my mind to go there, and, you know, it's extremely disturbing.
01:12:30.260 But your approach, you having the courage to go out there and talk about this, where you're getting the audience to think about these issues that come up day-to-day on a platform that we all use on a daily basis.
01:12:43.860 All of us are on Facebook, Instagram, Twitter, YouTube.
01:12:46.900 It's part of our daily lifestyle.
01:12:49.500 It's, what, 30 years ago somebody would go home and they would turn on the TV to watch something.
01:12:54.800 Today people turn on their phones to look at what's going on.
01:12:58.580 And you seem very sincere.
01:13:02.460 There was nothing that felt like there was a motivation there for you on one side or the other.
01:13:06.120 You're just kind of being a matter of fact and sharing your thoughts on this.
01:13:10.540 And I appreciate that.
01:13:11.860 I'll give the final thoughts to you.
01:13:13.020 If you want to say anything final to the viewer, I'll leave it up to you.
01:13:17.780 Thank you.
01:13:18.460 Thank you.
01:13:18.880 For all those that are watching, thank you very much for tuning in and hearing my story out.
01:13:25.660 If I could get any sort of message across, it would be that going forward with social media, we need to be more careful with how we go about it.
01:13:37.300 We need to be more careful on how we interact through social media.
01:13:41.460 We need to be more careful about how we treat others through social media.
01:13:46.580 And finally, the biggest point I wanted to get across about this is Facebook has this horrible content that I have spoken about with animals, people, babies, toddlers, women, children.
01:14:00.520 And Facebook refuses to take action on it.
01:14:04.300 And for anyone that's listening that has any sort of anger from this like I do, please, let's try to stop this.
01:14:16.160 Because the last thing that we want is for more of this violence to happen on there and more people to make money off of animals and people suffering.
01:14:27.160 Thank you.
01:14:27.740 Sean, appreciate your time, buddy.
01:14:29.440 Thank you so much.
01:14:30.780 All right.
01:14:31.260 Thank you very much for having me on again, sir.
01:14:33.420 Anytime.
01:14:33.960 Thank you.
01:14:34.340 By the way, just out of curiosity, what are you thinking about what you just heard right now?
01:14:37.880 Very, the messaging.
01:14:39.460 Like, yeah, we can check your messages.
01:14:41.800 And then the one-sidedness with moderators, their job, how it works on the outside, how they report it, what stays up, just because it's a private group.
01:14:49.640 What are you thinking right now?
01:14:50.700 What's your biggest takeaway?
01:14:51.540 I want to hear your thoughts.
01:14:52.380 Comment below.
01:14:53.600 And if you were enlightened by this interview, there's two other videos I think you'll like.
01:14:57.920 One of them is one I did on Cambridge Analytica.
01:14:59.960 Brittany Kaiser, who is a whistleblower; she and I sat down and talked about it.
01:15:03.300 Click over that.
01:15:04.260 Click here to watch that interview.
01:15:06.700 And the other one is what John McAfee had to say about social media and his level of trust for social media, cell phones.
01:15:12.820 He took a different angle.
01:15:13.840 And he's not here with us, but this is an interview I did with him four or five years ago.
01:15:17.400 If you've not seen it, click over here to watch that as well.
01:15:19.440 Take care, everybody.
01:15:20.060 Bye-bye.