Warning: This article includes content that some readers may find disturbing
"Heads don't explode like you see in video games. It's absolutely horrifying. It's over a year since I saw that and I still see it regularly," says Chris Gray, who was employed as a content moderator for Facebook in Dublin between 2017 and 2018.
Gray was part of an army of about 15,000 people in 20 locations around the world whose job is to decide what content should be allowed to stay on Facebook, what should stay up but be marked as “disturbing”, and what should be deleted.
A year on, he says he can still picture clearly some of the content he moderated: Islamic State executions, murders, beatings, child exploitation, the torture of animals. Compounding the stress of exposure to disturbing images and videos was, he says, the pressure to make the right decision about what to do with it.
“You’re in constant jeopardy of making a wrong decision and being penalised for it. That multiplies the stress when you’re looking at difficult content.”
Facebook's content moderation policies have come under scrutiny recently, after the massacre of 50 people in New Zealand was live-streamed on the platform. The social media site has said that 1.5 million attempts were made to upload the video, and that 1.2 million of these were "removed at upload". The 300,000 that made it on to Facebook were subsequently removed.
In the aftermath of the massacre, Facebook announced last week that it had banned praise, support and representation of white nationalism and white separatism – a task that will fall both to technological solutions like artificial intelligence, or AI, and to human moderators.
A number of current and former Facebook content moderators, based in the United States, have recently spoken anonymously about their experiences of doing what the Wall Street Journal called "the worst job in the US".
Three former content moderators have launched a class action lawsuit against Facebook in a California superior court, alleging that they suffered from symptoms of post-traumatic stress disorder as a result of the repeated viewing of violent videos.
The terms and conditions of their employment are designed to protect content moderators from the threat of repercussion by preserving their anonymity. But they also have the consequence of preventing them from discussing a role in which there is a growing public interest.
Having consulted a lawyer, Gray has decided to speak publicly about his experiences. “This is not just about moderators having to view difficult content. The way that we moderate all social media is of huge importance to society generally. And I believe it’s not being done right. There is a real-life impact from the things we see on social media. There’s online bullying. There’s the quiet spread of racism.”
He feels that Facebook “is not taking it seriously enough”, a claim that Facebook denies. It says it takes these reports “incredibly seriously”.
As with the content moderators working for the social media platform in places like Arizona, Gray was not employed directly by Facebook. His contract was with the Dublin-based company CPL. He worked in a building in central Dublin with no Facebook logo over the door.
Gray can remember meeting only one Facebook employee during his nine months there. He was paid a basic rate of €12.98 per hour, with a 25 per cent shift bonus after 8pm, plus a travel allowance of €12 per night – the equivalent of about €25,000 to €32,000 per year. The average Facebook employee in Ireland earned €154,000 in 2017.
Not all of the content moderators handle is disturbing: much of it, he says, is “soul-destroying . . . banal, petty, awful” – threads of comments between people bickering and reporting one another.
But some of the content is so disturbing it stays with him more than a year later. “Imagine you see a video and there’s somebody in an orange jumpsuit, and he looks terrified. And behind him there’s a guy dressed all in black with his face obscured, and he’s got one of those big Kalashnikov machine guns. And the guy in orange has to kneel down and the guy in black stands behind him with a gun, pushes it to his head, presses the trigger.”
What happens next, he says, is “not like a video game”.
As a content moderator, Gray says, you’re not just looking at the video of an execution by Islamic State, aka Isis, through the eyes of a horrified onlooker. You’re also trying to analyse it objectively.
“You’re looking at it, and you’re thinking, What’s the person posting this saying about it? Are they condemning it? Are they supporting it? Is there any Isis symbolism? Is there any praise for Isis?”
You might know you have to delete it, he says, but “you’re asking yourself, Do I delete it for terrorism? Do I delete it for graphic violence? Do I delete it for supporting a terrorist organisation?”
“There are grades of decision making, and if you get it wrong by just a little bit it still counts as a mistake. And that counts [against] your quality score, and you might be fired. You’re not just looking at it objectively; you’re trying to second-guess the system.”
Quality scores
Over and over again, in my conversations with him and another former Facebook content moderator, “Alex”, who spoke to The Irish Times on condition of anonymity, I hear about “quality scores”.
The former moderators say they felt under constant pressure to achieve a 98 per cent quality rating – meaning that auditors agree with 98 per cent of their decision making on a random sample of tickets. “Initially it was 95 per cent and CPL increased it to 98 per cent. You’re scrutinised so much even for the smallest mistakes,” says Alex.
The job involved moderating between 300 and 400 pieces of content – which they call “tickets” – on an average night. On a busy night, their queue might have 800 to 1,000 tickets. The average handling time is 20 to 30 seconds – longer if it’s a particularly difficult decision.
Even if a decision to delete was correct, if you choose the wrong reason for deleting, it counts as an error. “A lot of times, I wake up in the middle of the night and think suddenly ‘Oh my God, I missed a nipple’. That has happened quite often,” Gray says.
In a statement, a Facebook spokesperson said that content reviewers were not given targets “for either the amount of time a job might take or for the amount processed on any given day. In fact, we specifically instruct our partners to not put this sort of pressure on reviewers.
“Some type of content, like nudity, is easy to establish and can be reviewed within seconds,” says Facebook. “Others, like impersonation, take longer to confirm.”
Reviewers are encouraged “to take the time they need to make a determination”, Facebook says.
The spokesperson added that Facebook is “committed to providing support for our content reviewers as we recognise that reviewing certain types of content can be hard, so we take reports like this incredibly seriously. Everyone who reviews content for Facebook goes through an in-depth, multi-week training programme on our community standards and has access to extensive psychological support to ensure their wellbeing.
“At CPL, this includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment. We are also employing technical solutions to limit their exposure to graphic material as much as possible. This is an important issue, and we are committed to getting this right.”
I ask Alex why he felt the quality scores mattered so much. “Because your job depends on them on a week-to-week basis.”
How did that impact on him? “Put it this way. The first ever ticket that I did, I had to watch a person being beaten to death with a plank of wood with nails in it, and being stabbed numerous times. And that was also the first ever mistake that I made. Because I deleted it. But at the time it didn’t qualify as gore.
“That’s probably the hardest part of the job. It is not the first time that you see the content, it’s when you go through the mistakes that you made with the auditor, and you’re watching it for the second or the third time, and you’re trying to argue your case against them. You’re trying to say that you made the right decision, [but] you still have to watch that material over and over and over again,” he says.
Child pornography
Facebook doesn’t comment publicly on its moderation guidelines. But it is understood that content is removed if it glorifies or celebrates violence, is shared for sensational viewing pleasure, or shows visible internal organs or dismemberment. Beyond that, it is marked as disturbing and made unavailable to those under 18.
"One thing that is always really distressing is seeing people torturing animals, but because it's in the category of 'culinary', you have to keep it on a platform." Alex gives the example of the Yulin dog meat eating festival in China. It took me just a few seconds to find upsetting, graphic videos of the festival, some of them marked as disturbing, on the platform.
Gray describes a video of a pig being beheaded to the soundtrack of Peppa Pig. “It shocks you . . . It makes you . . . ” his voice trails off. “And then there are the people who trawl the internet and collect videos of violent deaths [of people]. And then they splice them together, and they overlay them with happy music, rap music, celebratory music.”
Having been a fan of horror movies, Alex thought he was prepared for the job. Now he says horror movies are like comedies. “The second day on the job, I had to watch someone f***ing a dog. Bestiality. Or zoophilia,” he says.
“The second week on the job I got sick because it was the first time that I ever had to see child porn.”
Did you actually throw up? “Yes, I had to step away from the computer and use the bathroom.”
The Facebook spokesperson said that “human eye and judgment” are required for effective moderation. “What happens in the world makes its way on to Facebook, and that unfortunately often includes things that don’t belong on our platform and go against our community standards. This content therefore needs to come down.
“AI has made massive progress over the years in many areas, which has enabled us to proactively detect the vast majority of the content we remove. But it’s not perfect. Many of the decisions we make include cultural nuance and intent, which still require human eye and judgment, which is why people will continue to be part of the equation.”
24/7 support
Facebook says content moderators have access to a trained counsellor on site, night and day, along with break-out areas, where they can step away from their screens, and peer supervision and support. It is up to the individual how much they want or need to take advantage of these supports and wellness resources.
But, says Alex, the counsellors can only do so much. “It’s like shaking up a bottle of Coke, and opening the bottle just a little bit. It only releases some of the tension. There is a graveyard shift and there is an evening shift, and [if] you’re going to get home at eight in the morning, or three in the morning, and you’re going straight to sleep, the last thing you saw was these images . . .
“There are points where you become numb, when you are dealing with a lot of that kind of material. But it’s not in the job that it affects you most. It’s outside of the job. Politically and emotionally, you become kind of disconnected from people. You can’t really talk about it with your family or with your friends, because people don’t want to hear about that. People want you to move on. But how do you move on from that type of material?”
Gray says he saw the counsellor “once a month; maybe once every six weeks. I didn’t have time. I didn’t get paid to spend my time walking away from my desk once a day to talk to the counsellor, who is equally not paid to spend all her time on me.”
Facebook didn’t respond directly to claims made by the moderators of multiple staff lay-offs during their time there, but it pointed out that many steps are taken to help content reviewers who are struggling with particular content or accuracy, including additional training and a performance improvement plan.
In a statement, CPL said: “We care deeply about our employees and take any concerns they raise very seriously. We provide extensive training and support to everyone working to review Facebook content, to ensure their wellbeing . . . We are working to understand what happened here, and encourage these individuals to share their concerns with us directly.”
In common with the accounts of Facebook moderators published in the US, Gray describes noticing, after a time, that his own views and attitudes had begun to change. “Your values change because you’re reading the same bulls**t all the time,” says Gray.
“You’re not reading positive stories. You’re not reading nice stuff. You’re always reading conspiracies and nastiness. And even though you’re consciously saying ‘that’s not true’, you stop processing it. You’re exposed to hatred and nastiness all the time,” says Gray.
On the floor, he says, “I would hear people making racist jokes and thoughtless stupid comments. And we’re supposed to be in the business of policing thoughtless stupid comments.”
Gray says he was affected too, over time. “I think I am more right wing. I’m married to a brown Muslim, and we are both immigrants, and I’ve travelled all over the world. I was the hippie love everybody type of guy. Now I’m much more likely to argue with people. I’m much more likely to be anti-immigration and so forth.
“I’m consciously trying to be the person that I used to be. But you get these flashbacks, and you find yourself saying things that you heard somebody saying months ago [on Facebook]. Or you saw in a Britain First video. It’s still all there, in the back of your head. All the hate, all the nastiness, it’s all there inside you.”
“I thought I was resilient,” Gray says. “But looking back . . . I had a discussion with my team leader about the training. And he kind of pushed back. And I basically lost it. I wasn’t shouting or screaming, but I was shaking. I was crying. This is not me. I’m a 50-year-old guy. I’ve been all over the world. I’ve seen all kinds of stuff. Why am I behaving like that?”
Exhaustion and flashbacks
Gray was let go from the job after nine months. The reason given was his quality rating. But he believes his symptoms of stress may have been a contributory factor in his dismissal, rather than being recognised as a sign that he needed more support.
Months after being let go from the role, after his quality score was affected by what he says was exhaustion and overwork, Alex is still suffering flashbacks, sleep disturbances, and a feeling of disconnection from friends and family.
“There are still times where I cannot sleep at night. I’m still dealing with this experience. Some nights you wake up in an extreme cold sweat, remembering the experience. It is harder to stay optimistic about pretty much everything.”
Dave Coleman, a solicitor who is representing Gray as he explores legal options, believes there has to be a better way to moderate content. "There could be viable alternatives to having an almost live Facebook, in the same way that radio programmes are broadcast with a few seconds’ delay.
“All employers have a duty to provide a safe place of work and a safe system of work. In cases where this does not happen, and the employee is injured, all companies are liable,” Coleman says.
It is Facebook’s hope that, in the future, more and more content moderation will be done by technology, allowing it to be less reactive and more proactive about identifying harmful content. But the technology is limited and, for now, say the moderators, it’s too easy for people intent on posting harmful content to find workarounds. “There are always going to be borderline cases and the system is not designed to help you manage those borderline cases,” says Gray.
As long as that remains the case, there will be an ongoing need for human moderators. But at what cost? And if the work they do is so important, is it valued enough?
Yes, says Facebook. “This is an important issue, and we are committed to getting this right.”
A couple of weeks after we first spoke, Gray sent me a voice message about why he decided to go public. “I’m just tired of being invisible. We are in an anonymous office block without a Facebook logo on it. We are never allowed to show our faces or give our names. We kind of don’t exist, and we are not part of Facebook. But the work matters.
“If your kid is being bullied online, I was the one who had to take action on that. If one of your relatives is contemplating suicide or posting images of self-harm or says that they’re going to hurt themselves, I had to deal with it.
“We were constantly working for society, and the general public has no idea who we are, and what we go through on a daily basis. I want people to see that we are real human beings and I want them to decide how much they value what we do.”