Facebook's moderation guidelines leak, revealing how the site handles graphic content

The Guardian has published a series of reports detailing Facebook’s guidelines for moderating graphic content, shedding light on the company’s internal processes.

Facebook’s moderation process has remained shrouded in mystery in the years since the social network achieved global scale. Now, thanks to The Guardian, we have one of the most telling looks yet at how the company moderates graphic content.
The full series, dubbed the Facebook Files, shows the company’s manuals dealing with issues such as non-sexual child abuse, graphic violence, and cruelty to animals.
Read: Messenger and Instagram will share cross-app notifications
To keep moderation reasonably fast, Facebook leverages automated systems that can proactively remove content – what’s left falls to a team of human moderators to comb through, a team the company recently announced would receive a sizeable expansion.
The guidelines reveal that Facebook continuously designates vulnerable groups and individuals (homeless persons and heads of state, respectively), and content targeting them is automatically censored with the help of artificial intelligence.

An excerpt of Facebook’s moderation guidelines, showing how the site defines revenge porn.

Gray areas, however, fall to the site’s moderators – for example, some images of animal abuse are permitted in order to raise awareness. Similarly, footage of users attempting self-harm or suicide is allowed on the basis that Facebook “doesn’t want to censor or punish people in distress who are attempting suicide.”
In the wake of the report, Facebook’s head of global policy management, Monika Bickert, offered an official statement to The Verge:

Keeping people on Facebook safe is the most important thing we do. Mark Zuckerberg recently announced that over the next year, we’ll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly. In addition to investing in more people, we’re also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.

The social network does seem to be in dire need of help – by the company’s own figures, it reviews more than 6.5 million reports a week pertaining to fake accounts alone, never mind the violent acts or graphic images shared throughout the site.

Another excerpt, detailing how Facebook guides moderators to deal with Holocaust nudity.

Some of the social network’s guidelines clarify what is considered a credible threat or act of violence. Several of them are as follows:

  • Threats of violence must be credible. “Someone shoot Trump” is not allowed; as a head of state, Trump is in a protected category.
  • Posts not considered credible threats include “Let’s beat up fat kids”.
  • Not all videos of violent deaths have to be deleted, as they can help create awareness of issues such as mental illness.
  • Photos of non-sexual physical abuse and bullying of children need not always be deleted, unless the content is sadistic or celebrates the abuse.
  • The same sadism and celebration restrictions apply to other forms of violence, such as animal abuse.
  • “Hand-made” art showing nudity and sexual activity is allowed, but digital art is not.
  • Abortion videos are allowed, as long as there is no nudity.
  • Livestreamed attempts at self-harm are allowed, as Facebook does not want to censor people in distress.
  • Holocaust nudity is allowed, provided images show adults in a camp in a state of extreme emaciation.

Earlier this year, Facebook grappled with the first premeditated murder carried out over Live Video, in which US citizen Steve Stephens first broadcast a statement of his intention to commit murder, then a second video of the shooting – which resulted in the death of Robert Godwin Sr – and confessed responsibility just 11 minutes later.
Read: Facebook plans to introduce a “Disputed” Tag and a “Dislike” Button
What are your thoughts? Be sure to let us know your opinion in the comments below!
Follow Bryan Smith on Twitter: @bryansmithSA
The full series on Facebook’s policy guidelines can be viewed on The Guardian.