The release of a cache of training guidelines raises important questions over whether Facebook's users are comfortable with the lines the company has drawn

Facebook allows people to live-stream their suicide attempts as long as they are engaging with viewers, but will remove the footage once there's no longer an opportunity to help the person. Pledges to kill oneself made through hashtags or emoticons, or those that specify a fixed date more than five days in the future, shouldn't be treated as a high priority.

These are tiny snippets from a cache of training materials that Facebook content moderators need to absorb, in just two weeks, before policing the world's largest social network.

The guidelines also require moderators to learn the names and faces of more than 600 terrorist leaders, decide when a beheading video is newsworthy or celebratory, and allow Holocaust denial in all but four of the 16 countries where it's illegal: those where Facebook risks being sued or blocked for flouting local law.

The documents detail what is and is not permitted on the platform, covering graphic violence, bullying, hate speech, sexual content, terrorism and self-harm. For the first time, the public has a glimpse of the thought process behind some of the company's editorial judgments, one that goes beyond the vague wording of its community standards or statements made in the wake of a live-streamed murder.

"This might be the most important editorial guide sheet the world has ever created. It's surprising it's not even longer," said Carl Miller, research director at the Centre for the Analysis of Social Media at the London-based thinktank Demos. "It's come out of a mangle of thousands of different conversations, pressures and calls for change that Facebook gets from governments around the world."

It is clear that Facebook has an unprecedented challenge on its hands. The platform has inadvertently become the world's largest media organization, with nearly 2bn readers and contributors encompassing the full spectrum of humanity's capacity to entertain, sadden, bore, horrify and disgust.

In order to provide simple instructions to moderators, the documents highlight specific visceral examples. And it's not pretty.

Footage of animal abuse is allowed but must be marked as "disturbing" if it shows, among other things, dismemberment or visible innards. Images of physical child abuse are acceptable unless shared with sadism and celebration. Comments such as "Irish are stupid" are removed, while moderators are told to ignore "Blonde women are stupid". A picture of a child who appears to have Down syndrome captioned "I can count to potato" does not have to be deleted.

The files explain that people use violent language to express frustration online without stopping to think about the consequences. This is because they feel indifferent towards their target, owing to the lack of empathy created by communicating via devices rather than face to face: a neat description of the so-called online disinhibition effect.

This appears to contradict much of Facebook CEO Mark Zuckerberg's 5,700-word manifesto, published in February, which placed heavy emphasis on the social network fostering human connections. It is jarring to see so many examples of human cruelty and depravity laid bare, but they raise important questions over whether Facebook's users are comfortable with the lines the company has drawn.

Either way, Facebook cannot win.

On one hand, it is expected to clamp down on terrorist recruitment, glorified violence and live-streamed crime, while on the other it is accused of overzealous censorship and collaboration with oppressive regimes. "This is a terrible bind," Miller said. "They found themselves with all these responsibilities and power they never anticipated getting and would rather do without."
