Social media giant faces criticism for doing too little to prevent extremist content as terrorists find ways of bypassing its rules

Facebook moderators identified more than 1,300 posts on the site as credible terrorist threats in a single month and face a mission impossible to control the amount of content proliferated by extremists, according to internal documents and testimony provided to the Guardian.

A document circulated to the teams tasked with policing the site says there were 1,340 credible terrorist threat escalations last August.

This means that potentially worrying extremist content or propaganda was passed to senior Facebook managers who then deleted or disabled 311 posts and accounts.

Eight of the most serious reports were evaluated by the service's internal counter-terrorism team, the document adds. It also says the information gleaned from moderators had been "a massive help on identifying new terrorist organisations/leaders".

The figures are the first insight into the number of terrorist-related reports dealt with by Facebook, which rarely reveals details about the scale of the problems it deals with every day.

Asked about the documents, Facebook contested the figures but did not elaborate. It also declined to give figures for other months.

Other files show Facebook has designated the western-backed Free Syrian Army, which is fighting to depose the president, Bashar al-Assad, as a terrorist group.

Tackling terrorist-related content is one of Facebook's priority areas. The Guardian has been told it is attempting to help control the problem by using software to intercept extremist content before it gets on the site.

This involves monitoring activity from known bad accounts and fanning out to others related to them. More than half the terrorist-related content removed by Facebook is now identified in this way.

A Facebook document on counter-terrorism. Photograph: Guardian

But one source familiar with Facebook's counter-terrorism policies said extremist groups such as Islamic State could easily circumvent moderators to distribute content across the site.

The source said the volume of material often meant moderators had less than 10 seconds to make a decision that might require intimate knowledge of a terrorist organisation and its leaders. "It's a mission impossible," the source said.

The figures for last August are included in the scores of documents seen by the Guardian that make up the Facebook Files.

They set out in unprecedented detail the way the social media company has tried to balance its commitment to free speech with growing demands for it to more aggressively challenge abuse and violence on the platform.

Read more: www.theguardian.com