On the eve of the U.S. midterm elections, Facebook released an independent report on its platform’s effect in Myanmar.
The report into Facebook’s impact on human rights within the country was commissioned by the social media giant, but completed by non-profit organization BSR (Business for Social Responsibility).
And it affirms what many have suspected: Facebook didn’t do enough to prevent violence and division in Myanmar.
“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more,” Facebook’s product policy manager Alex Warofka wrote in a statement.
For the southeast Asian country’s 20 million citizens who are online, Facebook is the internet. The report notes that digital literacy is low and that many people find it “difficult to verify or differentiate content (for example, real news from misinformation).”
While Facebook has “substantially increased opportunities for freedom of expression” for the country’s citizens, it has also been a “useful platform” for people seeking to incite violence and cause offline harm.
“A minority of users is seeking to use Facebook as a platform to undermine democracy and incite offline violence, including serious crimes under international law; for example, the Report of the Independent International Fact-Finding Mission on Myanmar describes how Facebook has been used by bad actors to spread anti-Muslim, anti-Rohingya, and anti-activist sentiment,” the report states.
In August, Facebook removed pages and groups belonging to military officials who were using the platform to incite violence and ethnic cleansing of Rohingya Muslims.
The report notes that this action could complicate any effort by the company to place staff in the country; it currently has none there.
“Facebook’s action against senior military officials in August 2018 also increased the risks associated with locating Facebook staff in Myanmar, at least in the near term, and it is unclear whether Facebook could have acted against the military if Facebook staff had been present in Myanmar,” the report reads.
The recommendations include that Facebook create a stand-alone human rights policy, improve its enforcement of Community Standards (especially in relation to credible violence), and preserve and share data that can be used to evaluate human rights violations.
Facebook has already begun acting on these recommendations, but with Myanmar’s elections in 2020 — expected to be a flashpoint for hate speech and harassment — time is running short.
Report is a good start, but more data needed
Aim Sinpeng, a digital politics expert from the University of Sydney, said the report was commendable in opening up the conversation on Facebook’s impact on human rights.
But the social network could grant researchers more access to its data. Facebook has restricted that access since the Cambridge Analytica scandal, making this kind of research difficult to carry out.
“If Facebook were to be serious about understanding its platform’s impact on human rights in countries like Myanmar, it should either allow proper research on Facebook data and how it might contribute to online hate speech and rights abuse,” Sinpeng explained via email.
“The same openness could go a long way with local NGOs but I suspect that Facebook will be hesitant to share its data even for the purpose of research or a watchdog and wants to keep such efforts in-house, away from the public eye and selectively releases results as it sees fit.”
In August, Facebook released some data for the first time on the detection and removal of hate speech in Myanmar, revealing that it had proactively identified about 52 percent of the content it removed from the platform.
Sinpeng added that the report’s recommendation of funding local organizations to support Facebook in maintaining its Community Standards is particularly important, given that the country is an important growth market and is at high risk of social media-fuelled conflict.
“There are already NGOs on this ground working to police hate speech and misinformation online and despite several years of pressuring Facebook to do something, it has not until recently,” she said.
“Facebook needs to decide whether it will commit to further both growth and integral use of Facebook in Myanmar or not.”