CEO explains decision not to censor conspiracy theories but says the platform will try to reduce distribution of content

Mark Zuckerberg defended the rights of Facebook users to publish Holocaust denial posts, saying he didn’t “think that they’re intentionally getting it wrong”.

In an interview with Recode published on Wednesday, the CEO also explained Facebook’s decision to allow the far-right conspiracy theory website Infowars to continue using the platform, saying the social network would try to “reduce the distribution of that content”, but would not censor the page.

Zuckerberg’s comments came the same day that Facebook announced a new policy pledging to remove misinformation used to incite physical harm.

The CEO’s remarks to Recode have reignited debates about free speech on the social network at a time when Facebook is continuing to face scrutiny over its role in spreading misinformation, propaganda and hate speech across the globe.

Last year, the Guardian reported on internal Facebook moderation documents which suggested that the company flouted Holocaust denial laws except in countries where it was likely to be sued or prosecuted.

Zuckerberg, who has also come under fire for Facebook’s role in election interference efforts and the company’s misuse of personal data, reiterated his commitment to allowing abhorrent content on the platform in the latest interview.

He said Holocaust denial was “deeply offensive”, but added: “I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong … It’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly.”

Zuckerberg said offensive speech could cross a line and face removal when it harassed or endangered people: “We are moving towards the policy of misinformation that is aimed at or going to induce violence, we are going to take down … If it’s going to result in real harm, real physical harm, or if you’re attacking individuals, then that content shouldn’t be on the platform.”

Platforms like Facebook and YouTube have faced intense scrutiny for allowing the far-right commentator Alex Jones to continue to host his Infowars site, which has most infamously spread the false claim that the Sandy Hook mass shooting, in which 20 schoolchildren were killed, was a hoax.

That content, Zuckerberg said, would be removed if it was abusive towards an individual: “Going to someone who is a victim of Sandy Hook and telling them, ‘Hey, no, you’re a liar’ – that is harassment, and we actually will take that down.”

Zuckerberg also faced repeated questions about his feelings on Facebook’s influence in Myanmar, where hate speech has exploded on the platform, with some linking killings to content on the site. He did not answer directly, but said: “People use tools for good and bad, but I think that we have a clear responsibility to make sure that the good is amplified and to do everything we can to mitigate the bad.”

After publication of this story, Zuckerberg issued a statement clarifying his Holocaust remarks, saying: “I absolutely didn’t intend to defend the intent of people who deny [the Holocaust]. Our goal with fake news is not to prevent anyone from saying something untrue – but to stop fake news and misinformation spreading across our services.”

If something false were to spread, fact checkers would rate it false and the post “would lose the vast majority of its distribution” in the news feed, the statement added.

On Wednesday, Facebook told reporters at its Menlo Park headquarters that it would be taking down misinformation used to provoke physical harm, rather than just de-ranking it in the news feed.

An anti-Facebook protest in London. Photograph: Facundo Arrizabalaga/EPA

The announcement was spurred by outbreaks of anti-Muslim violence in Sri Lanka. One of the triggers for the violence was posts spreading misinformation about the Muslim community. The situation escalated to the point that the Sri Lankan government temporarily blocked Facebook’s services in March after misinformation inciting violence remained online for days after it was reported.

The company said it was working with civil society groups to better understand how misinformation fuels local tensions, but that it had yet to draw up clear criteria for what would count as incitement to violence, or to say which countries beyond Sri Lanka would be the initial focus of the policy.

“At the moment we are starting the work in countries where we’ve recently seen instances where misinformation has been perceived to contribute to physical violence offline,” explained Facebook’s Tessa Lyons.

Facebook said that last month it had removed content falsely alleging that Muslims were poisoning food intended for Buddhists after local partners indicated it could incite violence.

The policy would eventually be implemented globally, the company said.

In his interview, Zuckerberg was also asked about the Cambridge Analytica data scandal, which the Guardian first reported in 2015. In March, the Observer and the Guardian revealed that millions of Americans’ personal data was harvested from the site through an app and improperly shared with Cambridge Analytica, a political consultancy.

In the interview, the CEO noted that the Guardian “initially” alerted Facebook to the work of Aleksandr Kogan, the academic researcher who harvested the data, saying: “And when we learned about that, we immediately shut down the app, took away his profile, and demanded certification that the data was deleted.”

Facebook, however, did not suspend Kogan and the associated company until March 2018, despite the Guardian’s reporting several years earlier. A spokesperson later said that Zuckerberg had misspoken when he claimed the company “immediately … took away his profile”, admitting that the removal had not happened until this year.
