Facebook papers: what Muslims need to know
A trove of internal documents released by a whistleblower is painting a damaging picture of Facebook. Frances Haugen said she copied more than 10,000 pages of internal Facebook documents to shed light on the company's inner workings.
News organisations have started publishing stories based on those documents. We have trawled through the press reports to find the three things we think everyone - but especially Muslims - should know about the Facebook papers.
Facebook suppresses Arabic content but fails to curb hate speech elsewhere
Facebook has not only failed to build systems that can catch harmful content outside the US, but has also disproportionately penalised users communicating in Arabic. This is claimed to stem mainly from translation problems and a lack of local cultural knowledge.
According to NBC, this has meant hate speech and calls to violence against religious minorities and ethnic groups, such as the Muslim communities in Myanmar and Sri Lanka, have gone unchecked.
In contrast, in Syria and the occupied Palestinian territories, the tech giant has suppressed ordinary free speech and imposed bans on common Arabic words, the AP reports. For example, Instagram briefly banned the hashtag #AlAqsa during the Israeli attack on Gaza in May.
The AP says the documents revealed that the company had been 'incorrectly enforcing counterterrorism content in Arabic', which was 'impeding' freedom of expression for people throughout the Middle East.
The company failed to tackle Islamophobia in India
In India — Facebook’s largest market — the company is not stopping anti-Muslim hate speech and misinformation, according to Al Jazeera. Many critics say Facebook fails to act, especially in cases linked to Prime Minister Narendra Modi’s ruling Bharatiya Janata Party.
The Wall Street Journal adds that Hindu nationalist groups with ties to the Indian prime minister and his party have not been banned despite spreading anti-Muslim content and incitement to violence.
The company's internal research confirmed the problem, but it was not tackled, says The New York Times.
Facebook allowed right-wing outlets to push misinformation
The Financial Times reports that Facebook executives let celebrities and mostly right-wing politicians skirt the platform’s rules despite objections from employees. It said the company was influenced by accusations of bias from conservatives.
According to The Wall Street Journal, Facebook gives high-performing right-wing publishers special treatment, which allows them to avoid punishment for misinformation.
The social network has long known that its algorithms and recommendation systems push some users to extremes, says NBC. Researchers employed by the company produced a report - "Carol's Journey to QAnon" - which showed how Facebook's algorithms can push people into extremism.
The firm last week released a statement to get ahead of the press reports. "A curated selection out of millions of documents at Facebook can in no way be used to draw fair conclusions about us," it said in a tweet thread.
"Internally, we share work in progress and debate options. Not every suggestion stands up to the scrutiny we must apply to decisions affecting so many people," it added.