Facebook admits site appears hardwired for misinformation, memo reveals

Papers reveal struggle to tackle hate speech and reluctance to censor rightwing US news organisations

Last modified on Mon 25 Oct 2021 10.10 EDT

Facebook has admitted core parts of its platform appear hardwired for spreading misinformation and divisive content, according to a fresh wave of internal documents that showed the social media company struggled to contain hate speech in the developing world and was reluctant to censor rightwing US news organisations.

An internal memo warned Facebook’s “core product mechanics”, or its basic workings, had let hate speech and misinformation grow on the platform. The memo added that the basic functions of Facebook were “not neutral”.

“We also have compelling evidence that our core product mechanics, such as virality, recommendations, and optimizing for engagement, are a significant part of why these types of speech flourish on the platform,” said the 2019 memo.

Referring to Facebook’s safety unit, the document added: “If integrity takes a hands-off stance for these problems, whether for technical (precision) or philosophical reasons, then the net result is that Facebook, taken as a whole, will be actively (if not necessarily consciously) promoting these types of activities. The mechanics of our platform are not neutral.”

The document was disclosed by the New York Times on Monday as part of a wave of stories by a US-led consortium of news organisations. The NYT stories, and others, were based on disclosures made to the Securities and Exchange Commission – the US financial watchdog – and provided to Congress in redacted form by the legal counsel of Frances Haugen, the former Facebook employee turned whistleblower.

The documents have also been obtained by the Wall Street Journal, which since last month has published a series of damaging exposés about Facebook.

Other stories released on Monday as part of the Facebook Papers referred to Facebook’s inability to tackle hate speech and harmful content outside the US. Incitement to hatred and disinformation are substantially worse among non-English-speaking users, according to multiple reports by the Facebook Papers partners. Much of Facebook’s moderation infrastructure is under-resourced for languages other than English, and its software struggles to understand certain dialects of Arabic, the Associated Press (AP) reported.

The company’s algorithmic moderation software identified only 0.2% of harmful material in Afghanistan, according to an internal report carried out earlier this year that was reported by Politico. The remainder of the harmful material had to be flagged by staff, even though the company lacked moderators who could speak Pashto or Dari, the country’s principal languages. Tools for reporting harmful material were available only in English, a language not widely spoken in Afghanistan.

According to AP, two years ago Apple threatened to remove Facebook and Instagram from its app store over concerns the platforms were being used to trade in domestic servants, a sector that is high-risk for abuse and human slavery. The threat was dropped after Facebook shared details of its attempts to tackle the problem.

Elsewhere in the papers, a document seen by the Financial Times showed a Facebook employee claiming that the company’s public policy team blocked decisions to take down posts “when they see that they could harm powerful political actors”. The memo said moves to take down content posted by repeat offenders against Facebook’s guidelines, such as rightwing publishers, were often reversed because the publishers might retaliate.

“In the US it appears that interventions have been almost exclusively on behalf of conservative publishers,” said the memo, referring to companies such as Breitbart and PragerU.

A Facebook spokesperson said: “At the heart of these stories is a premise which is false. Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie. The truth is we’ve invested $13bn and have over 40,000 people to do one job: keep people safe on Facebook.”
