How Facebook’s quest for profits is paved on hate and lies

Former Facebook employee Frances Haugen, in an interview on 60 Minutes in the US, explained to host Scott Pelley that the social media giant has conducted internal experiments demonstrating just how quickly and efficiently its users are driven down rabbit holes of white supremacist beliefs.

The data scientist, who resigned from Facebook earlier this year and became a whistleblower, explained how the company knows its algorithms lead users down extremist paths. Facebook, according to Haugen, created new test accounts that followed former president Donald Trump, his wife Melania Trump, Fox News and a local news outlet. 

Once those accounts simply clicked on the first suggested links that Facebook’s algorithm offered up, they were automatically shown white supremacist content. “Within a week you see QAnon; in two weeks you see things about ‘white genocide’,” Haugen said.

Haugen’s testimony and the documents she shared confirm what critics have known for a long time. “We’ve already known that hate speech, bigotry, lies about Covid, about the pandemic, about the election, about a number of other issues, are prolific across Facebook’s platforms,” Jessica González, co-chief executive of Free Press, said in an interview. However, “what we didn’t know is the extent of what Facebook knew,” she added.

Three and a half years ago, in the midst of the Trump presidency, I wrote about giving up on an older white man, related to me by marriage, who, generally speaking, has been a loving and kind parent and grandparent to his non-white relatives. This man’s hate-filled and lie-filled Facebook reposts alienated me so deeply that I cut off ties with him. In light of Haugen’s testimony, the trajectory of hate he followed makes far more sense to me now than it did in 2018. Active on Facebook, he constantly reposted memes and fake news posts that he likely didn’t seek out but was exposed to anyway.

I imagine such content resonated with some nascent sense of outrage he harboured over fears that immigrants and people of colour were taking advantage of a system that was rigged against whites by politicians like Barack Obama and Ilhan Omar. My relative fit the profile of the thousands of right-wing white Americans who mobbed the Capitol building on 6 January, 2021, egged on by a sense of outrage that Facebook helped whip up.

In fact, Haugen related that Facebook turned off its tools to stem election misinformation soon after the November 2020 election — a move that she says the company’s employees cited internally as a significant contributor to the 6 January riot in the nation’s capital. The House Select Committee investigating the riot has now invited Haugen to meet with members about Facebook’s role.

Facebook founder and chief executive Mark Zuckerberg understands exactly what Haugen blames his company for, saying in a lengthy post, “At the heart of these accusations is this idea that we prioritise profit over safety and well-being.” Of course, he maintains, “That’s just not true,” going on to call her analysis “illogical” and to insist that a “false picture of the company” is “being painted”.

Except that Haugen isn’t just sharing her opinions of the company’s motives and practices. She has a massive trove of internal documents from Facebook to back up her claims — documents that were analysed and published in an in-depth investigation in the Wall Street Journal, hardly a marginal media outlet.

The Wall Street Journal says that its “central finding” is that “Facebook knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands.”

The crux of Facebook’s defence against such accusations is that it does its best to combat misinformation while balancing the need to protect free speech, and that if it were to crack down any further, it would violate the First Amendment rights of users. In his testimony before the House of Representatives this March, Zuckerberg said, “It’s not possible to catch every piece of harmful content without infringing on people’s freedoms in a way that I don’t think that we’d be comfortable with as a society.”

In other words, the social media platform maintains that it is doing as much as it possibly can to combat hate speech, misinformation and fake news on its platform. One might imagine that this means a majority of material is being flagged and removed. 

However, Haugen maintains that while Facebook says it removes 94% of hate speech, its “internal documents say we get 3% to 5% of hate speech”. Ultimately, “Facebook makes more money when you consume more content,” she explained. And hate and rage are great motivators for keeping people engaged on the platform.

Based on what Haugen has revealed, González concluded that “Facebook had a very clear picture about the major societal harms that its platform was causing.” And, worse, the company “largely decided to do nothing to mitigate those problems, and then it proceeded to lie and mislead the US public, including members of Congress.”

González is hopeful that Haugen’s decision to become a whistleblower will have a positive impact on an issue that has stymied Congress. During Haugen’s testimony to a Senate panel on October 5, she faced largely reasonable and thoughtful questioning from lawmakers, with little of the partisan political grandstanding that has marked many hearings on social media-based misinformation. “We saw senators from both sides of the aisle asking serious questions,” González said. “It was much less of a circus than we usually see in the US Senate.”

What González hopes is that Congress passes a data privacy law that treats the protection of data gathered from users as a civil right. This is critical because Facebook makes its money from selling user data to advertisers, and González wants to see that “our personal data and the personal data of our children isn’t used to push damaging content… that doesn’t provoke hate and violence and spread massive amounts of lies.”

The calculus of Facebook’s intent is very simple. In spite of Zuckerberg’s denials, González says, “The system is built on a hate-and-lie-for-profit model, and Facebook has made a decision that it would rather make money than keep people safe.” It isn’t as though Facebook is selling hate because it has an agenda to destroy democracy. It’s just that destroying democracy is not a deal breaker when huge profits are at stake.

This article was produced by Economy for All, a project of the Independent Media Institute.
