
The big takeaways from the Facebook Papers

By Tara Subramaniam, CNN Business

Facebook is no stranger to the limelight. While the company has repeatedly come under fire over the past few years for its role in disseminating misinformation, particularly related to the 2016 election, the past two months have been especially turbulent as a whistleblower and top officials have been called to testify before Congress following the release of leaked internal research and documents.

These disclosures, made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen’s legal counsel, have shed new light on the inner workings of the tech giant. A consortium of 17 US news organizations, including CNN, has reviewed the redacted versions of the documents received by Congress. Haugen also shared some of the documents with the Wall Street Journal, which published a multi-part investigation showing that Facebook was aware of problems with its platforms.

Facebook has pushed back on Haugen’s assertions, with CEO Mark Zuckerberg even issuing a 1,300-word statement suggesting that the documents were cherry-picked to present a misleading narrative about the company.

Here are some key takeaways from the tens of thousands of pages of internal documents.

Spread of misinformation

In one SEC disclosure, Haugen alleges “Facebook misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection.”

One of the documents details a June 2019 study called “Carol’s Journey to QAnon,” designed to see what pages and groups Facebook’s algorithms would promote to an account set up to look like it was run by a 41-year-old conservative mom named Carol Smith. After “Carol” followed verified pages for conservative figures and outlets such as Donald Trump and Fox News, it took just two days for Facebook’s algorithm to recommend she follow a QAnon page.

“While this was a study of one hypothetical user, it is a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform,” a Facebook spokesperson told CNN.

Another document, entitled “Stop the Steal and Patriot Party: The Growth and Mitigation of an Adversarial Harmful Movement,” presents an analysis conducted after January 6 suggesting that Facebook could have done more to stop the spread of the “Stop the Steal” movement, which played a pivotal role in the Capitol riot.

And leaked comments from some Facebook employees on January 6 suggest the company might have had some culpability in what happened by not moving more quickly to halt the growth of Stop the Steal groups.

In response to these documents, a Facebook spokesperson told CNN, “The responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them.”

Global lack of support

Internal Facebook documents and research shared as part of Haugen’s disclosures highlight gaps in Facebook’s ability to prevent hate speech and misinformation in countries such as Myanmar, Afghanistan, India, Ethiopia and much of the Middle East, where coverage of many local languages is inadequate.

Although Facebook’s platforms support more than 100 different languages globally, a company spokesperson told CNN Business that its global content moderation teams comprise “15,000 people who review content in more than 70 languages working in more than 20 locations” around the world.

For example, in India, which represents Facebook’s largest user base, the company for several years did not have hate speech classifiers for Hindi or Bengali, two of the country’s most widely spoken languages, used collectively by more than 600 million people there. In an internal presentation on anti-Muslim hate speech, Facebook researchers wrote, “Our lack of Hindi and Bengali classifiers means much of this content is never flagged or actioned.”

A Facebook spokesperson told CNN the company added hate speech classifiers for “Hindi in 2018, Bengali in 2020 and Tamil and Urdu more recently.”

In a statement on October 23 addressing reports concerning the leaked research, Miranda Sissons, Facebook’s director of human rights policy, and Nicole Isaac, Facebook’s international strategic response director, wrote, “We have an industry-leading process for reviewing and prioritizing countries with the highest risk of offline harm and violence, every six months. When we respond to a crisis, we deploy country-specific support as needed.”

Human trafficking

Facebook has known about human traffickers using its platforms since at least 2018, but has struggled to crack down on related content, company documents reviewed by CNN show.

According to one internal report from September 2019, a Facebook investigation found that “our platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via real-world networks. … The traffickers, recruiters and facilitators from these ‘agencies’ used FB [Facebook] profiles, IG [Instagram] profiles, Pages, Messenger and WhatsApp.”

Other documents chronicled how Facebook researchers had flagged and removed Instagram accounts purporting to offer domestic workers for sale, and outlined a variety of steps the company has taken to address the problem, including removing certain hashtags. However, CNN found several similar Instagram accounts still active last week advertising domestic workers for sale. After CNN asked Facebook about the accounts, a spokesperson confirmed they violated the company’s policies. The accounts have since been removed and the posts deleted.

“We prohibit human exploitation in no uncertain terms,” Facebook spokesperson Andy Stone said. “We’ve been combatting human trafficking on our platform for many years and our goal remains to prevent anyone who seeks to exploit others from having a home on our platform.”

Inciting violence internationally

Internal documents indicate Facebook knew its existing strategies were insufficient to curb the spread of posts inciting violence in countries “at risk” of conflict, like Ethiopia.

Facebook relies on third-party fact-checking organizations to identify, review and rate potential misinformation on its platform, using an internal Facebook tool that surfaces content flagged as false or misleading by a combination of AI and human moderators.

Facebook ranks Ethiopia, where a civil war has raged for the past year, in its highest priority tier for countries at risk of conflict. However, an internal report distributed in March, entitled “Coordinated Social Harm,” said that armed groups in Ethiopia were using Facebook to incite violence against ethnic minorities in the “context of civil war.” And in a bold headline the report warned: “Current mitigation strategies are not enough.”

This is not the first time concerns have been raised about Facebook’s role in the promotion of violence and hate speech. After the United Nations criticized Facebook’s role in the Myanmar crisis in 2018, the company acknowledged that it hadn’t done enough to prevent its platform from being used to fuel bloodshed, and Zuckerberg promised to increase Facebook’s moderation efforts.

In comments made to the consortium, Haugen said, “I genuinely think there’s a lot of lives on the line — that Myanmar and Ethiopia are like the opening chapter.”

A Facebook spokesperson said the company had invested “$13 billion and have 40,000 people working on the safety and security on our platform, including 15,000 people who review content in more than 70 languages working in more than 20 locations all across the world to support our community. Our third party fact-checking program includes over 80 partners who review content in over 60 languages, and 70 of those fact checkers are outside of the US.”

Impact on teens

According to the documents, Facebook has actively worked to expand the size of its young adult audience even as internal research suggests its platforms, particularly Instagram, can have a negative effect on their mental health and well-being.

Although Facebook has previously acknowledged that young adult engagement on the Facebook app was “low and regressing further,” the company has taken steps to target that audience. In addition to a three-pronged strategy aimed at having young adults “choose Facebook as their preferred platform for connecting to the people and interests they care about,” the company pursued a variety of efforts to “resonate and win with young people.” These included “fundamental design & navigation changes to promote feeling close and entertained,” as well as continuing research to “focus on youth well-being and integrity efforts.”

However, Facebook’s internal research, first reported by the Wall Street Journal, found that the company’s platforms “make body image issues worse for 1 in 3 teen girls.” The research also found that “13.5% of teen girls on Instagram say the platform makes thoughts of ‘Suicide and Self Injury’ worse” and 17% say the platform, which Facebook owns, makes “Eating Issues” such as anorexia worse.

In a September 14 statement, Instagram’s head of public policy, Karina Newton, said the company “stand[s] by” the internal research but argued that the Wall Street Journal “focuses on a limited set of findings and casts them in a negative light.”

Algorithms fueling divisiveness

In 2018, Facebook pivoted its News Feed algorithm to focus on “meaningful social interactions.” Internal company documents reviewed by CNN reveal Facebook discovered shortly afterwards that the change led to anger and divisiveness online.

A late 2018 analysis of 14 publishers on the social network, entitled “Does Facebook reward outrage,” found that the more negative comments a Facebook post incited, the more likely users were to click the link it contained.

“The mechanics of our platform are not neutral,” one staffer wrote.

The-CNN-Wire™ & © 2021 Cable News Network, Inc., a WarnerMedia Company. All rights reserved.

CNN’s Clare Duffy, Donie O’Sullivan, Eliza Mackintosh, Rishi Iyengar and Rachel Metz contributed reporting.
