Key quotes from the Facebook Papers
By Tara Subramaniam, CNN Business
One week ago, a consortium of 17 news outlets, including CNN, began publishing a damning series of stories — collectively called “The Facebook Papers” — based on thousands of pages of internal company documents.
These documents are disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen's legal counsel. The redacted versions were obtained by the consortium.
The documents, and the stories based on them, raise concerns about the potential real-world harms from Facebook's various platforms. They also offer insight into the inner workings of the company, including its approach to misinformation and hate speech moderation, both in the US and internationally, as well as employee reactions to concerns about company decisions.
The Wall Street Journal previously published a series of stories based on internal Facebook documents leaked by Haugen, which raised concerns about the impact of Instagram on teen girls, among other issues. (The consortium’s work is based on many of the same documents.)
Facebook has repeatedly tried to discredit Haugen and said reports on the documents mischaracterize its actions. “At the heart of these stories is a premise which is false,” a Facebook spokesperson previously said in a statement to CNN. “Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie.”
Here are some key quotes from the documents provided so far.
Vaccine misinformation
In February 2021, a year into the pandemic, an internally shared Facebook research report noted a concern: "Our internal systems are not yet identifying, demoting and/or removing anti-vaccine comments often enough."
Another report a month later stated: “The aggregate risk from [vaccine hesitancy] in comments may be higher than that from posts, and yet we have under-invested in preventing hesitancy in comments compared to our investment in content.”
A Facebook spokesperson said the company has made improvements on the issues raised in the internal memos.
Human trafficking
According to one internal report from January 2020 entitled “Domestic Servitude and Labor Trafficking in the Middle East,” a Facebook investigation found the following: “Our platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks. … The traffickers, recruiters and facilitators from these ‘agencies’ used FB [Facebook] profiles, IG [Instagram] profiles, Pages, Messenger and WhatsApp.”
A Facebook spokesperson told CNN: "We prohibit human exploitation in no uncertain terms." The spokesperson added that the company has "been combatting human trafficking on our platform for many years."
The algorithm’s impact
An internal research note from April 2019 pointed out that multiple European political parties claimed Facebook’s 2018 decision to refocus its News Feed algorithm on a new metric, referred to as “meaningful social interactions,” had “changed the nature of politics. For the worse.”
Facebook told CNN the introduction of the metric wasn't a "sea change" in how the company ranked users' activity on the social network, because it had previously considered likes, comments, and shares as part of its ranking.
Gaps in international coverage
In June 2020, a Facebook employee posted a report to an internal group with about 1,500 members noting an ongoing audit into the effectiveness of its signals to detect misinformation and hate speech in at-risk countries. According to the report, the audit “found significant gaps in our coverage (especially in Myanmar & Ethiopia), showcasing that our current signals may be inadequate.”
In a public statement addressing reports concerning the leaked research, Facebook said, “We have an industry-leading process for reviewing and prioritizing countries with the highest risk of offline harm and violence, every six months. When we respond to a crisis, we deploy country-specific support as needed.”
Internal reactions to the January 6 insurrection
Commenting on a post about the Capitol insurrection written by Mike Schroepfer, Facebook’s chief technology officer, one staffer wrote: “Leadership overrides research based policy decisions to better serve people like the groups inciting violence today. Rank and file workers have done their part to identify changes to improve our platforms but have been actively held back.”
Another staffer, referencing years of controversial and questionable decision-making by Facebook leadership around political speech, concluded, "history will not judge us kindly."
In response to the documents, Facebook told CNN, “the responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them.” The company also stressed the steps it took before and after the insurrection to crack down on content related to the “Stop the Steal” movement.