Facebook researchers previously proposed studying whether features were ‘addictive,’ documents show
By Clare Duffy, CNN
New York (CNN) — Meta executives testified in a Los Angeles courtroom earlier this month that while use of the company’s platforms could become problematic, it couldn’t be considered addiction. But inside the company, researchers have sought to study whether certain Facebook features could contribute to “addiction” or “addiction-like” behaviors among users.
That’s according to new internal documents released in legal filings in another lawsuit against Meta.
The files, released on Friday, raise new questions about what Meta has known about the risks of its platforms, especially to young people — a question at the heart of the legal battle currently underway against the company. Some of the features the company’s researchers raised questions about, including autoplay and endlessly scrolling feeds, are among the same features that lawsuits now claim contribute to youth addiction and harm.
Employees at the company, then known as Facebook, proposed a public audit of design features that might contribute to compulsive use of the platform in the fall of 2018, citing growing public concern that tech companies were intentionally manipulating users, according to the documents.
They suggested working with outside researchers to lend expertise and credibility to the effort. One suggested expert was Tristan Harris, who at the time had recently left his job as an ethicist at Google to found the Center for Humane Technology to address concerns about social media and smartphone addiction. But the documents show employees expressing concerns that Harris might suggest changes to Facebook that the company’s product teams wouldn’t be willing to make.
The researchers hypothesized that certain features could “promote frequent, automatic, undesired behaviors” that build habits users may not want or intend.
“These may lead to feelings of being manipulated, feeling lack of control related to certain behaviors, and feelings of dependence on checking or going on Facebook that could be related to lower well-being — and could be fueling the subjective, colloquial experiences of feeling ‘addicted’ to Facebook,” they wrote.
The documents were released as evidence in a lawsuit brought by hundreds of school districts and attorneys general from across the United States against Meta, Snap, TikTok and YouTube-parent Google in federal court in the Northern District of California. The case is set to go to trial later this year.
It will follow the conclusion of the social media addiction case against Meta and YouTube currently on trial in Los Angeles, the first of more than 1,500 lawsuits brought by individuals against the companies. Meta has denied the lawsuits’ claims.
Parents and safety advocates have for years raised concerns that social media platforms are designed to get users hooked and keep them scrolling for as long as possible to serve them more ads. Meta CEO Mark Zuckerberg objected to this claim in his testimony in the LA trial last week, saying the company was focused on maximizing “value” to users.
Meta never carried out the proposed audit, although Meta spokesperson Liza Crenshaw said it has conducted other research on the topic that informed design changes such as parental control tools and default teen safety settings that were introduced in recent years.
“We’ve intentionally designed automatic defaults like Sleep Mode that encourage teens to leave the app and pause notifications overnight,” Crenshaw said in a statement. “Parents can go even further by restricting their teens’ total time to as little as 15 minutes a day or setting scheduled breaks when teens are required to exit our apps.”
Friday’s release is just the latest tranche of internal documents to be made public in the Northern California case.
Documents released earlier also showed Meta researchers in an internal chat raising concerns about compulsive use, saying, “IG (Instagram) is a drug… we’re basically pushers.” Internal documents from the other tech companies similarly suggest the firms were aware that their apps could harm teens. The companies said at the time the documents paint a misleading picture of their platforms and safety efforts.
‘What are the real people problems in this space?’
Meta and other tech firms have long argued publicly that there is no conclusive evidence linking social media to addiction or other mental health challenges.
“I think it’s important to differentiate between clinical addiction and between problematic use, so, using something more than you feel good about,” Instagram head Adam Mosseri testified in the Los Angeles case earlier this month.
“Sometimes we use the word ‘addiction’ to refer to things more casually,” he said. “I’m sure I’ve said that I’ve been addicted to a Netflix show, you know, when I binged it really late one night, but I don’t think it’s the same thing as clinical addiction.”
But the 2018 document suggests the company’s researchers believed certain Facebook features contributed to repeated use that made users feel worse, or like they had little control of their behavior. In the proposal, researchers also suggested expanding the study to Instagram.
“Given that currently there is NO medically defined FB addiction disorder - what are the real people problems in this space?” they wrote. “The well-being team has reframed the addiction narrative to focus on the ways that FB may be contributing to use patterns that people find difficult to control despite negative impacts on their lives, and to identify and correct those contributing factors.”
The researchers expressed a desire to identify and change problematic features — such as video autoplay, “like” count notifications and “endless scroll” — and noted that the platform should only promote “frequent” behaviors that also provide real value to users.
A separate document described “comms considerations” around the proposed audit, including the opportunity to address “extreme” claims in the media that Facebook was “sprinkling behavioral cocaine” all over its products and to “preempt any regulations.” The document notes the audit could improve user well-being, but said teams should consider possible trade-offs including a loss of engagement.
The company did not carry out the proposed audit. Harris, at the Center for Humane Technology, did not immediately respond to a request for comment about whether he had discussions with the company about the study.
But Meta’s Crenshaw said the company’s researchers continued to research users’ potentially negative experiences on its platforms with the intention of improving them. She said Meta researchers have also met with other academics working in the space, including United Kingdom psychologists and digital psychology researchers Daria Kuss and Mark Griffiths, who were mentioned in the 2018 proposal along with Harris.
Several months after the proposal, in May 2019, Meta publicly released a separate study conducted by internal researchers entitled “Understanding Perceptions of Problematic Facebook Use.”
The 2019 public study, which involved a survey of 20,000 Facebook users, found that around 3% of US Facebook users experienced “problematic use,” defined as feeling a lack of control over their use and experiencing trouble with sleep, work or relationships because of the platform. Problematic use was highest among teens and young adults. The researchers concluded that Facebook should make it easier for people to take a break from the platform and consider reducing the frequency of notifications, especially for teens.
In 2021, Meta rolled out “take a break” reminders for teens on Instagram. It added parental control tools — including the option to set a time limit for scrolling — the following year. In 2024, Meta compiled many of its teen safety measures into “Teen Accounts,” which implements default privacy and safety settings for teens such as pausing notifications overnight.
The-CNN-Wire™ & © 2026 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
