TikTok sued by content moderator who claims she developed PTSD from reviewing disturbing content

By Clare Duffy and Jennifer Korn, CNN

A content moderator for TikTok is suing the social media platform after she says she developed psychological trauma as a result of her job, which she alleges required her to review videos featuring graphic violence, conspiracy theories and other disturbing imagery.

Candie Frazier, a Las Vegas-based contractor for TikTok parent company ByteDance, alleges that she and other content moderators often spend 12-hour days reviewing disturbing content. She claims that TikTok and ByteDance fail to provide adequate protections and psychological support for content moderators, according to the complaint.

“Plaintiff Frazier views videos of the genocide in Myanmar, mass shootings, children being raped, and animals being mutilated,” the complaint states. “As a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace, Ms. Frazier has developed and suffers from significant psychological trauma including anxiety, depression, and posttraumatic stress disorder.”

The proposed class action lawsuit, filed last week in federal court in California, is likely to heighten scrutiny of problematic content and moderation practices at TikTok. The short-form video platform had previously flown under the radar compared to larger rivals such as Facebook and YouTube, but it has gained attention in recent months from critics and lawmakers after exploding in popularity, especially among young people, during the pandemic. The company said in September that it had reached 1 billion monthly users.

A TikTok spokesperson said the company does not comment on ongoing litigation.

“We strive to promote a caring working environment for our employees and contractors,” the spokesperson said. “Our Safety team partners with third party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally.”

Frazier is not a TikTok or ByteDance employee; rather, she works for a Canadian company called Telus International, which contracts out content moderation workers to TikTok and other social media platforms. But Frazier alleges in the suit that her work is dictated and supervised by TikTok and ByteDance. A spokesperson for Telus, which is not named as a party in the lawsuit, said that Frazier had never previously raised concerns about her work and “her allegations are entirely inconsistent with our policies and practices.”

“We have a robust resiliency and mental health program in place to support all our team members, as well as a comprehensive benefits program for access to personal health and well-being services,” the Telus spokesperson said. “Our team members can elevate questions and concerns about any aspect of their job through several internal channels, all of which the company takes very seriously.”

Facebook faced a similar lawsuit in 2018 from a content moderator who said she developed PTSD after being exposed to content featuring rape, suicide and violence on the job. Among the criticisms Facebook faced for its content moderation practices was that moderation contractors did not reap the same benefits as corporate employees, despite being tasked with such a taxing job. The social media giant ultimately agreed to a $52 million class action settlement, which involved payments and funding for mental health treatment for content moderators, as well as workplace changes.

A TikTok executive testified on Capitol Hill for the first time in October and acknowledged the need to increase protections for young users on the platform. “We’re seeking to earn trust through a higher level of action, transparency and accountability, as well as the humility to learn and improve,” TikTok VP and head of public policy Michael Beckerman told a Senate subcommittee. But Frazier’s lawsuit may point to the challenges of improving such protections.

The complaint alleges that problematic content is only reviewed by moderators after it is uploaded to the platform if a user reports it. Because of the volume of content they are tasked with, moderators have only 25 seconds to review each video and view “three to ten videos at the same time,” it states. (TikTok did not immediately respond to a request for comment regarding these allegations.)

“These videos include animal cruelty, torture, suicides, child abuse, murder, beheadings, and other graphic content,” according to the complaint. “The videos are each sent to two content moderators, who review the videos and determine if the video should remain on the platform, be removed from the platform, or have its audio muted.”

Theo Bertram, then TikTok’s director of public policy for Europe, the Middle East and Africa, told British lawmakers in September 2020 that the company had 10,000 people working on its “trust and safety” team worldwide. TikTok earlier this year also launched an automated moderation system to scan and remove videos that violate its policies “upon upload,” although the feature is only available for certain content categories.

The system handles “content categories where our technology has the highest degree of accuracy, starting with violations of our policies on minor safety, adult nudity and sexual activities, violent and graphic content, and illegal activities and regulated goods,” a July blog post from TikTok’s Head of US Safety, Eric Han, reads. “We hope this update also supports resiliency within our Safety team by reducing the volume of distressing videos moderators view and enabling them to spend more time in highly contextual and nuanced areas.”

TikTok says that 93% of the violating videos removed between April and June 2021 were taken down within 24 hours of being posted, and that the majority of those had zero views and were flagged by its automated system rather than being reported by a user, according to a Community Guidelines Enforcement Report released in October. (TikTok did not comment on Frazier’s allegation that content moderators only review videos after they are reported by a user.)

Frazier also alleges that content moderators are required to sign non-disclosure agreements which “exacerbate the harm” caused by the work, according to the complaint. The practice of requiring workers to sign non-disclosure agreements has come under fire in the technology industry recently amid employee disputes at Pinterest, Apple and other big tech firms. TikTok did not immediately respond to a request for comment regarding its NDA practices.

With the lawsuit, Frazier seeks to have TikTok pay damages (in an amount that would be determined later) to herself and other content moderators, and to develop a “medical monitoring fund” to pay for screening, diagnosis and treatment of psychological issues for such workers, according to the complaint.

The-CNN-Wire
™ & © 2022 Cable News Network, Inc., a WarnerMedia Company. All rights reserved.