- MaryGrace Lerin
TikTok content moderator claims in lawsuit that job caused her to suffer PTSD
A content moderator is suing TikTok and its parent company, stating that she has suffered "psychological trauma" as a result of their failure to apply industry-standard safety procedures.
Candie Frazier, a Telus International contractor, reviews TikTok posts that users have flagged as inappropriate; the posts are edited or deleted if moderators like Frazier conclude that they violate the company's terms of service.
According to her attorney, Steve Williams, Frazier sees "horrific stuff constantly." The case, which seeks class-action status, was filed in U.S. District Court for the Central District of California on Thursday.

According to the lawsuit, she is exposed to postings that include "child sexual assault, rape, torture, bestiality, beheadings, suicide, and murder," as well as conspiracy theories, distortions of historical fact, and political disinformation.
Frazier has post-traumatic stress disorder, according to the lawsuit, "as a result of frequent and uncontrolled exposure to highly toxic and exceedingly upsetting imagery at work."
The suit accuses TikTok of failing to warn Frazier that watching such posts "may have a severe negative mental health impact on content moderators."
The posts "may include graphic, violent, explicit, political, profane, and otherwise disturbing content," according to a Telus International job description for a content moderator, which is published online.
"Sound coping, emotion regulation, and stress-management skills" are listed as requirements in the job description. When Frazier applied for the job, it's unclear if the job description was available.
Telus International has "a robust resiliency and mental health program in place to support all of our team members, as well as a comprehensive benefits program for access to personal health and well-being services," according to a spokesperson for the company, which is not named as a defendant in the lawsuit.
"Team members can express questions and concerns regarding any area of their job through multiple internal channels," the spokeswoman stated.
In a statement, the spokesperson noted that Frazier "has never before addressed any concerns about her work environment, and her assertions are utterly incongruous with our policies and practices."
According to the lawsuit, content moderators at TikTok and its parent company, ByteDance, are more likely to develop PTSD as a result of the companies' failure to establish "workplace safety measures."
"The claim is to say, 'I want to do my job.' "All I want to do is do my job safely," Williams stated. "It's just like any other dangerous work."
While he admitted that the safety measures are not mandated by law, the complaint claims that "industry-recognized standards" exist.
Other companies and nonprofit organizations follow protocols such as restricting content moderators' shifts to four hours. According to the lawsuit, Frazier works 12-hour days, with two 15-minute breaks and an hour for lunch.
The Technology Coalition, of which ByteDance is a member, also suggests that content moderators receive therapy and be given the option of not viewing images of child sexual abuse.
According to the suit, the coalition, which includes Facebook, YouTube, Snap Inc., and Google, says companies "must support those employees who are on the front lines of this war."
The National Center for Missing and Exploited Children urges companies to reduce the impact of unsettling imagery on employees by rendering it in black and white, blurring portions of videos, lowering video resolution, and muting audio.
"Despite being members of the Technology Coalition, ByteDance and TikTok have failed to apply the standards proposed by the Technology Coalition," it says.
TikTok does not comment on ongoing lawsuits, a spokesperson said, but "we endeavor to establish a compassionate working environment for our workers and contractors."
"Our Safety team collaborates with third-party firms to help secure the TikTok platform and community, and we continue to expand on a range of wellness programs to ensure that moderators are supported psychologically and emotionally," the spokesperson said in a statement.