TikTok Content Moderator Lawsuit
UNITED STATES DISTRICT COURT
NORTHERN DISTRICT OF CALIFORNIA
COMPLAINT AND DEMAND FOR JURY TRIAL
Plaintiffs Reece Young and Ashley Velez, on behalf of themselves and all others similarly situated, bring this Class Action Complaint against Defendants ByteDance Inc. (“ByteDance”) and TikTok Inc. (“TikTok”) (collectively, “Defendants”) for negligence, negligent exercise of retained control, and violations of the California Unfair Competition Law (“UCL”), Cal. Bus. & Prof. Code § 17200 et seq., demanding a trial by jury on all claims for which a jury is permitted. Plaintiffs make the following allegations based on personal knowledge as to the facts pertaining to themselves and upon information and belief, including the investigation of counsel, as to all other matters.
INTRODUCTION
1. TikTok is a social media application that allows users to create and share short videos. TikTok is owned by ByteDance. Plaintiffs are former content moderators for TikTok who seek remedies as a result of Defendants’ failure to comport with applicable standards of care in the conduct of their business, specifically in regard to the increased risks of psychological trauma and trauma-related disorders resulting from exposure to graphic and objectionable content on ByteDance’s TikTok application (“App”). Defendants have failed to provide a safe workplace for the thousands of contractors who are the gatekeepers between the unfiltered, disgusting, and offensive content uploaded to the App and the hundreds of millions of people who use the App every day.
2. Each day, App users upload millions of videos, recently reported to be as many as 90 million a day. Many of these uploads include graphic and objectionable content including child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder. In the second quarter of 2021 alone, TikTok removed over 81 million videos that violated its rules. TikTok relies on people like Plaintiffs to work as content moderators, viewing videos and removing those that violate Defendants’ terms of use. Content moderators have the job of trying to prevent posts containing graphic violence or other objectionable content from reaching the App’s users.
3. Plaintiff Ashley Velez worked as a content moderator for TikTok. She was hired by Telus International (“Telus”), which provides content moderators for TikTok, a popular app owned by ByteDance.
4. Plaintiff Reece Young worked as a content moderator for TikTok. She was hired by Atrium Staffing Services Ltd. (“Atrium”) to perform content moderation for TikTok.
5. Although Plaintiffs ostensibly worked for different companies, they performed the same tasks, in the same way, using applications provided by Defendants. They had to meet quotas set by Defendants.
6. While working at the direction of ByteDance, Plaintiffs and other content moderators witnessed many acts of extreme and graphic violence as described above. As just a few examples, Plaintiff Young saw a thirteen-year-old child being executed by cartel members, bestiality, and other distressing images. Plaintiff Velez saw bestiality and necrophilia, violence against children, and other distressing imagery. Content moderators like Plaintiffs spend twelve-hour days reviewing and moderating such videos to prevent disturbing content from reaching TikTok’s users.
7. Content moderators also face repeated exposure to conspiracy theories, including but not limited to suggestions that the COVID-19 pandemic is a fraud; distortions of historical facts, such as Holocaust denial; “challenges” that involve high-risk behavior; fringe beliefs; hate speech; and political disinformation about census participation, candidate citizenship status or eligibility for public office, and manipulated videos of elected officials. This type of content can cause traumatic reactions.
8. As a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace, Plaintiffs have suffered immense stress and psychological harm. Because they were not provided adequate prophylactic measures, Plaintiffs have had to seek counseling on their own time, and through their own effort, to address the content they were exposed to while providing content moderation services for TikTok.
9. Defendants are aware of the negative psychological effects that viewing graphic and objectionable content has on content moderators. Despite this knowledge, Defendants fail to implement acknowledged best practices to mitigate those effects.
10. To the contrary, Defendants impose productivity standards and quotas on their content moderators.
11. By requiring content moderators to review high volumes of graphic and objectionable content, Defendants require content moderators to engage in abnormally dangerous activities. By failing
to implement acknowledged best practices to mitigate risks necessarily caused by such work, TikTok exacerbates the harm caused to content moderators by forcing them to keep inside the horrific things they see while moderating content for the App.
12. Without this Court’s intervention, Defendants will continue to breach the duties they owe to content moderators.
13. On behalf of themselves and all others similarly situated, Plaintiffs bring this action (1) to compensate content moderators who were exposed to graphic and objectionable content on behalf of TikTok; (2) to ensure that Defendants provide content moderators with tools, systems, and mandatory ongoing mental health support to mitigate the harm reviewing graphic and objectionable content can cause; and (3) to provide mental health screening and treatment to the thousands of current and former content moderators affected by Defendants’ conduct.
JURISDICTION AND VENUE
14. This Court has subject matter jurisdiction over this action pursuant to 28 U.S.C. §§ 1332(d) and 1367 because: (i) this is a class action in which the matter in controversy exceeds the sum of $5,000,000, exclusive of interest and costs; (ii) there are 100 or more class members; and (iii) some members of the class, including Plaintiffs, are citizens of states different from some Defendants.
15. This Court has personal jurisdiction over Defendants because: (i) they transact business in the United States, including in this District; (ii) they have substantial aggregate contacts with the United States, including in this District; (iii) they engaged and are engaging in conduct that has and had a direct, substantial, reasonably foreseeable, and intended effect of causing injury to persons throughout the United States, including in this District, and purposely availed themselves of the laws of the United States. Defendant ByteDance is headquartered in this District and regularly conducts business here.
16. Venue is proper in this District pursuant to 28 U.S.C. § 1391(b), (c), and (d), because a substantial part of the events giving rise to Plaintiffs’ claims occurred in this District, a substantial portion of the affected interstate trade and commerce was carried out in this District, and one or more of the Defendants reside in this District or are licensed to do business in this District. Defendant ByteDance transacts business, maintains substantial contacts, and committed tortious acts in this
District, causing injury to persons residing in, located in, or doing business throughout the United States. Defendant ByteDance is headquartered in Mountain View, in this District, and conducts substantial business activities here. Plaintiffs and the proposed class have been, and continue to be, injured as a result of Defendant ByteDance’s illegal conduct in the Northern District of California.
PARTIES
17. For a period of months starting in 2021, Plaintiff Young worked as a content moderator reviewing content for ByteDance remotely.
18. Plaintiff Ashley Velez is a resident of Las Vegas, Nevada. From May 2021 to November 2021, Plaintiff Velez worked as a content moderator reviewing content for ByteDance remotely.
19. Defendant ByteDance is, and at all relevant times was, a Delaware corporation with its principal place of business in Mountain View, California.
20. Defendant TikTok is, and at all relevant times was, a California corporation with its principal place of business at 5800 Bristol Pkwy, Culver City, Los Angeles County, California. Defendant TikTok also maintains offices in Palo Alto, California and Mountain View, California.
21. In fiscal year 2020, ByteDance made approximately $34.3 billion in advertising revenue. In 2019, that number was $17 billion, and in 2018 that number was $7.4 billion. ByteDance accomplished this in large part due to the popularity of its App. TikTok is a booming social media platform.
22. TikTok is attractive to companies and individuals that want to buy advertisements because of its immense user base. TikTok has over 130 million active users. These users value TikTok for its plethora of content and ability to share information. Further, TikTok is immensely popular with younger users.
23. According to a November 5, 2019, article in The Washington Post, “[t]he short-video app has become a global phenomenon and has taken young American audiences by storm, blending silly
jokes, stunts and personal stories into a tech powerhouse downloaded more than 1.3 billion times worldwide.”
24. To generate this content, ByteDance relies on users to upload videos to its platform. TikTok users spend almost an hour on average a day on the App, with younger individuals spending even more time.
25. The amount of content on TikTok is massive, with TikTok having more than a billion videos viewed on its platform each day and millions of active users.
FACTUAL ALLEGATIONS
A. Content moderators watch and remove depraved images on the internet so that users do not have to.
26. Content moderation is the job of removing online material that violates the terms of use of the platform on which it appears.
27. Defendants rely on users to report inappropriate content. Defendants receive millions of user reports of potentially objectionable content on the App. These videos are then flagged for review by content moderators to determine if the content violates Defendants’ policies. Human moderators review the reported content – sometimes thousands of videos and images every shift – and remove those that violate Defendants’ policies.
28. Human moderators are necessary to TikTok’s monitoring of posted content. In the second quarter of 2021, 81,518,334 videos were removed from the App. The vast majority of these were
29. Upon receiving a report from a user about inappropriate content, Defendants send that video to the content moderators. The videos that the content moderators review often include animal cruelty, torture, suicides, child abuse, murder, beheadings, and other graphic content. The videos are each sent to two content moderators, who review the videos and determine if the video should remain on the platform, be removed from the platform, or have its audio muted.
30. In September 2020, Theo Bertram, TikTok’s Director of Government Relations and Public Policy, told British politicians that TikTok has over 10,000 content moderators worldwide.
31. Defendants require content moderators to review very large volumes of potentially rule-breaking posts per week through their proprietary review software. Due to the sheer volume of content, content moderators usually have less than 25 seconds per video, and often view multiple videos at the same time.
32. All of this work is done through Defendants’ proprietary TCS software, which each content moderator logs into each day. It is through TCS that videos are sent to the content moderators, and the content moderators use TCS to take whatever action is appropriate in regard to those videos. TCS is also used by Defendants to constantly monitor the content moderators. The TCS software allows Defendants to watch everything that the content moderators do while they are logged in, and also allows Defendants to determine exactly how long a content moderator is logged out during lunch or breaks. Both Plaintiffs used this software to do their jobs reviewing TikTok content.
33. Defendants recognize the dangers of exposing users to images and videos of graphic and distressing content. TikTok has announced policy changes, described at https://newsroom.tiktok.com/en-us/refreshing-our-policies-to-support-community-well-being, to foster well-being on its platform and to address distressing content like suicide and self-harm.
34. It is well known that exposure to images of graphic violence can cause debilitating injuries, including Post Traumatic Stress Disorder (“PTSD”), anxiety, and depression.
35. Whereas viewing or hearing about another person’s traumatic event used to be considered “secondary traumatic stress,” the Diagnostic and Statistical Manual of Mental Disorders (American Psychiatric Association, 5th ed. 2013) (“DSM-5”) recognizes that secondary or indirect exposure to trauma, such as repeated or extreme exposure to aversive details of trauma through work-related media, can itself give rise to PTSD.
36. In a study conducted by the National Crime Squad in the United Kingdom, seventy-six percent of law enforcement officers surveyed reported feeling emotional distress in response to exposure to child abuse on the internet. The same study, which was co-sponsored by the United Kingdom’s Association of Chief Police Officers, recommended that law enforcement agencies
implement employee support programs to help officers manage the traumatic effects of exposure to child pornography.
37. In a study of 600 employees of the Department of Justice’s Internet Crimes Against Children task force, the U.S. Marshals Service found that a quarter of the cybercrime investigators surveyed displayed symptoms of traumatic stress.
38. Another study of cybercrime investigators from 2010 found that “greater exposure to disturbing media was related to higher levels of . . . secondary traumatic stress” and that “substantial
39. The Eyewitness Media Hub has also studied the effects of viewing videos of graphic violence, including suicide bombing, and found that “40 percent of survey respondents said that viewing distressing eyewitness media has had a negative impact on their personal lives.”
40. While there is no way to eliminate the risk created by exposure to graphic and objectionable content, there are ways to mitigate it. It is known that especially demanding job requirements or a lack of social support reduce resilience in the face of trauma exposure and increase the risk of developing PTSD and related conditions.
41. Depending on many factors, individuals who have experienced psychological trauma may develop a range of subtle to significant physical and psychological symptoms, including extreme fatigue, dissociation, difficulty sleeping, excessive weight gain, anxiety, nausea, and other digestive issues.
42. PTSD symptoms may manifest soon after the traumatic experiences, or they may manifest later, sometimes months or years after trauma exposure. The Americans with Disabilities Act recognizes that certain diseases can manifest into disabilities and describes PTSD as a “hidden disability.”
43. Trauma exposure and PTSD are also associated with increased risk of chronic health conditions.
44. There is growing evidence that early identification and treatment of PTSD is important from a physical health perspective, as several meta-analyses have shown increased risk of cardiovascular, metabolic, and musculoskeletal disorders among patients with long-term PTSD.
45. Psychological trauma and PTSD are also often associated with the onset or worsening of substance use disorders. Epidemiologic studies indicate that one-third to one-half of individuals with PTSD also have a substance use disorder. Compared to individuals without PTSD, those with PTSD have been shown to be more than twice as likely to meet the diagnostic criteria for alcohol abuse or dependence; individuals with PTSD are also three to four times more likely to meet the diagnostic criteria for drug abuse or dependence.
46. The risk of developing PTSD and related conditions can be reduced through prevention measures, categorized as primary, secondary, and tertiary interventions. Primary interventions are designed to increase resilience and lower the risk of future PTSD among the general population. Secondary interventions are designed to lower the risk of PTSD among individuals who have been exposed to trauma, even if they are not yet showing symptoms of traumatic stress. Finally, tertiary interventions are designed to prevent the worsening of symptoms and improve functioning in individuals who are already displaying symptoms of traumatic stress or who have been diagnosed with PTSD.
47. Individuals who develop PTSD or other mental health conditions following traumatic exposure require preventative measures as well as treatment. Unlike prevention, treatment measures address symptoms after they have emerged.
C. Defendants control the means and manner by which content moderation is done.
49. Plaintiff Velez was hired by Telus. Plaintiff Velez only performed content moderation services for TikTok while employed by Telus. Plaintiff Velez used Defendants’ TCS to perform content moderation for TikTok.
50. Plaintiff Young was hired by Atrium. Plaintiff Young only performed content moderation services for TikTok while employed by Atrium. Plaintiff Young used Defendants’ TCS to perform content moderation for TikTok.
51. ByteDance withholds payment to content moderators if they are not on the TCS application beyond their allotted breaks (two fifteen-minute breaks and one hour-long lunch break for a twelve-hour workday).
D. Defendants did not meet industry standards for mitigating the harm to content moderators.
52. Defendants are and were aware of the damage that disturbing imagery could have on content moderators. Through the App, they are members of the Technology Coalition, which was created “to develop technology solutions to disrupt the ability to use the Internet to exploit children or distribute child pornography.”
53. Other members of the Technology Coalition include Facebook, YouTube, Snap Inc., and other technology companies.
55. According to the Guidebook, the technology industry “must support those employees” who are exposed to graphic and distressing content as part of their work.
56. The Guidebook recommends that internet companies implement a robust, formal “resilience” program to support content moderators’ well-being and mitigate the effects of exposure to trauma-inducing imagery.
National Center for Missing and Exploited Children] to learn about the problem”;
a. Limiting the amount of time an employee is exposed to child sexual abuse imagery;
c. Performing a controlled content exposure during the first week of employment with a seasoned team member and providing follow up counseling sessions to the new employee;
59. The Technology Coalition also recommends the following practices for minimizing harm to content moderators:
a. Limiting time spent viewing disturbing media to “no more than four consecutive hours”;
b. “Encouraging switching to other projects, which will allow professionals to get relief . . .”;
c. Using “industry-shared hashes to more easily detect and report [content] and in turn, limit employee exposure to these images. Hash technology allows for . . . objectionable”;
d. Preventing content moderators from viewing child pornography one hour or less before the end of the workday;
60. According to the Technology Coalition, if a company contracts with a third-party vendor to perform duties that may bring vendor employees in contact with graphic content, the company
should clearly outline procedures to limit unnecessary exposure and should perform an initial audit of the vendor’s wellness and resilience practices.
61. The National Center for Missing and Exploited Children (“NCMEC”) also promulgates guidelines for protecting content moderators from psychological trauma. For instance, NCMEC recommends changing the color or resolution of the image, superimposing a grid over the image, changing the direction of the image, blurring portions of the image, reducing the size of the image, and muting audio.
62. Based on these industry standards, various internet companies take steps to minimize harm to content moderators. Some notable measures include the use of filtering technology to distort images, and the provision of mandatory psychological counseling for content moderators.
63. Defendants failed to implement the standards recommended by the Technology Coalition. Instead, Defendants impose productivity standards and quotas on their content moderators that are incompatible with those standards.
64. This unmitigated exposure and callousness toward implementing standards of care resulted in Plaintiffs being exposed to thousands of graphic and objectionable videos, including graphic violence, sexual assault, and child pornography. This harmful exposure to child pornography and similar imagery is precisely the kind of harm that the Guidebook directs members to prevent or mitigate, as described in paragraphs 57(a) through (f) and 58(a) through (e) of this Complaint.
65. As a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace, content moderators, including Plaintiffs, have suffered immense stress and psychological harm. Furthermore, the lack of adequate prophylactic measures and the lack of counseling services and/or ameliorative measures has led Plaintiffs to seek counseling on their own time and through their own effort.
66. Defendants failed to implement workplace safety measures that meet the industry standards that other companies and non-profits have implemented, and have likewise failed to implement the measures recommended by the Technology Coalition and NCMEC.
67. During the hiring and training process, Defendants do not ensure that content moderators are informed about the nature of the work or the effect reviewing graphic content can have on their mental health. Potential hires are not asked about their previous experience with graphic content, nor are they told that this content can have a significant negative mental health impact on content moderators. Content moderators are not permitted to preview the graphic content or advised to seek counseling before beginning the work.
68. In addition, content moderators are not trained on how to address the reactions they will have to the images they will see. Content moderators do not ease into their jobs through controlled exposure to graphic content with a seasoned team member followed by counseling sessions.
69. Training videos were significantly tamer than what Plaintiffs were exposed to while on the job, leaving them unprepared for the mental stress and harm that they would be subjected to.
70. Before content moderators begin work they are required to sign non-disclosure agreements. Only after these documents are signed does the training begin.
71. Defendants also failed to provide safeguards known to mitigate the negative effects of exposure to graphic and objectionable content.
72. Content moderators are required to review hundreds of graphic and disturbing videos each week. To determine whether a video should be removed, Defendants create and continually revise tags that content moderators must use to determine whether flagged content violates Defendants’ policies. Defendants recently increased the number of “tags” content moderators must use while moderating videos from 20 to 100. Content moderators are now expected not just to review the content of the video, but also to review video backgrounds and other aspects of the video to make sure they conform to Defendants’ rules while trying to meet oppressive quotas.
73. Defendants also impose strict quantity and accuracy quotas on content moderators. Content moderators are required to review videos for no longer than 25 seconds and expected to have an accuracy rate of 80%. Content moderators often review multiple videos at the same time in order to meet the quotas. While all of this is happening, they are being continuously surveilled and pushed by Defendants.
74. Defendants maintain these requirements despite the potential harms to moderators’ psyches. Defendants are aware, or should have been aware, that their harsh requirements create an increased risk that content moderators will develop PTSD and related disorders. Despite this awareness, Defendants failed to provide adequate services to content moderators, including Plaintiffs, to cope with the unbearable psychological toll of the work.
75. Defendants control how the videos are displayed (e.g., full screen versus thumbnails, blurred versus unblurred, etc.), how the accompanying audio is broadcast, and whether videos begin automatically upon completion of the prior video or whether the content moderator can catch his or her breath by controlling the start of the ensuing video. This is done through Defendants’ proprietary TCS software.
76. Despite their awareness of the impact of reviewing graphic content, Defendants fail to implement well-accepted standards to mitigate harm to content moderators. Defendants could have, but failed to, implement safeguards on their content moderation tools—including changing the color or resolution of the video, superimposing a grid over the video, changing the direction of the video, blurring portions of the video, reducing the size of the video, and muting audio—that could mitigate the harm of reviewing graphic and objectionable content.
77. This failure is especially glaring considering the reasonably uncomplicated nature of many of the tool-related changes. Defendants have full control over the TCS software. Blurring images and videos and providing tags for ultra-graphic violence would take little time to implement and could meaningfully reduce the harm to content moderators.
78. Defendants also fail to provide appropriate psychological support to content moderators. Defendants purportedly offered content moderators “wellness” benefits, including specified wellness time. However, Defendants’ public claims about industry leading “wellness benefits” ring hollow as Defendants repeatedly reduced wellness time from one hour a week to thirty minutes a week.
79. Content moderators are punished for time away from the TCS application, making any of the meager wellness protections available illusory. Defendants withhold payment from content moderators who are off the TCS application beyond their allotted two fifteen-minute breaks and one hour-long lunch break, directly determining employee compensation. In this manner, Defendants punish content moderators, including Plaintiffs, leaving them ill-equipped to handle the mentally devastating imagery their work requires them to view without any meaningful counseling or mental health support.
F. Defendants know that exposure to graphic content can cause psychological trauma yet fail to protect content moderators.
80. In addition to failing to provide any wellness help, Defendants continuously increased the workload on content moderators by increasing the number of tags attributed to videos and raising the quotas they must meet.
81. At all times relevant to this complaint, ByteDance was a client of Telus.
82. At all times relevant to this complaint, ByteDance was a client of Atrium.
83. During their employment as content moderators, Plaintiffs were exposed to thousands of graphic and objectionable videos, including graphic violence, sexual assault, and child pornography. For example, Plaintiffs witnessed videos of bestiality, violence against minors, suicide, and executions.
84. PTSD and related syndromes caused by exposure to harmful content can be triggered by witnessing abuse; watching the news or seeing violence on television; hearing loud noises like gunshots, fireworks, cars backfiring, or objects falling; seeing ISIS members or paraphernalia; and seeing racially discordant posts sowing political dissension in America. Plaintiffs are highly susceptible to an increased rate of PTSD and related syndromes due to the content they were required to view.
CLASS ACTION ALLEGATIONS
85. Plaintiffs bring this class action on behalf of themselves and all others similarly situated pursuant to Rules 23(a), (b)(2), and (b)(3) of the Federal Rules of Civil Procedure, on behalf of the following Class:
All individuals in the United States that performed content moderation work for or in relation to ByteDance’s TikTok application at any time until the present.
86. The Class definition specifically excludes the following persons or entities:
e. the judges and chambers staff in this case, as well as any members of their immediate families; and
87. The class is so numerous that joinder of all members is impracticable. Plaintiffs do not know the exact size of the class since that information is within the control of Defendants. However, upon information and belief, Plaintiffs allege that the number of class members is in the thousands. Membership in the class is readily ascertainable from Defendants’ records because no one can perform content moderation for TikTok unless logged into the TCS system. On information and belief, Defendants maintain records of all activity that takes place on the TCS system and can ascertain the members of the class from those records.
88. The claims asserted by Plaintiffs are typical of the proposed class’s claims in that the representative Plaintiffs, like all class members, were exposed to highly toxic, unsafe, and injurious content while providing content moderation services for TikTok. Each member of the proposed class was harmed in the same manner by Defendants’ conduct.
89. There are numerous questions of law or fact common to the class, and those issues predominate over any question affecting only individual class members. The common legal and factual questions include:
b. whether viewing graphic and objectionable content in the manner in which content moderators are required to review it can cause the injuries complained of herein;
d. whether Plaintiffs and the class are entitled to medical screening, treatment, and damages; and
90. Plaintiffs will fairly and adequately represent the proposed class and protect the interests of the proposed class. Plaintiffs have retained attorneys experienced in class actions, complex litigation, the applicable law, and issues involving content moderation. Plaintiffs intend to vigorously prosecute this litigation. Neither Plaintiffs nor their counsel have interests that conflict with the interests of other class members.
91. Plaintiffs and the proposed class members have all suffered and continue to suffer harm as a result of Defendants’ unlawful conduct.
92. A class action is superior to other available methods for the fair and efficient adjudication of the controversy. Treatment as a class action will permit a large number of similarly situated persons to adjudicate their common claims in a single forum simultaneously, efficiently, and without the duplication of effort and expense that numerous individual actions would engender. Class treatment will also permit the adjudication of claims by many members of the proposed class who could not individually afford to litigate a claim such as is asserted in this complaint. This action likely presents no difficulties in management that would preclude its maintenance as a class action.
93. Plaintiffs reallege and incorporate by reference herein all allegations above.
94. A company is strictly liable to individuals that are injured while the company engages in an abnormally dangerous activity.
95. An activity is abnormally dangerous if it (a) necessarily involves a risk of serious harm to the person, land or chattels of others which cannot be eliminated by the exercise of the utmost care, and (b) is not a matter of common usage.
96. Content moderation as performed for Defendants is an abnormally dangerous activity. Content moderators are at risk of serious and debilitating psychological trauma, including severe anxiety, depression, and PTSD, and there is no way to eliminate this risk. Content moderation is also not a matter of common usage. Only a handful of technology companies engage in content moderation of this kind and scale.
97. Strict liability for a defendant that engages in abnormally dangerous activity represents a social-policy determination that the defendant, while engaged in an enterprise tolerated by the law, must pay its way by compensating for the harm the activity causes.
98. In fiscal year 2020, Defendants earned a combined approximately $1.9 billion in revenue from TikTok.
99. Defendants derive this vast wealth from providing a platform safe from graphic and objectionable content. Defendants rely on content moderators to ensure that TikTok is free from graphic and objectionable content. Defendants monitor and control content moderators’ day-to-day work and provide the software that allows content moderators to do their jobs. Therefore, Defendants are required under the law to pay for the harm caused by requiring content moderators to review and remove graphic and objectionable content.
100. Studies show that as a piece of content gets close to the line drawn by a platform’s content policies, user engagement with that content increases.
[Figure omitted. Source: TechCrunch.com]
101. In addition to deriving a vast amount of wealth from the removal of graphic and objectionable content by Plaintiffs, Defendants also derive engagement from objectionable content that remains on the platform in the period between AI review and human review by content moderators, including Plaintiffs. Keeping such content on the platform thus generates additional revenue for Defendants. Yet, amidst this high-revenue process, Plaintiffs remain disadvantaged as a result of Defendants’ tortious conduct and flagrant disregard for their ethical and legal obligations.
102. While Plaintiffs classified and removed graphic and objectionable content from the App, such classifications conferred an added benefit to the machine learning model of the App. Plaintiffs’ efforts trained the App’s machine learning AI, helping to improve its accuracy and precision in sifting out such content. Despite this lucrative benefit conferred upon Defendants, Plaintiffs remained disadvantaged and adversely affected by Defendants’ failure to uphold the requisite standard of care.
103. Plaintiffs and the class are at an increased risk of developing serious mental health injuries, including, but not limited to, PTSD, anxiety, and depression.
104. To remedy that injury, Plaintiffs and the class need medical monitoring that provides specialized screening, assessment, and treatment not generally given to the public at large.
105. The medical monitoring regime includes, but is not limited to, baseline screening, assessments, and examinations that will assist in diagnosing the adverse health effects associated with exposure to trauma. This screening and assessment will also inform which behavioral and/or pharmaceutical interventions are best suited to preventing or mitigating various adverse consequences of post-traumatic stress and other conditions associated with exposure to graphic imagery.
106. In particular, the medical monitoring regime includes: (a) secondary preventative interventions, designed to reduce the risk of later onset of PTSD among class members who are not yet displaying symptoms of PTSD; (b) tertiary interventions, designed to reduce the worsening of symptoms among those who are already experiencing symptoms associated with post-traumatic stress or have a diagnosis of PTSD; and (c) evidence-based treatments to facilitate recovery from mental health conditions.
107. With this medical monitoring, Plaintiffs and the class will significantly reduce the risk of long-term injury, disease, and economic loss that Plaintiffs and the class have incurred as a result of Defendants’ unlawful conduct.
108. Plaintiffs seek medical screening and treatment to facilitate the screening, diagnosis, and adequate treatment of Plaintiffs and the class for psychological trauma, including to prevent or mitigate conditions such as PTSD, anxiety, and depression.
109. Plaintiffs seek compensatory damages for the injuries they and the class have suffered.
111. Plaintiffs reallege and incorporate by reference herein all allegations above.
112. The hirer of an independent contractor is liable to an employee of the contractor insofar as the hirer’s negligent exercise of retained control affirmatively contributed to the employee’s injuries.
113. If an entity hires an independent contractor to complete work but retains control over any part of the work, the hiring entity has a duty to the independent contractor’s employees or subcontractors to exercise that retained control with reasonable care.
114. If the hiring entity negligently exercises its retained control in a manner that affirmatively contributes to the injuries of the contractor’s employees or subcontractors, the hiring entity is liable for those injuries.
115. At all times relevant to the allegations herein, Plaintiffs and class members were employees of independent contractors retained by Defendants to perform content moderation for TikTok.
116. Defendants retained control over certain aspects of the work performed by Plaintiffs and the class, including by:
. . . safeguards;
. . . confidentiality trainings that prohibit content moderators from discussing their work . . . ;
d. Requiring that content moderators be sent daily adherence letters and weekly calibration tests;
117. Based on their exercise of retained control, Defendants have had at all relevant times a duty to exercise reasonable care with regard to the safety of Plaintiffs and the class.
118. Defendants negligently exercised their retained control in a manner that affirmatively contributed to the injuries of Plaintiffs and the class, including by exacerbating Plaintiffs’ and class members’ risk of psychological trauma by:
. . . failing to protect content moderators from risks associated with exposure to traumatic content via their TCS software;
. . . prohibiting content moderators from speaking about the content they reviewed or other related working conditions; and
. . . failing to provide training that met industry standards for the mental health of prospective content moderators.
119. Defendants were aware of the psychological trauma that could be caused by viewing graphic and objectionable content, including videos and/or images of child abuse, rape, torture, bestiality, beheadings, suicide, murder, and other forms of extreme violence.
120. Defendants were also aware or should have been aware that the review technology they provided and mandated could be made safer if proper precautions were followed, that requiring content moderators not to discuss their work or workplace conditions reduced their ability to deal with traumatic content, and that Defendants’ overall quality and quantity standards had the effect of imposing intense workplace stress and, accordingly, increasing content moderators’ risk of injury from psychological trauma.
121. Defendants breached their duty to Plaintiffs and the class by failing to provide the necessary and adequate technological safeguards, safety and instructional materials, warnings, social support, and other means to reduce and/or minimize the physical and psychiatric risks associated with exposure to graphic and objectionable content.
122. Defendants continue to breach their duty to class members by failing to exercise retained control with reasonable care; that breach continues to elevate class members’ risk of injury from psychological trauma.
123. As a result of Defendants’ tortious conduct, Plaintiffs and the class are at an increased risk of developing serious mental health injuries, including, but not limited to, PTSD, anxiety, and depression.
124. To remedy that injury, Plaintiffs and the class need medical monitoring that provides specialized screening, assessment, and treatment not generally given to the public at large.
125. The medical monitoring regime includes, but is not limited to, baseline screening, assessments, and examinations that will assist in diagnosing the adverse health effects associated with exposure to trauma. This screening and assessment will also inform which behavioral and/or pharmaceutical interventions are best suited to preventing or mitigating various adverse consequences of post-traumatic stress and other conditions associated with exposure to graphic imagery.
126. In particular, the medical monitoring regime includes: (a) secondary preventative interventions, designed to reduce the risk of later onset of PTSD among class members who are not yet displaying symptoms of PTSD; (b) tertiary interventions, designed to reduce the worsening of symptoms among those who are already experiencing symptoms associated with post-traumatic stress or have a diagnosis of PTSD; and (c) evidence-based treatments to facilitate recovery from mental health conditions.
127. With this medical monitoring, Plaintiffs and the class will significantly reduce the risk of long-term injury, disease, and economic loss that Plaintiffs and the class have incurred as a result of Defendants’ unlawful conduct.
128. Plaintiffs seek medical screening and treatment to facilitate the screening, diagnosis, and adequate treatment of Plaintiffs and the class for psychological trauma, including to prevent or mitigate conditions such as PTSD, anxiety, and depression.
129. Plaintiffs seek compensatory damages for the injuries they and the class have suffered.
131. Plaintiffs reallege and incorporate by reference herein all allegations above.
132. An entity that hires an independent contractor to complete work is liable to the independent contractor’s employees or subcontractors if the hiring entity negligently provides unsafe equipment that contributes to their injuries.
133. Defendants provided to their independent contractors the review platform that Plaintiffs and the class used to perform content moderation.
134. Defendants had a duty to exercise reasonable care to furnish a safe review platform to their contractors.
135. Defendants were aware of the psychological trauma that could be caused by viewing graphic and objectionable content, including videos and/or images of child abuse, rape, torture, bestiality, beheadings, suicide, murder, and other forms of extreme violence through their review platforms.
136. Defendants were aware or should have been aware that their review platforms could be made safer if proper precautions were followed.
137. Defendants nevertheless provided unsafe review tools to Plaintiffs and the class that lacked those precautions.
138. Defendants breached their duty to Plaintiffs and the class by failing to provide necessary and adequate technological safeguards, safety and instructional materials, warnings, and other means to reduce and/or minimize the physical and psychiatric risks associated with exposure to graphic imagery on their review platform.
139. Defendants continue to breach their duty to class members by failing to provide a reasonably safe review platform; that breach continues to elevate class members’ risk of injury from psychological trauma.
140. As a result of Defendants’ tortious conduct, Plaintiffs and the class are at an increased risk of developing serious mental health injuries, including, but not limited to, PTSD, anxiety, and depression.
141. To remedy that injury, Plaintiffs and the class need medical monitoring that provides specialized screening, assessment, and treatment not generally given to the public at large.
142. The medical monitoring regime includes, but is not limited to, baseline screening, assessments, and examinations that will assist in diagnosing the adverse health effects associated with exposure to trauma. This screening and assessment will also inform which behavioral and/or pharmaceutical interventions are best suited to preventing or mitigating various adverse consequences of post-traumatic stress and other conditions associated with exposure to graphic imagery.
143. In particular, the medical monitoring regime includes: (a) secondary preventative interventions, designed to reduce the risk of later onset of PTSD among class members who are not yet displaying symptoms of PTSD; (b) tertiary interventions, designed to reduce the worsening of symptoms among those who are already experiencing symptoms associated with post-traumatic stress or have a diagnosis of PTSD; and (c) evidence-based treatments to facilitate recovery from mental health conditions.
144. With this medical monitoring, Plaintiffs and the class will significantly reduce the risk of long-term injury, disease, and economic loss that Plaintiffs and the class have incurred as a result of Defendants’ unlawful conduct.
145. Plaintiffs seek medical screening and treatment to facilitate the screening, diagnosis, and adequate treatment of Plaintiffs and the class for psychological trauma, including to prevent or mitigate conditions such as PTSD, anxiety, and depression.
146. Plaintiffs also seek compensatory damages for the injuries they and the class have suffered.
148. Plaintiffs reallege and incorporate by reference herein all allegations above.
149. Solely in the alternative and to the extent that this Court concludes that Defendants are not strictly liable for the harm caused by engaging in an abnormally dangerous activity, Plaintiffs bring this fourth cause of action for violation of California Unfair Competition Law.
150. Defendants’ negligent exercise of retained control of the content moderation work performed by Plaintiffs and the class violates California common law.
151. Defendants’ negligent provision of unsafe equipment and software to their independent contractors for use by Plaintiffs and the class also violates California common law.
152. There were and are reasonably available alternatives to the conduct described herein that would have served Defendants’ legitimate business interests without harming content moderators.
153. Plaintiffs each suffered an injury in fact because of Defendants’ negligent conduct, and it would not be possible to quantify this irreparable harm in the form of legal remedies. Any such quantification would, in any event, be inadequate.
154. In the absence of adequate legal remedies, Plaintiffs seek an injunction creating a Defendants-funded medical monitoring program to facilitate the screening, diagnosis, and adequate treatment of Plaintiffs and the class for psychological trauma, including preventing or mitigating conditions such as PTSD, anxiety, and depression. The program should include a fund to pay for the medical monitoring and treatment of Plaintiffs and the class as frequently and appropriately as necessary.
155. With this medical monitoring, Plaintiffs and the class will significantly reduce the risk of long-term injury, disease, and economic loss that Plaintiffs and the class have incurred as a result of ByteDance’s unlawful conduct.
156. Plaintiffs seek medical screening and treatment to facilitate the screening, diagnosis, and adequate treatment of Plaintiffs and the class for psychological trauma, including to prevent or mitigate conditions such as PTSD, anxiety, and depression.
157. Plaintiffs seek all appropriate injunctive relief pursuant to section 17203 of the California Business and Professions Code, including an order requiring Defendants to implement safety guidelines for all content moderators.
161. To protect employees from unsafe workplaces, California law requires that “[e]very employer shall do every other thing reasonably necessary to protect the life, safety, and health of employees.” Cal. Labor Code § 6401. This includes “establish[ing], implement[ing], and maintain[ing] an effective injury prevention program.” Cal. Labor Code § 6401.7. Employers must “provide and use safety devices and safeguards reasonably adequate to render the employment and place of employment safe,” “adopt and use methods and processes reasonably adequate to render the employment and place of employment safe,” and “do every other thing reasonably necessary to protect the life, safety, and health of employees.” Cal. Labor Code § 6403.
162. Further, no employer may “require, or permit any employee to go or be in any employment or place of employment which is not safe and healthful.” Cal. Labor Code § 6402.
163. Defendants are the entities responsible for providing the protections required by the California Labor Code to Plaintiffs and the Class, and they failed to do so, thereby failing to provide a safe working environment. Defendants routinely and repeatedly exposed Plaintiffs and the class to content known to cause psychological trauma, including PTSD, anxiety, and depression, even though Defendants knew of and could have reasonably implemented adequate safety measures.
164. Defendants refused to implement necessary and adequate safety and instructional materials, trainings, warnings, and means to reduce and/or minimize the risks associated with exposure to graphic content.
165. Defendants’ failure to provide a safe workplace for Plaintiffs and the class violates, inter alia, sections 6400, 6401, 6401.7, 6402, and 6403 of the California Labor Code.
166. In requiring content moderators to sign sweeping NDAs and instructing content moderators not to disclose information about working conditions—including the traumatic nature of the content, the intense stress from quantity and quality expectations, and the lack of training and safety measures to protect moderators from trauma exposure—Defendants further violate section 232.5 of the California Labor Code.
167. Defendants’ illegal conduct was and is willful and serious and has directly caused harm to Plaintiffs and the class.
168. There were reasonably available alternatives to the conduct described herein that would have served Defendants’ legitimate business interests without harming content moderators.
169. Plaintiffs suffered an injury in fact because of Defendants’ conduct, and it would not be possible to quantify this irreparable harm in the form of legal remedies. Any such quantification may, in any event, prove inadequate.
170. Defendants’ failure to follow worker safety laws amounts to an unlawful, unfair, and fraudulent business practice under California Business and Professions Code section 17200.
171. In the absence of any legal remedies, Plaintiffs seek an injunction creating a Defendants-funded medical monitoring program to facilitate the screening, diagnosis, and adequate treatment of Plaintiffs and the class for psychological trauma, including preventing or mitigating conditions such as PTSD, anxiety, and depression. The program should include a fund to pay for the medical monitoring and treatment of Plaintiffs and the class as frequently and appropriately as necessary.
172. Plaintiffs seek all appropriate injunctive relief pursuant to Business and Professions Code section 17203, including an order requiring Defendants to implement safety guidelines for all content moderators.
173. Furthermore, the filing of a previous case under similar circumstances against Defendants led to the plaintiff in that case being wrongfully terminated, and the lawsuit was then voluntarily dismissed. In this context, Plaintiffs maintain that there is a need for prospective injunctive relief to prevent further injustice and irreparable harm, as this situation is capable of repetition yet may evade review.
174. With this medical monitoring, Plaintiffs and the class will significantly reduce the risk of long-term injury, disease, and economic loss that Plaintiffs and the class have incurred as a result of Defendants’ unlawful conduct.
175. Plaintiffs seek medical screening and treatment to facilitate the screening, diagnosis, and adequate treatment of Plaintiffs and the class for psychological trauma, including to prevent or mitigate conditions such as PTSD, anxiety, and depression.
176. Plaintiffs and the class will be irreparably harmed and/or denied an effective and complete remedy if the requested relief is not granted.
PRAYER FOR RELIEF
WHEREFORE, Plaintiffs, individually and on behalf of the class, request that the Court:
b. Find that Plaintiffs are a proper representative of the class and appoint their counsel as class counsel;
c. Order Defendants to pay to notify class members of the pendency of this suit;
d. Order Defendants to create a medical monitoring fund for the benefit of Plaintiffs and the class;
f. Award injunctive relief as is necessary to protect the interests of Plaintiffs and class members harmed through the unlawful and unfair practices alleged herein, ordering Defendants to implement safety guidelines for all prospective content moderation operations, and to facilitate the ongoing screening, diagnosis, and adequate treatment of Plaintiffs and the class;
g. Award Plaintiffs and class members their reasonable litigation expenses and attorneys’ fees; and
h. Award any further relief that this Court deems just and equitable.
Attorneys for Plaintiffs and the Proposed Class