Complaint File TikTok Smith Arroyo 7-1-22
COME NOW PLAINTIFFS CHRISTINA ARLINGTON SMITH, HERIBERTO ARROYO,
In these digital public spaces, which are privately owned and tend to be run for profit, there can be tension between what’s best for the technology company and what’s best for the individual user or for society. Business models are often built around maximizing user engagement as opposed to safeguarding users’ health and ensuring that users engage with one another in safe and healthy ways. . . . Technology companies must step up and take responsibility for creating a safe digital environment for children and youth.
viral and deadly TikTok Blackout Challenge. According to TikTok, its proprietary algorithm is “a recommendation system that delivers content to each user that is likely to be of interest to that particular user...each person’s feed is unique and tailored to that specific individual.” In other words, TikTok has specifically curated and determined that these Blackout Challenge videos – videos featuring users who purposefully strangulate themselves until losing consciousness – are appropriate

to encourage, enable, and push content to teens and children that Defendant knows to be problematic
4. Plaintiffs bring claims of strict product liability based upon TikTok’s defective design of its social media product that renders such product addictive and not reasonably safe for ordinary consumers and minor users. It is technologically feasible for TikTok to design social media products that prevent young users from being affirmatively directed to highly dangerous content such as the Blackout Challenge with a negligible increase in production cost. In fact, on information and belief, the Blackout Challenge currently cannot be found on TikTok’s social media product or, in fact, anywhere online. It appears to have been removed from archiving providers, such as www.wayback.archive.org, as well.
5. Plaintiffs also bring claims for strict liability based on TikTok’s failure to provide adequate warnings to minor users and their parents that TikTok is addictive and directs vulnerable users to highly dangerous and harmful challenges including but not limited to the Blackout Challenge. The addictive quality of TikTok’s product and its tendency to direct young users to highly dangerous
6. Plaintiffs also bring claims for common law negligence arising from TikTok’s unreasonably dangerous social media product and its failure to warn of such dangers. TikTok knew, or in the exercise of ordinary care should have known, that its social media product is addictive to young users and directs them to highly dangerous content promoting self-harm, yet failed to re-design its product to ameliorate these harms or warn minor users and their parents of dangers arising out of the use of its product.
II. PARTIES
7. Plaintiff Christina Arlington Smith is the mother of Lalani Erika Walton, who died on July 15, 2021.
8. Christina Arlington has not entered into a User Agreement or other contractual relationship with TikTok herein in connection with Lalani Walton’s use of Defendants’ social media product. Plaintiff is not bound by any arbitration, forum selection, choice of law, or class action waiver set forth in said User Agreements. Additionally, as successor-in-interest to the Estate of Lalani Walton, Plaintiff expressly disaffirms any and all User Agreements with TikTok into which Lalani may have entered.
9. Plaintiff Heriberto Arroyo is the father of Arriani Jaileen Arroyo, who died on February 26, 2021.
10. Heriberto Arroyo has not entered into a User Agreement or other contractual relationship with TikTok herein in connection with Arriani Jaileen Arroyo’s use of Defendants’ social media product. Plaintiff is not bound by any arbitration, forum selection, choice of law, or class action waiver set forth in said User Agreements. Additionally, as successor-in-interest to the Estate of Arriani Jaileen Arroyo, Plaintiff expressly disaffirms any and all User Agreements with TikTok into which Arriani may have entered.
11. Christal Arroyo is the mother of Arriani Jaileen Arroyo, who died on February 26, 2021.
12. Christal Arroyo has not entered into a User Agreement or other contractual relationship with TikTok herein in connection with Arriani Jaileen Arroyo’s use of Defendants’ social media product. Plaintiff is not bound by any arbitration, forum selection, choice of law, or class action waiver set forth in said User Agreements.
13. Defendant TikTok Inc. is a California corporation with its principal place of business in Culver City, CA. Defendant TikTok owns and operates the TikTok social media platform, an application that is widely marketed by TikTok and available to users throughout the United States.
14. At all times relevant hereto, Defendant TikTok Inc. was acting by and through its employees, servants, agents, workmen, and/or staff, all of whom were acting within the course and scope of their employment.
15. Defendant ByteDance Inc. is a Delaware corporation with its principal place of business in Mountain View, CA. Defendant ByteDance owns TikTok Inc., and owns/operates the TikTok social media platform.
16. At all times relevant hereto, Defendant ByteDance Inc. was acting by and through its employees, servants, agents, workmen, and/or staff, all of whom were acting within the course and scope of their employment.
17. TikTok is highly integrated with its Chinese parent, ByteDance. TikTok’s engineering manager works on both TikTok and ByteDance’s similar Chinese app, Douyin. TikTok’s development processes are closely intertwined with Douyin’s processes. TikTok employees are also deeply interwoven into ByteDance’s ecosystem. They use a ByteDance product called Lark, a corporate internal communications system like Slack but with aggressive performance-management features.
III. JURISDICTION AND VENUE

18. This Court has general jurisdiction over Defendants because TikTok Inc. and ByteDance Inc. have their principal places of business in California and are “at home” in this State.

19. Venue is proper in Los Angeles County because TikTok is headquartered here.
IV. FACTUAL ALLEGATIONS

20. TikTok is a video-sharing social media application where users create, share, and view short video clips. TikTok exclusively controls and operates the TikTok platform for profit, which creates advertising revenue through maximizing the amount of time users spend on the platform and their level of engagement. The greater the amount of time that young users spend on TikTok, the greater the advertising revenue TikTok generates.
21. Users who open the TikTok application are automatically shown an endless stream of videos selected by an algorithm developed by TikTok to show content on each user’s For You Page (“FYP”) based upon each user’s demographics, likes, and prior activity on the app. In addition, TikTok’s algorithm uses individualized user data and demographic information gleaned from third-party sources and statistical data, as well as other data points collected by TikTok, in directing content to users.
22. TikTok is a social media product designed to be used by children and actively marketed to children across the United States including in the State of California. Further, TikTok is aware that large numbers of children under the age of 13 use its product despite user terms or “community standards” that purport to restrict use to individuals who are 13 and older.
23. In fact, this product is designed to be used by minors and is actively marketed to minors across the United States. TikTok markets to minors through its own marketing efforts and design, and it also works with and actively encourages advertisers to create ads targeted at and appealing to teens, and even to children under the age of 13. TikTok spends millions researching, analyzing, and experimenting with young children to find ways to make its product more appealing and addictive to these age groups, as these age groups are seen as the key to TikTok’s long-term profitability and market dominance.
24. TikTok is aware that large numbers of children under the age of 18 use its product without parental consent. It designs its product in a manner that allows and/or does not prevent such use.
25. TikTok is likewise aware that large numbers of children under the age of 13 use its product despite user terms or “community standards” that purport to restrict use to individuals who are 13 and older. It has designed its product in a manner that allows and/or does not prevent such use.
26. Moreover, even in instances where TikTok has actual and/or constructive knowledge of underage users opening accounts, posting, and otherwise using its social media product, TikTok fails to prevent and protect against such harmful and illegal use.
27. TikTok has designed its algorithms to addict users and cause them to spend as much time on the application as possible through advanced analytics that create a variable reward system.
28. There are four main goals for TikTok’s algorithm, which the company translates as “user value,” “long-term user value,” “creator value,” and “platform value.”
29. An internal TikTok document entitled “TikTok Algo 101” was created by TikTok’s engineering team in Beijing and offers details about both the product’s mathematical core and insight into the company’s understanding of human nature. The document explains frankly that in the pursuit of the company’s “ultimate goal” of adding daily active users, TikTok has chosen to optimize for two closely related metrics in the stream of videos it serves: “retention” — that is, whether a user comes back — and “time spent.” The document offers a rough equation for how videos are scored, in which a prediction driven by machine learning and actual user behavior are summed up for each of three bits of data: likes, comments and playtime, as well as an indication that the video has been played.
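For illustration only (the complaint describes but does not reproduce the formula): contemporaneous press reporting on the same “TikTok Algo 101” document rendered the described score as roughly a weighted sum of predicted and observed engagement signals, along the lines of

$$\text{score} \approx P_{\text{like}} \cdot V_{\text{like}} + P_{\text{comment}} \cdot V_{\text{comment}} + E_{\text{playtime}} \cdot V_{\text{playtime}} + P_{\text{play}} \cdot V_{\text{play}}$$

where each $P$ (or $E$) term is a machine-learned prediction of a user behavior (liking, commenting, watching) and each $V$ term is a weight assigned by the system; the variable names follow that reporting, and the actual weights are not public.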
30. A recent Wall Street Journal report revealed how TikTok relies heavily on how much time users spend watching each video to steer them toward more videos that will keep them scrolling, and that process can sometimes lead young viewers down dangerous rabbit holes, in particular, toward content promoting self-harm.
31. TikTok purports to have a minimum age requirement of 13 years old but does little to verify users’ ages or enforce its age limitations despite having actual knowledge that use by underage users is widespread. TikTok knows that hundreds of thousands of children as young as six years old are currently using its social media product but undertakes no attempt to identify such users and terminate their usage. On information and belief, the reason TikTok has not sought to limit usage of its social media product by young children is because it would diminish the advertising revenue TikTok earns through such users. TikTok also does not seek parental consent for underage users or provide any warnings or controls that would allow parents to monitor and limit the use of TikTok by their children, despite TikTok’s own current Terms of Service claiming that users under the age of 18 require parental consent to use its product. TikTok could quickly and reasonably implement tools to verify the age and identity of its users but knows that doing so would result in the loss of millions of current TikTok users—due to some being under the age of 13 and others not having parental consent.
32. Until mid-2021, TikTok by default made all user profiles “public,” meaning that strangers, often adults, could view and message underage users of the TikTok app. This is an inherently harmful product feature, particularly when combined with TikTok’s failure to enforce legal and self-imposed age limitations, as it makes small children available to predatory TikTok users in a manner that actively interferes with parental oversight and involvement and puts them in an inherently dangerous position.
33. TikTok does not seek parental consent for underage users or provide any warnings or controls that would allow parents to monitor and limit the use of TikTok by their children.
34. TikTok has developed images and memes for users to decorate the pictures or videos they post. TikTok has also developed memes and other images for users to apply to images they post on TikTok. TikTok also has acquired publication rights to music that its users can incorporate in the pictures and videos they post on TikTok. When users incorporate images, memes, and music supplied by TikTok into their postings, TikTok becomes a co-publisher of such content. A TikTok user who incorporates images, memes, and musical content supplied by TikTok into their posts is functionally equivalent to a novelist who incorporates illustrations into her story. TikTok can no more characterize the images, memes, and musical content it supplies to its users as third-party content than the novelist can disclaim responsibility for illustrations contained in her book.
35. TikTok has developed artificial intelligence technology that detects adult users of TikTok who send sexually explicit content to children and receive sexually explicit images from children. This technology furnishes TikTok with actual knowledge that a significant number of minor TikTok users are solicited to send and actually do send sexually explicit photos and videos of themselves.
36. TikTok advertises its product as “free,” because it does not charge users for downloading or using the product. What many users do not know is that, in fact, TikTok makes its astronomical profits by targeting advertisements and harmful content to young users and by finding unique and increasingly dangerous ways to keep those young users hooked on its social media product. TikTok receives revenue from advertisers who pay a premium to target advertisements to specific demographic groups of TikTok users including, and specifically, users in California under the age of 18. TikTok also receives revenue from selling its users’ data, including data belonging to users under the age of 18.
37. The amount of revenue TikTok receives is based upon the amount of time and user engagement on its platform, which directly correlates with the number of advertisements that can be displayed.
38. TikTok is designed around a series of design features that do not add to the communication utility of the application, but instead seek to exploit users’ susceptibility to variable rewards, including “likes,” “followers” and “views.” In the hands of children, this design is unreasonably dangerous.
39. According to industry insiders, TikTok has employed thousands of engineers to help make the TikTok product maximally addicting. For example, TikTok’s “pull to refresh” is based on how slot machines operate. It creates an endless feed, designed to manipulate brain chemistry and to prevent natural end points that would otherwise encourage users to move on to other activities.
40. TikTok does not warn users of the addictive design of the TikTok product. On the contrary, TikTok actively tries to conceal the dangerous and addictive nature of its product, lulling users and parents into a false sense of security. This includes consistently playing down its product’s negative effects on teens in public statements and advertising, making false or materially misleading statements concerning product safety, marketing TikTok as a family application that is fun and safe for all ages, and refusing to make its research public or available to academics or lawmakers who have requested it.
41. TikTok product managers and designers attend and even present at an annual conference held in Silicon Valley called the Habit Summit, the primary purpose of which is to learn how to design habit-forming products.
42. TikTok engineers its social media product to keep users, and particularly young users, engaged longer and coming back for more. This is referred to as “engineered addiction,” and examples include features like bottomless scrolling, tagging, notifications, and live stories.
43. TikTok has intentionally designed its product to maximize users’ screen time, using complex algorithms designed to exploit human psychology and driven by the most advanced computer algorithms and artificial intelligence available to two of the largest technology companies in the world.
44. TikTok has designed and progressively modified its product to promote excessive use.
45. One of these features present in TikTok is the use of complex algorithms to select and promote content that is provided to users in an unlimited and never-ending “feed.” TikTok is well aware that algorithm-controlled feeds promote unlimited “scrolling,” a type of use that studies have identified as detrimental to users’ mental health; however, TikTok maintains this harmful product feature because it allows TikTok to display more advertisements and, thus, obtain more revenue.
46. TikTok has also designed its algorithm-controlled feeds to promote content most likely to increase user engagement, which often means content that TikTok knows to be harmful to its users. This is content that users might otherwise never see but for TikTok affirmatively pushing such content to them.
47. The addictive nature of TikTok’s product and the complex and psychologically
48. TikTok goes to significant lengths to prevent transparency, including posing as a “free” social media platform, burying advertisements in personalized content, and making public statements about the safety of the TikTok product that simply are not true.
49. TikTok also has developed unique product features designed to limit, and has in other ways limited, parents’ ability to monitor and prevent problematic use by their children.
50. The algorithms that render TikTok’s social media product addictive are designed to be content neutral. They adapt to the social media activity of individual users to promote whatever content will trigger a particular user’s interest and maximize their screen time. TikTok’s algorithm designs do not distinguish, rank, discriminate, or prioritize between particular types of content on their social media platforms. If User One is triggered by elephants and User Two is triggered by moonbeams, TikTok’s algorithm design will promote elephant content to User One and moonbeam content to User Two. TikTok’s above-described algorithms are solely quantitative devices and make no qualitative distinctions between the nature and type of content they promote to users.
51. The human brain is still developing during adolescence in ways consistent with adolescents’ demonstrated psychosocial immaturity. Specifically, adolescents’ brains are not yet fully developed in regions related to risk evaluation, emotional regulation, and impulse control.
52. The frontal lobes, and in particular the prefrontal cortex, of the brain play an essential part in higher-order cognitive functions, impulse control, and executive decision-making. These regions of the brain are central to the process of planning and decision-making, including the evaluation of future consequences and the weighing of risk and reward. They are also essential to the ability to control emotions and inhibit impulses. MRI studies have shown that the prefrontal cortex is one of the last regions of the brain to mature.
53. During childhood and adolescence, the brain is maturing in at least two major ways. First, the brain undergoes myelination, the process through which the neural pathways connecting different parts of the brain become insulated with white fatty tissue called myelin. Second, during childhood and adolescence, the brain is undergoing “pruning,” the paring away of unused synapses, leading to more efficient neural connections. Through myelination and pruning, the brain’s frontal lobes change to help the brain work faster and more efficiently, improving the “executive” functions of the frontal lobes, including impulse control and risk evaluation. This shift in the brain’s composition

particularly those involving the brain’s executive functions and the coordinated activity of regions involved in emotion and cognition. As such, the part of the brain that is critical for control of impulses and emotions and for mature, considered decision-making is still developing during adolescence.
55. The algorithms in TikTok’s social media product exploit minor users’ diminished decision-making capacity, impulse control, emotional maturity, and psychological resiliency caused by users’ incomplete brain development. TikTok knows, or in the exercise of reasonable care should know, that because its minor users’ frontal lobes are not fully developed, such users are much more likely to sustain serious physical and psychological harm through their social media use than adult users. Nevertheless, TikTok has failed to design the TikTok product with any protections to account for these vulnerabilities.
F. TikTok Misrepresents the Addictive Design and Effects of its Social Media Product
56. During the relevant time period, TikTok stated in public comments that the TikTok product is not addictive and was not designed to be addictive. TikTok knew or should have known that these statements were false.
57. TikTok did not warn users or their parents of the addictive and mentally harmful effects that the use of its product was known to cause amongst minor users, like Lalani Walton and Arriani Arroyo. On the contrary, TikTok has gone to significant lengths to conceal and/or avoid disclosure as to these dangers.
G. TikTok Promotes “TikTok Challenges” to Young Users and Knowingly Directs Them to Dangerous Content
58. TikTok also features and promotes various “challenges” in which users film themselves engaging in behavior that mimics and “one-ups” other users posting videos related to a particular challenge. TikTok promotes users creating and posting videos of challenges identified by a system of hashtags.
59. At all times relevant, TikTok’s algorithm was designed to promote “TikTok Challenges” to young users to increase their engagement and maximize TikTok’s profits. TikTok “challenges” involve users filming themselves engaging in behavior that mimics and oftentimes “one-ups” other users posting videos performing the same or similar conduct. These TikTok “challenges” routinely involve dangerous or risky conduct. TikTok’s algorithm presents these often-dangerous “challenges” to users on their FYP and encourages users to create, share, and participate in the “challenge.”
60. There have been numerous dangerous TikTok challenges that TikTok’s app and algorithm have caused to spread rapidly, which promote dangerous behavior, including:
• Fire Mirror Challenge – involves participants spraying shapes on their mirror with a flammable liquid and then setting fire to it.
• Orbeez Shooting Challenge – involves participants shooting random strangers with tiny water-absorbent polymer beads using gel blaster guns.
• Milk Crate Challenge – involves participants stacking a mountain of milk crates and attempting to ascend and descend the unstable structure without falling.
• Penny Challenge – involves sliding a penny behind a partially plugged-in phone charger.
• Benadryl Challenge – involves consuming a dangerous amount of Benadryl in order to achieve hallucinogenic effects.
• Skull Breaker Challenge – involves users jumping in the air while friends kick their feet out from underneath them, causing the users to flip in the air and fall back on their head.
• Cha-Cha Slide Challenge – involves users swerving their vehicles all over the road to the famous song by the same name.
• Dry Scoop Challenge – involves users ingesting a heaping scoop of undiluted supplemental energy powder.
• Nyquil Chicken Challenge – involves soaking chicken breast in cough medicine like Nyquil and cooking it, boiling off the water and alcohol and leaving the chicken saturated with a highly concentrated amount of drugs in the meat.
• Tooth Filing Challenge – involves users filing down their teeth with a nail file.
• Face Wax Challenge – involves users covering their entire face, including their eyes, with hot wax before ripping it off.
• Coronavirus Challenge – involves users licking random items and surfaces in public in the midst of the global COVID-19 pandemic.
• Scalp Popping Challenge – involves users twisting a piece of hair on the crown of someone’s head around their fingers and pulling upward, creating a “popping” effect on their scalp.
• Nutmeg Challenge – involves users consuming dangerously large amounts of nutmeg with the aim of achieving an intoxicating high.
• Throw it in the Air Challenge – involves users standing in a circle looking down at a cellphone on the ground as someone throws an object into the air; the goal is to not flinch as you watch the object fall on one of the participants’ heads.
• Corn Cob Challenge – involves users attaching a corn cob to a power drill and attempting to eat the corn as it spins.
• Gorilla Glue Challenge – involves users using a strong adhesive to stick objects to themselves.
• Kiki Challenge – involves users getting out of moving vehicles to dance alongside them in the roadway.
• Salt and Ice Challenge – involves users putting salt on their skin and then holding an ice cube on the spot for as long as possible, creating a chemical reaction that causes pain and can lead to burns.
• Snorting Challenge – involves users snorting an entire latex condom into their nose before pulling it out of their mouth.
• Hot Water Challenge – involves users pouring boiling hot water on someone else.
• Fire Challenge – involves users dousing themselves in a flammable liquid and then lighting themselves on fire.
H. TikTok Had Actual Knowledge that Children Were Dying From its Blackout Challenge Yet Failed to Redesign its Algorithm to Prevent Such Deaths
61. The deadliest “TikTok Challenge” being promoted by TikTok’s algorithm is the “TikTok Blackout Challenge,” which encourages users to choke themselves with belts, purse strings, or anything similar until passing out. Tragically, Lalani Walton and Arriani Jaileen Arroyo are just the latest in a growing list of children killed because of TikTok’s algorithm and promotion of the Blackout Challenge to kids.
62. On January 21, 2021, a 10-year-old girl in Italy died after TikTok’s app and algorithm recommended the Blackout Challenge to her via her FYP. According to Italian news reports, after the young girl saw the Blackout Challenge on her TikTok app, she tied a belt around her neck and choked herself, causing her to go into cardiac arrest. She was rushed to the hospital but did not survive.
63. TikTok had knowledge of this death and its connection to TikTok’s promulgation of the Blackout Challenge sometime after the death but before the deaths of Lalani and Arriani, and several other children, and failed to take reasonable and appropriate steps to fix its social media product, including by verification of the age and identity of users, by blocking or removal of the TikTok Blackout Challenge videos from its social media product, or even by removing the TikTok Blackout Challenge hashtag.
64. On March 22, 2021, a 12-year-old boy, Joshua Haileyesus, died after attempting the Blackout Challenge that TikTok’s app and algorithm recommended to him through his FYP. Joshua was discovered breathless and unconscious by his twin brother and ultimately died after 19 days on life support. Joshua attempted the Blackout Challenge by choking himself with a shoelace.
65. On June 14, 2021, a 14-year-old boy died in Australia while attempting to take part in TikTok’s Blackout Challenge after TikTok’s app and algorithm presented the deadly challenge to him through his FYP.
66. In July 2021, a 12-year-old boy died in Oklahoma while attempting the Blackout Challenge after TikTok’s app and algorithm recommended the dangerous and deadly video to him through his FYP.
67. In December 2021, a 10-year-old girl, Nyla Anderson, died in Pennsylvania after attempting the Blackout Challenge that TikTok’s algorithm recommended to her through her FYP.
68. TikTok unquestionably knew that the deadly Blackout Challenge was spreading through its app and that its algorithm was specifically feeding the Blackout Challenge to children.
69. TikTok knew or should have known that failing to take immediate and significant action to extinguish the spread of the deadly Blackout Challenge would result in more injuries and deaths, especially among children, because of these young users attempting the viral challenge.
70. TikTok knew or should have known that its product was dangerously defective and in need of immediate and significant change to prevent users, especially children, from being directed to dangerous challenges that were known to have killed children and, even if not known, where such deaths were reasonably foreseeable based on the inherently dangerous and defective nature of TikTok’s product.
71. TikTok knew or should have known that a failure to take immediate and significant corrective action would result in an unreasonable and unacceptable risk that additional users, and specifically children, would suffer injury and death.
72. Despite this knowledge, TikTok outrageously took no action and/or completely inadequate action to extinguish and prevent the spread of the Blackout Challenge and specifically to prevent its algorithm from directing children to the Blackout Challenge, despite notice and/or foreseeability that such a failure would inevitably lead to more injuries and deaths, including those of children.
73. Despite this knowledge, TikTok outrageously failed to change, update, and/or correct its algorithm to prevent it from directing users, specifically children, to the dangerous and deadly Blackout Challenge despite knowing that such a failure would inevitably lead to more injuries and deaths.
74. TikTok failed or refused to take the necessary corrective action to cure its defective algorithm because TikTok knew that such fixes would result in less user engagement and, thus, less profit.
75. TikTok prioritized greater corporate profits over the health and safety of its users and, specifically, over the health and safety of vulnerable children TikTok knew or should have known were using its product.
I. Plaintiffs Expressly Disclaim Any and All Claims Seeking to Hold TikTok Liable as the Publisher or Speaker of Any Content Provided, Posted or Created by Third Parties
76. Plaintiffs seek to hold TikTok accountable for its own alleged acts and omissions. Plaintiffs’ claims arise from TikTok’s status as designers and marketers of a dangerously defective social media product, as well as TikTok’s own statements and actions, and are not based on TikTok as the speaker or publisher of third-party content.
77. TikTok also failed to warn minor users and their parents of known dangers arising from anticipated use of its social media platform in general and the Blackout Challenge in particular. These dangers, which are unknown to ordinary consumers, do not arise from third-party content contained on the TikTok social media product, but rather from TikTok’s algorithm designs that 1) addict minor users to the TikTok product; 2) affirmatively select and promote harmful content to vulnerable users based on their individualized demographic data and social media activity; and 3) put minor users in harm’s way.
78. TikTok’s product is addictive on a content-neutral basis. For example, TikTok designs and operates its algorithms in a manner intended to and that does change behavior and addict users, including through a natural selection process that does not depend on or require any specific type of third-party content.
79. TikTok’s product features are designed to be and are addictive and harmful in themselves, without regard to any content that may exist on TikTok’s platform; TikTok’s “like” feature is one example.
80. TikTok has designed other product features for the purpose of encouraging and assisting children in evasion of parental oversight, protection, and consent, which features are wholly unrelated to any third-party content.
81. TikTok has information and knowledge that can determine with reasonable certainty each user’s age, habits, and other personal information, regardless of what information the user provides at the time of account setup. In other words, TikTok knows when a user claims to be 21 but is really 12 and, likewise, it knows when a user claims to be 13 but is really 31.
82. In short, none of Plaintiffs’ claims rely on treating TikTok as the publisher or speaker of any third party’s words or content. Plaintiffs’ claims seek to hold TikTok accountable for TikTok’s own allegedly wrongful acts and omissions, not for the speech of others or for TikTok’s good-faith efforts to moderate content.
83. Plaintiffs are not alleging that TikTok is liable for what third parties said or did, but for what TikTok itself did and failed to do.
84. None of Plaintiffs’ claims set forth herein treat TikTok as the speaker or publisher of content posted by third parties. Rather, Plaintiffs seek to hold TikTok liable for its own speech and its own silence in failing to warn of foreseeable dangers arising from anticipated use of its social media product. TikTok could manifestly fulfill its legal duty to design a reasonably safe social media product and furnish adequate warnings of foreseeable dangers arising out of the use of TikTok’s product without altering, modifying, or deleting any third-party content.
V. PLAINTIFF-SPECIFIC ALLEGATIONS
Lalani Erika Renee Walton (2013-2021)
85. Lalani Erika Renee Walton was born on April 23, 2013. Lalani had a large, blended family with many siblings.
86. Lalani was extremely sweet and outgoing. She loved dressing up as a princess and playing with makeup. She enjoyed being the center of attention and didn’t shy away from the spotlight. When she grew up, she wanted to be a famous rapper, like Cardi B.
87. Lalani got her first cellphone on her 8th birthday, April 23, 2021. Shortly thereafter she downloaded TikTok. Parental controls were installed on Lalani’s TikTok account by Lalani’s parents.
88. Lalani quickly became addicted to watching TikTok videos and posted many TikTok videos of herself singing and dancing, in the hopes of becoming TikTok famous.
89. In 2020, Lalani was involved in a car accident in which one of her stepbrothers died and in which Lalani was seriously injured. Following the accident, Lalani’s stepmother, Rashika, struggled with the loss of her son, so Lalani asked to spend a year living with Rashika. Plaintiff agreed and allowed Lalani to live with Rashika for a one-year period, but maintained constant contact with Lalani.
90. TikTok’s algorithm directed Lalani to the “TikTok Blackout Challenge.” On or about July 13, 2021, Lalani had some bruises on her neck but explained them away to her family, saying she had fallen and bumped herself on her bedframe. Neither Rashika nor Lalani’s siblings attributed those bruises to self-harmful behavior. Likewise, upon information and belief and as was told to Rashika after Lalani’s death, the daughter of one of Rashika’s neighbors was sent the “TikTok Blackout Challenge” sometime in July of 2021. Luckily, in that instance, the mother found her daughter in the act of performing the TikTok Blackout Challenge and made her stop immediately.
91. Lalani, Rashika, and Plaintiff Christina Arlington Smith were not so fortunate.
92. From July 14 to July 15, 2021, Lalani was with Rashika Walton and two of her stepsiblings. Rashika was taking two of her children to stay with their grandparents. During the 20-hour round trip, Lalani sat in the backseat watching TikTok videos. For most of that time, Rashika was driving the car and could not see what Lalani was watching on TikTok but, even on the few occasions where they pulled over and/or Rashika asked, Lalani appeared to be watching age-appropriate videos. Plaintiff subsequently learned that Lalani had been watching the “TikTok Blackout Challenge” during this trip.
93. When Rashika and Lalani returned to their home, Rashika told Lalani to clean up her room and that they would then go swimming. Rashika was tired from the long trip and took a short nap. When she awoke approximately an hour later, she walked upstairs to Lalani’s room and was surprised to find the door closed. She walked in and found Lalani hanging from her bed with a rope around her neck, still warm to the touch. Rashika called a neighbor who cut Lalani down and called 9-1-1. The last thing Rashika remembers before passing out was seeing the paramedics put Lalani in the ambulance.
94. Lalani was a happy child who never suffered from depression. Before taking the “TikTok Blackout Challenge” she had laid out her bathing suit, expecting to go swimming when her stepmom woke up. However, she was also under the belief that if she posted a video of herself doing the Blackout Challenge, then she would become famous, and so she decided to give it a try. Lalani was eight years old at the time and did not appreciate or understand the dangerous nature of what TikTok was encouraging her to do.
95. After Lalani’s death, the police took Lalani’s phone and tablet and informed Rashika that Lalani did not commit suicide. The police officer showed Rashika videos of the Blackout Challenge and said that Lalani had been watching the video on repeat and had been attempting the challenge herself.
96. TikTok’s app and algorithm directed exceedingly and unacceptably dangerous challenges and videos to Lalani’s FYP, thus encouraging her to engage and participate in the challenges.
97. This tragedy and the unimaginable suffering endured by Plaintiff and Lalani’s family was entirely preventable had TikTok not ignored the health and safety of its users, particularly children.
98. TikTok’s algorithm intentionally thrust an unacceptably dangerous video that TikTok knew to be deadly onto Lalani’s FYP.
99. TikTok tracks usage data and knew that Lalani watched the TikTok Blackout Challenge not one time, but several times and possibly even over the span of several days.
100. Because of TikTok’s dangerous and defective algorithm, Lalani attempted the TikTok Blackout Challenge and died as a result.
101. As a direct and proximate result of TikTok’s unreasonably dangerous product, failure to warn, and negligence, Lalani suffered serious, severe, disabling injuries including, but not limited to, her death.
102. As a direct and proximate result of TikTok’s unreasonably dangerous product, failure to warn, and negligence, which resulted in the death of Lalani Walton, Lalani’s beneficiaries have in the past and will in the future continue to suffer great pecuniary loss, including, but not limited to, loss of support, loss of aid, loss of services, loss of companionship, loss of consortium and comfort, and loss of guidance.
103. As a direct and proximate result of TikTok’s unreasonably dangerous product, failure to warn, and negligence, Plaintiff claims all damages suffered by the Estate of Lalani Walton and her wrongful death beneficiaries by reason of the death of Lalani Walton, including, without limiting the generality thereof, the following: the severe injuries to Lalani which resulted in her death; the anxiety, horror, fear of impending death, mental disturbance, pain, suffering, and other intangible losses which Lalani suffered prior to her death; the loss of future earning capacity suffered by Lalani from the date of her death until the time in the future that she would have lived had she not died as a result of the injuries she sustained; and the loss and total limitation and deprivation of her normal activities, pursuits, and pleasures from the date of her death until such time in the future as she would have lived had she not died as a result of the injuries sustained by reason of TikTok’s carelessness, negligence, gross negligence, and recklessness.
104. As a direct and proximate result of TikTok’s unreasonably dangerous product, failure to warn, and negligence, Plaintiff Christina Arlington Smith has been forced to suffer the death and loss of her daughter.
Arriani Jaileen Arroyo (2011-2021)
105. Arriani Jaileen Arroyo was born on May 18, 2011, and lived with her parents Heriberto and Christal Arroyo. She had a brother, Edwardo, who was three years younger. Arriani was an active child and enjoyed playing basketball, kickball, and riding her bicycle. She loved to dance.
106. Arriani received a phone when she was seven and shortly thereafter downloaded TikTok. She used TikTok multiple times a day, including watching videos of other people dancing and singing and posting videos of herself dancing and singing. Arriani gradually became obsessive about posting dance videos on TikTok and became addicted to the TikTok product.
107. As her social media obsession increased, Arriani began receiving TikTok Challenges from TikTok and trying them. She would sometimes discuss these challenges with her parents; because all of the challenges they discussed involved eating and drinking, and seemed harmless and not at all dangerous, Arriani’s parents did not regard these activities as dangerous. They understood TikTok to be a family-oriented social media product, marketed to and safe for children to use.
108. In or about January 2021, Arriani told her mother Christal about a young girl in Italy who died while attempting a social media challenge. Christal Arroyo told Arriani that she was never to attempt such challenges.
109. On February 26, 2021, Christal Arroyo was attending a church event. Heriberto Arroyo was working on a project in the basement and Arriani and Edwardo were playing in Arriani’s bedroom. Five-year-old Edwardo came downstairs and told his father that Arriani was not moving. Heriberto Arroyo rushed upstairs and found Arriani hanging from the family dog’s leash, which she had attached to her bedroom door.
110. Heriberto Arroyo called 9-1-1 and Arriani was rushed to Children’s Hospital, where physicians placed her on a ventilator and were able to restore her pulse. However, testing revealed that Arriani had permanent, irreversible, and complete loss of brain function, and life support was withdrawn.
111. TikTok’s product and its algorithm directed exceedingly and unacceptably dangerous challenges and videos to Arriani’s FYP, thus encouraging Arriani to engage and participate in the challenges.
112. This tragedy and the unimaginable suffering endured by Arriani’s parents and younger brother was entirely preventable, and would not have happened but for TikTok making a calculated business decision to ignore the health and safety of its users, particularly young users, in an effort to maximize its profits.
113. TikTok’s algorithm intentionally thrust an unacceptably dangerous video that TikTok knew to be deadly onto Arriani’s FYP.

114. Because of TikTok’s dangerous and defective algorithm, Arriani attempted the TikTok Blackout Challenge and died as a result.
115. As a direct and proximate result of TikTok’s unreasonably dangerous product, failure to warn, and negligence, Arriani suffered serious, severe, disabling injuries including, but not limited to, her death.
25 to warn, and negligence which resulted in the death of Arriani Arroyo, her estate and her beneficiaries
26 have in the past and will in the future continue to suffer great pecuniary loss, including, but not limited
27 to, loss of support, loss of aid, loss of services, loss of companionship, loss of consortium and comfort,
COMPLAINT 22
117. As a direct and proximate result of TikTok’s unreasonably dangerous product, failure to warn, and negligence, Plaintiff claims all damages suffered by the Estate of Arriani Arroyo and her wrongful death beneficiaries by reason of the death of Arriani Arroyo, including, without limiting the generality thereof, the following: the severe injuries to Arriani which resulted in her death; the anxiety, horror, fear of impending death, mental disturbance, pain, suffering, and other intangible losses which Arriani suffered prior to her death; the loss of future earning capacity suffered by Arriani from the date of her death until the time in the future that she would have lived had she not died as a result of the injuries she sustained; and the loss and total limitation and deprivation of her normal activities, pursuits, and pleasures from the date of her death until such time in the future as she would have lived had she not died as a result of the injuries sustained by reason of TikTok’s unreasonably dangerous product, failure to warn, and negligence.
118. As a direct and proximate result of TikTok’s unreasonably dangerous product, failure to warn, and negligence, Heriberto and Christal Arroyo have been forced to suffer the death and loss of their daughter.
119. Plaintiffs reallege each and every allegation contained in paragraphs 1 through 116 as if fully set forth herein.
120. TikTok’s social media product is defective because the foreseeable risks of harm posed by the product’s design could have been reduced or avoided by the adoption of a reasonable alternative design by TikTok, and the omission of the alternative design renders the product not reasonably safe. This defective condition rendered the product unreasonably dangerous to persons or property, existed at the time the product left TikTok’s control, reached the user or consumer without substantial change in its condition, and its defective condition was a cause of Plaintiffs’ injuries.
121. TikTok designed, manufactured, marketed, and sold a social media product that was unreasonably dangerous because it was designed to be addictive to the minor users to whom TikTok actively marketed it and because the foreseeable use of TikTok’s product causes mental and physical harm to minor users.
122. TikTok’s product was unreasonably dangerous because it contained numerous design characteristics that are not necessary for the utility provided to the user but are unreasonably dangerous and implemented by TikTok solely to increase the profits derived from each additional user and the length of time TikTok could keep each user dependent on its product.
123. At all times mentioned herein, TikTok’s product failed to perform as safely as an ordinary consumer and/or ordinary user would expect when used in an intended or reasonably foreseeable manner, and/or the risk of danger inherent in this product outweighed the benefits of said product.
124. As designed, TikTok’s algorithms are not reasonably safe because they affirmatively direct minor users to harmful and exploitative content, including but not limited to the TikTok Blackout Challenge, while failing to deploy feasible safeguards to protect vulnerable children from such harmful exposures. It is feasible to design an algorithm that substantially distinguishes between harmful and innocuous content and protects minor users from being exposed to harmful content without altering, modifying, or deleting any third-party content posted on TikTok’s social media product. The cost of designing TikTok’s algorithms to incorporate this safeguard would be negligible while the benefit would be high in terms of reducing the quantum of mental and physical harm sustained by minor users.
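By way of illustration only, and not as an account of TikTok’s actual systems: the kind of content-neutral safeguard this paragraph alleges is feasible can be sketched in a few lines, with a hypothetical harm classifier screening candidate videos before a minor’s feed is ranked. Every name here (Video, harm_score, build_feed, the 0.5 threshold) is an assumption made for the sketch.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Video:
    video_id: str
    engagement_score: float  # ranker's predicted engagement (assumed)
    harm_score: float        # hypothetical safety-classifier output in [0, 1]

@dataclass
class User:
    user_id: str
    is_minor: bool

HARM_THRESHOLD = 0.5  # assumed cutoff; a real system would have to tune this

def build_feed(user: User, candidates: List[Video], feed_size: int = 20) -> List[Video]:
    """Rank candidates by predicted engagement, but first drop videos the
    hypothetical classifier flags as likely harmful when the user is a minor."""
    if user.is_minor:
        candidates = [v for v in candidates if v.harm_score < HARM_THRESHOLD]
    return sorted(candidates, key=lambda v: v.engagement_score, reverse=True)[:feed_size]

# Example: the flagged video never reaches the minor's feed,
# no matter how much engagement it would have produced.
feed = build_feed(
    User("u1", is_minor=True),
    [Video("safe_dance", 0.7, 0.02), Video("blackout_challenge", 0.9, 0.97)],
)
assert [v.video_id for v in feed] == ["safe_dance"]
```

The point of the sketch is that the gate operates on a classifier score, not on any particular third-party post, which is the content-neutral framing the paragraph asserts.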
125. Defendants also engage in conduct, outside of the algorithms themselves, which is designed to promote harmful and exploitative content as a means of increasing their revenue from advertisements. This includes, but is not limited to, efforts to encourage advertisers to design ads that appeal to children under the age of 13, and product design features intended to attract and engage minor users in these virtual spaces, where harmful ad content is then pushed to those users in a manner intended to increase user engagement, thereby increasing revenue to TikTok at the direct cost of user wellbeing.
126. Reasonable users (and their parents) would not expect that TikTok would knowingly expose them to such harmful content and/or that TikTok’s product would direct them to harmful content at all, much less in the manipulative and coercive manner that it does. TikTok has used and continues to use its algorithms on users in a manner designed to affirmatively change user behavior, and these methods are particularly effective on (and harmful to) TikTok’s youngest users, like Lalani Walton and Arriani Arroyo.
127. Outrageously, TikTok knowingly exposes the public and innocent children, including Lalani Walton and Arriani Arroyo, to addiction, manipulation, and control, causing them to promote, engage, and participate in dangerous and deadly videos and challenges, including but not limited to the Blackout Challenge.
128. TikTok knew that dangerous and deadly videos and challenges, including but not limited to the Blackout Challenge, were circulating via its social media product and were being recommended to users by TikTok’s algorithm, including through users’ FYPs. But TikTok also knew that these challenges, including its Blackout Challenge, were viral and wildly popular, particularly among TikTok’s youngest users, and that TikTok’s continued promotion and amplification of these challenges was making TikTok significant revenue – which is why TikTok continued to promote and amplify this harmful content to its youngest users.
129. TikTok knew that children were dying from attempting to participate in dangerous and deadly videos and challenges, including but not limited to the Blackout Challenge, that TikTok’s algorithm was recommending to them through the children’s FYPs. TikTok knew of at least one such death prior to the death of Lalani and prior to the death of Arriani.
130. As designed, TikTok’s product is not reasonably safe because TikTok does not provide for adequate age verification by requiring users to document and verify their age and identity.
131. Adults frequently set up user accounts on TikTok’s social media product posing as minors to groom unsuspecting minors and to exchange sexually explicit content and images, which TikTok fails to prevent.
132. Minor users of social media and their parents do not reasonably expect that prurient adults set up fraudulent accounts on Defendant’s social media product and pose as minors for malign purposes.
133. Likewise, minor users who are under the age of 13 often open and/or access TikTok accounts, and TikTok knows or has reason to know when a user is underage. TikTok already has the information and means it needs to ascertain with reasonable certainty each user’s actual age and, at least in some cases, TikTok utilizes these tools to investigate, assess, and report on percentages and totals of underage users for internal assessment purposes. TikTok simply chooses to do nothing with that information.
134. TikTok employees have also reported that TikTok has actual knowledge of users under the age of 13, including because it is clear from the videos they post of themselves that they are too young to legally be using TikTok’s social media product. Despite such knowledge, TikTok often is slow to act, or does not act at all, in the interest of increased profits.
135. Moreover, reasonably accurate age and identity verification is not only feasible but inexpensive.
136. The cost of incorporating age and identity verification into TikTok’s product would be negligible, whereas the benefit of age and identity verification would be a substantial reduction in severe mental health harms, sexual exploitation, and abuse among minor users of TikTok’s product.
137. TikTok’s product is also defective for lack of parental controls, permission, and monitoring.
138. TikTok’s product is designed with specific product features intended to prevent and/or interfere with parents’ reasonable and lawful exercise of parental control, permission, and monitoring.
Walton and Arriani Arroyo, purposefully steered those users toward content TikTok knows to be harmful.
140. Ad content pushed to new child users, including Lalani Walton and Arriani Arroyo, because of their age and vulnerability, purposefully steers those users toward content TikTok knows to be harmful.
E. Design of Addictive Social Media Products
141. As designed, TikTok’s social media product is addictive to child users as follows: when minors use design features such as “likes,” their brains release dopamine, which creates short-term euphoria. However, as soon as dopamine is released, minor users’ brains adapt by reducing or “downregulating” the number of dopamine receptors that are stimulated, and their euphoria is countered by dejection. In normal stimulatory environments, this dejection abates and neutrality is restored. However, TikTok’s algorithms are designed to exploit users’ natural tendency to counteract dejection by going back to the source of pleasure for another dose of euphoria. As this pattern continues over a period of months and the neurological baseline to trigger minor users’ dopamine responses increases, they continue to use TikTok, not for enjoyment, but simply to feel normal. Once they stop using TikTok, minor users experience the universal symptoms of withdrawal from any addictive substance.
15 Association's 2013 Diagnostic and Statistical Manual of Mental Disorders (DSM-5), which is used by
16 mental health professionals to diagnose mental disorders. Gaming addiction is a recognized mental
17 health disorder by the World Health Organization and International Classification of Diseases and is
18 functionally and psychologically equivalent to social media addition. The diagnostic symptoms of
19 social media addiction among minors are the same as the symptoms of addictive gaming promulgated
143. Preoccupation with social media, and withdrawal symptoms (sadness, anxiety, irritability) when the device is taken away or use is not possible.
a. Tolerance, the need to spend more time using social media to satisfy the urge.
e. Deceiving family members or others about the amount of time spent on social media.
f. The use of social media to relieve negative moods, such as guilt or hopelessness.
media usage.
144. TikTok’s advertising profit is directly tied to the amount of time that TikTok’s users spend online, and TikTok’s algorithms and other product features are designed to maximize the time users spend using the product by directing them to content that is progressively more and more stimulative. TikTok enhances advertising revenue by maximizing users’ time online through a product design that addicts them to the platform. However, reasonable minor users and their parents do not expect that online social media platforms are psychologically and neurologically addictive.
145. It is feasible to make TikTok’s product less addictive to child users by limiting the frequency and duration of access and suspending service during sleeping hours.
146. Designing software that limits the frequency and duration of child users’ screen use and suspends service during sleeping hours could be accomplished at negligible cost, whereas the benefit of minor users maintaining healthy sleep patterns would be a significant reduction in depression, attempted and completed suicide, and other forms of self-harm among this vulnerable age cohort.
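Again by way of illustration only: the screen-time safeguard paragraphs 145 and 146 allege is feasible amounts to a simple gate applied to each session request. The daily limit and sleep window below are assumed values for the sketch, not anything drawn from TikTok’s actual product.

```python
from datetime import datetime, time

DAILY_LIMIT_MINUTES = 60                          # assumed daily cap for a minor account
SLEEP_START, SLEEP_END = time(21, 0), time(6, 0)  # assumed overnight sleep window

def is_sleep_hours(now: datetime) -> bool:
    """True if `now` falls inside the overnight sleep window."""
    t = now.time()
    return t >= SLEEP_START or t < SLEEP_END

def may_serve(minutes_used_today: int, now: datetime) -> bool:
    """Gate each session request against the daily cap and the sleep window."""
    return minutes_used_today < DAILY_LIMIT_MINUTES and not is_sleep_hours(now)

# Example: an 11 p.m. request is refused even though minutes remain,
# while a noon request with the same usage is allowed.
assert not may_serve(10, datetime(2022, 7, 1, 23, 0))
assert may_serve(10, datetime(2022, 7, 1, 12, 0))
```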
Minor Users
147. TikTok’s product is not reasonably safe as designed because it does not include any safeguards to notify users and their parents of usage that TikTok knows to be problematic and likely to cause negative mental health effects to users, including excessive passive use and use disruptive of sleep.
148. It is reasonable for parents to expect that social media products that actively promote their platform to minors will undertake reasonable efforts to notify parents when their child’s use becomes excessive or occurs during sleep time. It is feasible for TikTok to design a product that identifies a significant percentage of its minor users who are using the product more than three hours per day or using it during sleeping hours, at negligible cost.
2 149. TikTok’s product is not reasonably safe as designed because, despite numerous
3 reported instances of child sexual solicitation and exploitation by adult users, TikTok has not
4 undertaken reasonable design changes to protect underage users from this abuse, including notifying
5 parents of underage users when they have been messaged or solicited by an adult user or when a user
6 has sent inappropriate content to minor users. TikTok’s entire business is premised upon collecting
7 and analyzing user data, and it is feasible to use TikTok's data and algorithms to identify and restrict
8 such abusive contact.
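To illustrate only the kind of screening alleged to be feasible here (the rule, ages, and field names below are hypothetical assumptions, not a description of TikTok's systems):

    from dataclasses import dataclass, field

    @dataclass
    class User:
        user_id: str
        age: int
        follows: set = field(default_factory=set)  # user_ids this account follows

    def should_flag_contact(sender: User, recipient: User) -> bool:
        """Flag a direct message from an adult to an unconnected minor for
        review and parental notification (hypothetical rule)."""
        return (
            sender.age >= 18
            and recipient.age < 18
            and sender.user_id not in recipient.follows
        )

The point of the sketch is that its inputs, account ages and the social graph, are data the platform already collects and analyzes, as the paragraph above alleges.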
9 150. It is reasonable for parents to expect that platforms such as TikTok, which actively
10 promotes its services to minors, will undertake reasonable efforts to identify users suffering from
11 mental injury, self-harm, or sexual abuse and implement technological safeguards to notify parents by
13 151. As a proximate result of these dangerous and defective design attributes of TikTok’s
14 product, Lalani Walton and Arriani Arroyo were killed. Plaintiffs did not know, and in the exercise of
15 reasonable diligence could not have known, of these defective designs in TikTok’s product until 2021.
16 152. As a result of these defective designs, Plaintiffs have suffered loss of consortium,
17 emotional distress, past and future medical expenses, and pain and suffering.
18 153. TikTok is further liable to Plaintiffs for punitive damages based upon the willful and
19 wanton design of the TikTok social media product that was intentionally marketed and sold to
20 underage users, whom TikTok knew would be seriously harmed through the use of TikTok.
22 154. Plaintiffs reallege each and every allegation contained in paragraphs 1 through 151 as
23 though fully set forth herein.
25 the foreseeable risks of harm posed by the product could have been reduced or avoided by the
26 provision of reasonable instructions or warnings by the manufacturer and the omission of the
27 instructions or warnings renders the product not reasonably safe. This defective condition rendered
28 the product unreasonably dangerous to persons or property, existed at the time the product left
COMPLAINT 29
1 TikTok’s control, reached the user or consumer without substantial change in the condition in which
2 it was sold and was a proximate cause of Lalani Walton and Arriani Arroyo’s deaths.
4 warning to users or parents regarding the addictive design and effects of TikTok.
5 157. TikTok’s social media product relies on highly complex and proprietary algorithms
6 that are both undisclosed and unfathomable to ordinary consumers who do not expect that social media
8 158. The magnitude of harm from addiction to TikTok's product is horrific, ranging from
9 simple diversion from academics, athletics, and face-to-face socialization to sleep loss, severe
10 depression, anxiety, self-harm, accidental death through the TikTok Blackout Challenge, and suicide.
11 159. The harms resulting from minors' addictive use of social media platforms have not
12 only been well documented in the professional and scientific literature, but TikTok had actual
14 160. TikTok’s product is unreasonably dangerous because it lacks any warnings that
15 foreseeable product use can disrupt healthy sleep patterns or specific warnings to parents when their
16 child’s product usage exceeds healthy levels or occurs during sleep hours. Excessive screen time is
17 harmful to children's mental health, sleep patterns, and emotional well-being. Reasonable and
18 responsible parents are not able to accurately monitor their child’s screen time because most
19 adolescents own or can obtain access to mobile devices and engage in social media use outside their
20 parents’ presence.
21 161. It is feasible, at negligible cost, for TikTok's product to report the frequency and duration
22 of its minor users' screen time to their parents without disclosing the content of communications,
23 and parents who can track the frequency, time, and duration of their minor child's social media
24 use are better situated to identify and address problems arising from such use and to better exercise
25 parental oversight.
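A minimal sketch of such a content-free usage report, assuming hypothetical session records (all names and fields are illustrative only):

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Session:
        day: date
        minutes: int
        during_sleep_hours: bool

    def weekly_report(sessions: list) -> dict:
        """Summarize a minor's usage for a parent: totals only; no message
        or viewing content is included."""
        return {
            "total_minutes": sum(s.minutes for s in sessions),
            "days_active": len({s.day for s in sessions}),
            "sleep_hour_minutes": sum(
                s.minutes for s in sessions if s.during_sleep_hours
            ),
        }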
26 162. TikTok knew about these harms, knew that users and parents would not be able to
27 safely use the TikTok product without warnings, and failed to provide warnings that were adequate to
28 make the product reasonably safe during ordinary and foreseeable use by children.
COMPLAINT 30
1 163. As a proximate result of TikTok’s failure to warn, Lalani Walton and Arriani Arroyo
2 died.
3 164. As a result of TikTok’s failure to warn, Plaintiffs have suffered loss of consortium,
4 emotional distress, past and future medical expenses, and pain and suffering.
5 165. TikTok is further liable to Plaintiffs for punitive damages based upon TikTok’s willful
6 and wanton failure to warn of known dangers of the TikTok product, which was intentionally marketed
7 and sold to child users, whom TikTok knew would be seriously harmed through their use of the TikTok
8 product.
10 166. Plaintiffs reallege each and every allegation contained in paragraphs 1 through 163 as
11 though fully set forth herein.
12 167. At all relevant times, TikTok had a duty to exercise reasonable care and caution for the
13 safety of individuals using its product, such as Lalani Walton and Arriani Arroyo.
14 168. TikTok owes a heightened duty of care to minor users of its social media product
15 because adolescents’ brains are not fully developed, which results in a diminished capacity to make
16 good decisions regarding their social media usage, to eschew self-destructive behaviors, and to
17 overcome emotional and psychological harm from negative and destructive social media encounters,
18 and leaves them much more susceptible to dangerous TikTok Challenges, including but not limited to the TikTok Blackout
19 Challenge.
20 169. As a product manufacturer marketing and selling products to residents across the
21 United States, including in California, TikTok owed a duty to exercise ordinary care in the
22 manufacture, marketing, and sale of its product, including a duty to warn minor users and their parents
23 of hazards that TikTok knew to be present, but not obvious, to underage users and their parents.
24 170. As a business owner, TikTok owes its users who visit TikTok’s social media platform
25 and from whom TikTok derives billions of dollars per year in advertising revenue a duty of ordinary
26 care substantially similar to that owed by physical business owners to their business invitees.
27 171. TikTok had a duty to monitor the videos and challenges shared, posted, and/or
28 circulated on its app and platform to ensure that dangerous and deadly videos and challenges were not
COMPLAINT 31
1 posted, shared, circulated, recommended, and/or encouraged. TikTok benefited directly and
2 substantially from its continued promotion and amplification of this harmful content, knew the content
4 172. TikTok had a duty to monitor and evaluate the performance of its algorithm and ensure
5 that it was not directing vulnerable children to dangerous and deadly videos and challenges, including
7 173. TikTok had a duty to employ and train personnel to appropriately and reasonably
8 respond to notice that dangerous and deadly videos and challenges were being posted, shared, and/or
10 174. TikTok had a duty to design, develop, program, manufacture, distribute, sell, supply,
11 and/or operate its app and algorithms to ensure that it did not manipulate users and/or otherwise
12 encourage them to view or participate in dangerous and potentially deadly videos and challenges.
13 175. TikTok was negligent, grossly negligent, reckless and/or careless in that it failed to
14 exercise ordinary care and caution for the safety of underage users, like Lalani Walton and Arriani
15 Arroyo.
16 176. TikTok was negligent in failing to conduct adequate testing and failing to allow
17 independent academic researchers to adequately study the effects of its product and levels of
18 problematic use amongst child users. TikTok has extensive internal research and/or documents and
19 communications indicating that its product is harmful, causes extensive mental harm, and that minor
20 users are engaging in problematic and addictive use that their parents are helpless to monitor and
21 prevent.
22 177. TikTok is negligent in failing to provide adequate warnings about the dangers
23 associated with the use of its social media products and in failing to advise users and their parents
24 about how and when to safely use its social media platform and features.
25 178. TikTok is negligent in failing to fully assess, investigate, and restrict the use of TikTok
26 by adults to sexually solicit, abuse, manipulate, and exploit minor users of its TikTok product.
27 179. TikTok is negligent in failing to provide users and parents the tools to ensure that its
28 social media product is used in a limited and safe manner by underage users.
COMPLAINT 32
1 180. TikTok knew that dangerous and deadly videos and challenges, including but not
2 limited to the Blackout Challenge, were being promoted to users by its algorithms but failed to take
3 appropriate, reasonable, timely, and necessary remedial actions.
4 181. TikTok knew that children were dying from attempting to participate in dangerous and
5 deadly videos and challenges, including but not limited to the Blackout Challenge, that TikTok’s
6 algorithm was directing to them but failed to take appropriate, reasonable, timely, and necessary
7 remedial actions.
8 182. As a proximate result of TikTok’s negligence, Lalani Walton and Arriani Arroyo died.
10 Heriberto Arroyo and Christal Arroyo have suffered loss of consortium, emotional distress, past and
11 future medical expenses, and pain and suffering.
12 184. TikTok is further liable to Plaintiffs for punitive damages based upon its willful and
13 wanton conduct toward underage users, including Lalani Walton and Arriani Arroyo, whom it knew
14 would be seriously harmed through their use of the TikTok product.
17 185. Plaintiffs reallege each and every allegation contained in paragraphs 1 through 182 as
18 though fully set forth herein.
19 186. As corporations or other business entities headquartered in and operating out of the
20 State of California, TikTok and ByteDance (collectively, “TikTok”) were required to comply with the
21 California Consumer Legal Remedies Act, Cal. Civ. Code § 1750, et seq. 187. At all times relevant hereto,
22 TikTok intended and expected that its product would be marketed, sold, downloaded, and/or used in
25 and/or operated its product for sale and use in the U.S., including California.
26 188. At all times relevant hereto, TikTok was a person within the meaning of Cal. Civ. Code
27 § 1761(c).
28 //
COMPLAINT 33
1 189. At all times relevant hereto, Lalani and Arriani were consumers within the meaning of
2 Cal. Civ. Code § 1761(d).
3 190. The California Consumer Legal Remedies Act, Cal. Civ. Code § 1770(a)(5) and (7), provides in
4 pertinent part:
        (a) The unfair methods of competition and unfair or deceptive acts or practices
        listed in this subdivision undertaken by any person in a transaction intended to
        result or that results in the sale or lease of goods or services to any consumer are
        unlawful: . . . (5) Representing that goods or services have sponsorship, approval,
        characteristics, ingredients, uses, benefits, or quantities that they do not have
        . . . . (7) Representing that goods or services are of a particular standard, quality,
        or grade . . . if they are of another.
COMPLAINT 34
1 a. concealed, suppressed, or omitted to disclose that TikTok’s product was designed
6 and intended to urge and/or compel users to spend as much time as possible on the
7 TikTok app.
11 e. concealed, suppressed, or omitted to disclose that TikTok’s product was not safe or
13 f. concealed, suppressed, or omitted to disclose that the risky and dangerous videos
14 and challenges shown to users by TikTok’s product would result in severe injury
15 and/or death.
19 app’s algorithm, had not been adequately developed, refined, and/or tested to
20 ensure that dangerous and risky videos and challenges would not be disseminated
23 app’s algorithm, had not been adequately developed, refined, and/or tested to
24 ensure that children and other vulnerable users were not shown videos or challenges
26 which otherwise created a system which rewarded users for engaging in said
COMPLAINT 35
1 j. concealed, suppressed, or omitted to disclose that TikTok’s corporate profits
2 depended on user addiction and maximizing a user's time spent on and engaging in
3 the TikTok app.
4 194. These acts and practices of TikTok and those with whom it was acting in concert in
5 designing, developing, programming, manufacturing, distributing, selling, supplying, and/or operating
6 the TikTok product for sale and use in California, and elsewhere in the U.S., were unfair because they
7 offended public policy, were immoral, unethical, oppressive, and unscrupulous, and caused substantial
8 injury to consumers, including Plaintiffs’ decedents Lalani Walton and Arriani Arroyo, their estates,
11 manufacturing, distributing, selling, supplying, and/or operating the TikTok product for sale and use
12 in California, and elsewhere in the U.S., offended the clearly stated public policy of California.
14 manufacturing, distributing, selling, supplying, and/or operating the TikTok product for sale and use
15 in California, and elsewhere in the U.S., were immoral and unethical, as they served only to financially
16 benefit TikTok at the expense of the health and safety of users of TikTok's product, including
19 manufacturing, distributing, selling, supplying, and/or operating the TikTok product for sale and use
20 in California, and elsewhere in the U.S., were likely to cause substantial injury and/or death to users,
21 including Lalani Walton and Arriani Arroyo, by exposing and encouraging them to engage in activities
22 which posed unnecessary and unreasonable risks to their health and safety.
24 manufacturing, distributing, selling, supplying, and/or operating the TikTok product for sale and use
25 in California, and elsewhere in the U.S., were likely to cause, and did cause, substantial injury and/or
26 death to users of TikTok’s product, including Lalani Walton and Arriani Arroyo, in that but for these
27 acts and practices, TikTok's product would not have been downloaded, purchased, and/or used, and
28 persons who used it, including Lalani Walton and Arriani Arroyo, would not have been injured or
COMPLAINT 36
1 killed by said use.
3 manufacturing, distributing, selling, supplying, and/or operating the TikTok product for sale and use
4 in California, and elsewhere in the U.S., were committed in conscious disregard of the safety of others
6 200. The injuries caused by TikTok’s acts and practices in designing, developing,
7 programming, manufacturing, distributing, selling, supplying, and/or operating its product for sale
8 and use in California, and elsewhere in the U.S.—namely, users’ injuries and damages (including
10 201. TikTok intended that purchasers and/or users of its product use it in reliance on these
12 202. The facts that TikTok concealed, suppressed, and/or omitted to disclose were material
13 to the decisions to use TikTok's product, and Plaintiffs' decedents would not have used said product
15 203. TikTok’s unfair and deceptive acts and practices occurred in connection with their
17 204. TikTok's unfair and deceptive acts and practices violated the California
18 Consumer Legal Remedies Act.
19 205. TikTok committed these unfair and deceptive practices knowing they created a
20 substantial risk of harm to those who used TikTok’s product in California, and elsewhere in the U.S.
21 206. As a direct and proximate result of TikTok's violations of the California Consumer
22 Legal Remedies Act, Plaintiffs' decedents, Lalani Walton and Arriani Arroyo, suffered grievous injury
23 and died, and Lalani Walton and Arriani Arroyo, their estates, and their beneficiaries suffered all of
24 the damages alleged herein.
25 //
26 //
27 //
28 //
COMPLAINT 37
1 PRAYER FOR RELIEF
2 WHEREFORE, Plaintiffs pray for judgment against Defendants, their “alternate entities”, and
5 WALTON, Deceased:
6 1. For Decedent’s pecuniary loss and economic losses, including loss of income,
19 9. For such other and further relief as the Court may deem just and proper,
23 ARROYO, Deceased:
24 10. For Decedent’s pecuniary loss and economic losses, including loss of income,
COMPLAINT 38
1 Plaintiffs HERIBERTO ARROYO and CHRISTAL ARROYO, Individually:
5 17. For Plaintiff’s damages for loss of love, companionship, comfort, affection,
9 ARROYO, Individually:
11 19. For such other and further relief as the Court may deem just and proper,
17 20. Injunctive relief, including but not limited to ordering TikTok to stop the
19 algorithms in its social media product and to provide warnings to minor users
20 and their parents that TikTok's social media product is addictive and poses a
COMPLAINT 39
1 DEMAND FOR JURY TRIAL
COMPLAINT 40