Tawainna Anderson found her 10-year-old daughter, Nylah, hanging from a purse strap in her bedroom. Anderson called 911 and attempted CPR until an ambulance arrived, but Nylah was declared dead after several days of intensive care.
Police later discovered that the girl had attempted a viral TikTok challenge. TikTok has over 1 billion monthly users, 28 percent of whom are younger than 20. In a viral challenge, users record themselves performing a specific act and then call on others to join in; common hashtags help make the videos easy to find. Each TikTok user has a "For You Page" (FYP), where the platform curates recommended videos based on that user's viewing history.
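To illustrate the kind of viewing-history-based curation described above, here is a deliberately simplified sketch. It is not TikTok's actual system (which is proprietary); it just shows the general idea of ranking candidate videos by how well their hashtags overlap with a user's watch history. All names (`recommend`, `watch_history`, the sample videos) are hypothetical.

```python
from collections import Counter

def recommend(watch_history, candidates, k=2):
    """Rank candidate videos by overlap between their hashtags and
    the hashtags of videos the user has already watched.

    watch_history: list of sets of hashtags (one set per watched video)
    candidates:    dict mapping video id -> set of hashtags
    """
    # Tally how often each hashtag appears in the user's history.
    interest = Counter(tag for tags in watch_history for tag in tags)
    # Score each candidate by the total weight of its matching hashtags.
    scores = {vid: sum(interest[t] for t in tags)
              for vid, tags in candidates.items()}
    # Return the top-k video ids, highest score first.
    return sorted(scores, key=scores.get, reverse=True)[:k]

history = [{"dance", "music"}, {"dance", "challenge"}]
videos = {
    "v1": {"dance", "challenge"},  # dance(2) + challenge(1) = 3
    "v2": {"cooking"},             # no overlap = 0
    "v3": {"music"},               # music(1) = 1
}
print(recommend(history, videos))  # ['v1', 'v3']
```

The relevant point for the lawsuit is that even a toy recommender like this surfaces more of whatever a user has already engaged with, with no judgment about whether that content is safe.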
The so-called "blackout challenge" dares participants to choke themselves with household objects until they lose consciousness. Versions of it date back to at least 1995, according to the CDC. Nylah reportedly attempted the blackout challenge after watching videos of it. Eight children have reportedly died from the challenge since 2021.
Anderson sued TikTok's parent company, ByteDance, earlier this year for wrongful death, negligence, and strict products liability. Her suit claims that TikTok's "predatory and manipulative app and algorithm" pushed exceedingly dangerous challenges to Nylah's For You Page and encouraged her to engage with them, take part, and upload videos of her own.
The complaint contends that TikTok's algorithm "determined that the deadly Blackout Challenge was well-tailored and likely to be of interest to 10-year-old Nylah Anderson," and that she died as a result.
(Reason reported on a related lawsuit this July.)
Even if TikTok's algorithm was directing harmful content to users' For You Pages, it's unclear whether the company could be held legally responsible. Section 230 of the Communications Decency Act shields online services from legal liability for user-generated content: TikTok may host the videos, but it cannot be "treated as the publisher or speaker" of content posted by its users.
Like similar suits, Anderson's tried to get around Section 230's liability protections by stipulating that it "does not seek to hold [defendants] liable as the speaker or publisher of third-party content," but instead intends to hold them responsible for "their independent conduct as designers, programmers, manufacturers," and distributors.
Judge Paul Diamond of the U.S. District Court for the Eastern District of Pennsylvania dismissed the case this week, citing Section 230. Anderson, Diamond wrote, "cannot defeat Section 230 immunity…by creatively labeling her claims." He determined: "Although Anderson recasts her content claims by attacking Defendants' 'deliberate action' taken through their algorithm…courts have repeatedly held that such algorithms are 'not content in and of themselves.'"
Citing established case law, Diamond noted that Congress conferred this immunity "to maintain the robust nature of Internet communication and, accordingly, to keep government interference in the medium to a minimum…. It recognized that because of the 'staggering' amount of information communicated through interactive computer services, providers cannot prescreen each message they republish…. To encourage providers not to unduly limit the nature and number of their postings, Congress gave them immunity."
Given the number of its users, TikTok could not plausibly review every video. (The platform encourages content creators to post one to four times per day for maximum reach.) A study this year concluded that "TikTok has improved safety controls and removed dangerous challenges," but that the sheer amount of content posted each day makes this difficult.
The death of Nylah Anderson is a tragedy, and the pain felt by her mother is immense. But it is not obvious that TikTok was solely responsible for what happened, nor that it should bear legal liability.