TikTok Blackout Challenge Deaths: A U.S. appeals court has reopened a significant lawsuit against TikTok following the tragic death of a 10-year-old girl who attempted the dangerous “Blackout Challenge,” which urges participants to choke themselves until they lose consciousness. The challenge was allegedly promoted to the child through TikTok’s recommendation algorithm.
The Tragic Event
In December 2021, 10-year-old Nylah Anderson of Chester, Pennsylvania, attempted the “Blackout Challenge” after the video appeared on her “For You” feed on TikTok. According to the lawsuit, the platform’s algorithm recommended the video to her based on her interests, despite prior reports that the challenge had already resulted in children’s deaths.
Nylah was found unresponsive in her home after attempting the challenge with a purse strap. Despite her mother Tawainna Anderson’s efforts to perform CPR and summon emergency help, Nylah died five days later. Her family describes her as a joyful and intelligent child, and her death has prompted her mother to seek justice and accountability from TikTok.
The Lawsuit and Court Ruling
Initially, a lower court dismissed Tawainna Anderson’s lawsuit, citing Section 230 of the Communications Decency Act of 1996. This law generally protects internet platforms from being held liable for user-generated content. However, the 3rd U.S. Circuit Court of Appeals in Philadelphia has now reopened the case.
Judge Patty Shwartz, writing for the court, held that TikTok’s algorithmic choices about which content to recommend can be considered the company’s own “speech.” Because Section 230 shields platforms only from liability for third-party content, not for their own speech, TikTok’s promotion of the “Blackout Challenge” is not fully protected. The court’s decision reflects a shift in how legal protections for social media platforms are interpreted.
Concerns About Safety and Profits
Judge Paul Matey, in an opinion concurring in part, criticized TikTok for prioritizing profits over user safety. He suggested that the company knowingly exposed children to dangerous content to boost engagement. According to Matey, while Nylah may not have fully understood the risks, TikTok’s algorithm deliberately targeted her with the harmful content.
This decision to revive the lawsuit sends the case back to the lower court for further proceedings. Jeffrey Goodman, representing Nylah’s mother, believes the ruling could lead to a more rigorous examination of Section 230 protections. Goodman hopes this case will set a precedent for holding social media platforms accountable for their content recommendations.
TikTok’s Response and Future Implications
TikTok and its parent company, ByteDance, have yet to comment on the court’s decision. The case nonetheless highlights growing concerns about how social media platforms handle dangerous content. Separately, recent reports indicate that Nicole Iacopetti, TikTok’s head of content strategy and policy, will leave the company in September 2024.
The ruling emphasizes that social media platforms might be held responsible for how they curate and promote content, particularly when it involves vulnerable users like children. This case could have broader implications for the tech industry, potentially leading to increased scrutiny and changes in how platforms operate.
Conclusion
The reopening of the lawsuit against TikTok marks a critical development in the ongoing debate over the responsibilities of social media companies. As the case progresses, it will be closely watched for its potential impact on legal standards for content moderation and user safety.
Nylah Anderson’s tragic death underscores the need for stronger protections for users, particularly children, on social media platforms. With the court’s decision, there is hope that similar cases could drive meaningful changes in how tech companies manage and promote content, so that such tragedies are not repeated.