When a tragedy occurs, lawsuits often follow. This case is no different. And while it's understandable that bereaved survivors seek justice, whether in the form of closure or recompense, through the legal system, that system isn't there to provide solace. Its purpose is to determine who, if anyone, is legally responsible for the death.
This lawsuit, brought to our attention by Adam Klasfeld of Law & Crime, is yet another misguided attempt to hold a social networking service directly liable for an individual's actions. It concerns the death of a 10-year-old TikTok user who allegedly took part in a "blackout challenge," a tragedy that began with her asphyxiation in her bedroom and ended five days later when she died at a nearby hospital.
However, TikTok is not to blame for this death.
And even though it's purposefully constructed to avoid any consideration of third-party liability, the lawsuit [PDF] has very little chance of piercing TikTok's Section 230 immunity.
The lawsuit begins by implying this is a problem specific to TikTok, one that makes the company directly liable for this child's death.
Social media algorithms are meant to encourage engagement; that much is not in dispute. But if the algorithm suggested the "blackout challenge" to Nylah Anderson, that suggestion traces back to Anderson's own interactions with the service, as well as the actions of other users who allegedly made the challenge "viral" (to use Law & Crime's headline wording) and therefore more likely to surface as a suggestion.
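To make that mechanism concrete: TikTok's actual ranking system is proprietary, but engagement-driven recommenders generally score candidate content using aggregate popularity plus a user's own interaction history. Here's a minimal, purely illustrative sketch in Python; every field name (`views`, `shares`, `tags`) and weight is a hypothetical stand-in, not TikTok's implementation:

```python
from collections import Counter

def rank_for_user(user_history, candidates):
    # Build the user's tag affinity from videos they've already engaged with.
    tag_affinity = Counter(tag for video in user_history for tag in video["tags"])

    def score(video):
        # Aggregate popularity: other users' actions drive this term.
        popularity = video["views"] + 5 * video["shares"]
        # Personalisation: this user's own interactions drive this term.
        affinity = sum(tag_affinity[tag] for tag in video["tags"])
        return popularity * (1 + affinity)

    return sorted(candidates, key=score, reverse=True)

# A user who has engaged with "challenge" content sees challenge videos ranked first.
history = [{"tags": ["dance", "challenge"]}]
pool = [
    {"views": 100, "shares": 2, "tags": ["cats"]},
    {"views": 90, "shares": 10, "tags": ["challenge"]},
]
print(rank_for_user(history, pool)[0]["tags"])  # -> ['challenge']
```

The point of the sketch is that nothing in it "issues" a challenge to anyone; a video surfaces only because the two inputs the lawsuit describes, the user's own interactions and other users' engagement, combine to make it score highly.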
TikTok has long been accused of encouraging "viral" challenges that can result in serious injury or death. But much of that virality stems from exaggerated, frantic news coverage of something someone saw on the internet, rather than from trending content promoted by the platform itself.
TikTok's actions here were neither malevolent nor careless. If the allegations are true, the algorithm surfaced something that was popular at the time. It did not issue a lethal challenge to a 10-year-old.
Whatever appears in the "For You" feed is protected by Section 230 and the First Amendment. By framing this as a defective product lawsuit, the plaintiff and her legal representatives hope to sidestep that inevitable debate, claiming that Section 230 has nothing to do with their case.
Sure, you can make such a claim in a federal complaint. But that doesn't mean judges have to pretend the allegations have nothing to do with Section 230.
The lawsuit identifies the specifics of TikTok's recommendation algorithm as a factor in this fatality, and as evidence that the app's makers are delivering a deliberately defective product, one that prioritises profit over user safety.
While it may be true that TikTok values profitability and user engagement over policing the distribution of potentially hazardous content, it also provides users with tools to filter their feeds, along with information about how its algorithmic suggestions are produced. The algorithm is not a complete black box, and those tools show TikTok is at least attempting to minimise users' exposure to harmful content.
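What those filtering tools amount to in principle can be shown with an equally hypothetical sketch: a user-supplied blocklist applied to the ranked feed before anything is displayed. Again, the `filter_feed` helper and the field names are illustrative assumptions, not TikTok's API:

```python
def filter_feed(ranked_videos, blocked_terms):
    # Drop any video whose caption or tags contain a blocked term.
    def is_blocked(video):
        text = " ".join([video["caption"], *video["tags"]]).lower()
        return any(term.lower() in text for term in blocked_terms)
    return [v for v in ranked_videos if not is_blocked(v)]

# Example: a parent or user screening out challenge-related content.
feed = [
    {"caption": "cute cat compilation", "tags": ["cats"]},
    {"caption": "try the blackout challenge!", "tags": ["challenge"]},
]
print(filter_feed(feed, ["blackout challenge"]))
# -> only the cat video remains
```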