A pivotal legal dispute has gained renewed momentum after an appeals court ruled that TikTok must face a lawsuit over the death of a child who participated in the dangerous “Blackout Challenge.” The decision cuts against the broad protection that social media platforms typically enjoy under Section 230 of the Communications Decency Act, which shields them from liability for content created by third parties. The ruling marks a significant shift in the ongoing debate about the responsibilities of social media companies in curating and recommending content, especially when that content leads to harmful outcomes.
The Risks of the Blackout Challenge
The “Blackout Challenge” is a dangerous online phenomenon that prompts participants, often young people, to strangle themselves with belts, strings, or similar items until they lose consciousness. Tragically, the challenge has led to the deaths of several children, raising significant alarm among parents and officials. One such victim, 10-year-old Nylah Anderson, died after attempting it. In response, her mother, Tawainna Anderson, filed a lawsuit against TikTok in 2022, alleging that the platform’s algorithm had promoted the challenge to her daughter and that TikTok therefore bore partial responsibility for her death.
Scrutinizing TikTok’s Algorithm
The lawsuit centers on TikTok’s For You Page (FYP) algorithm, which recommends videos to users based on their previous activity on the platform. Tawainna Anderson argues that the algorithm acted as a promoter of the Blackout Challenge by surfacing the dangerous content to her daughter. The district court initially ruled in favor of TikTok, citing the Section 230 immunity that typically protects platforms from liability for third-party content. The appeals court, in an opinion by Judge Patty Shwartz, overturned that decision, reasoning that TikTok’s algorithm effectively transforms third-party content into the platform’s own “expressive activity.”
Reassessing the Reach of Section 230
Judge Shwartz’s decision turns on how courts should interpret Section 230, the provision that has traditionally shielded internet platforms from liability for content created by their users. Shwartz contended that this immunity does not extend to situations where a platform’s algorithm actively selects and promotes particular content, because that curation constitutes the platform’s own “expressive activity.” She pointed to a recent Supreme Court ruling, Moody v. NetChoice, which indicated that algorithms reflecting editorial choices may be viewed as the platform’s own expression, speech that falls outside the protections of Section 230.
Shwartz noted that TikTok might not have been liable had Nylah Anderson found the Blackout Challenge video through her own search; because TikTok’s algorithm instead recommended the video on her FYP, the platform acted as an “affirmative promoter” of dangerous content. This interpretation allows Anderson’s lawsuit to proceed, challenging the broad application of Section 230 in cases where platforms algorithmically promote harmful content.
The court’s decision to let the lawsuit proceed could have significant implications for TikTok and other social media companies. If TikTok is found liable, the case could establish a precedent for holding platforms accountable for the content their algorithms promote, especially when it results in harm or death. It may also push courts toward a narrower reading of Section 230, a point underscored by Circuit Judge Paul Matey, who wrote separately in support of the ruling.
Matey stressed that Section 230 should not be applied so broadly as to permit platforms to ignore the risks posed by their content recommendations. He argued that platforms have a duty to prevent the dissemination of harmful content, particularly when they are aware of its dangers. On Matey’s view, platforms like TikTok could be held responsible not only for hosting harmful content but also for specifically recommending it, especially when it is known to be hazardous to children.
The Future of the Anderson Case and TikTok’s Response
With the appeals court’s ruling, the case has been remanded to the district court for further proceedings. The district court must now determine which of Anderson’s claims can move forward under this narrower interpretation of Section 230. Anderson’s legal team has vowed to continue pursuing accountability, arguing that the Communications Decency Act was never intended to shield platforms from the consequences of promoting dangerous content to children.
TikTok, while reiterating its commitment to user safety, now faces the challenge of defending its algorithmic practices in court. The platform has previously stated that it removes dangerous content like the Blackout Challenge when identified, but this legal battle may force TikTok and other platforms to reconsider how their algorithms recommend content, particularly to younger users.
The appeals court’s ruling against TikTok marks a crucial moment in the ongoing legal and ethical debate over the responsibilities of social media platforms. By narrowing the broad protections of Section 230, the court has opened the door to holding platforms liable when their algorithms promote harmful content. As the lawsuit progresses, it could bring greater scrutiny of how platforms manage content and of their duty to protect users, especially children, from dangerous online trends.