
TikTok's Troubling Trend: How Its Algorithm Pushes Suicide Content to Vulnerable Kids

TikTok's Algorithm Fails to Recognize the Death of Chase Nasca, Pushes Suicide Content to Vulnerable Kids

Over a year has passed since 16-year-old Chase Nasca took his own life, yet his TikTok account remains active. Scrolling through his For You feed still reveals a disturbing pattern: a constant stream of videos depicting unrequited love, hopelessness, and pain, some even glorifying suicide as an escape.

This content was appearing on his feed even in the days leading up to his death. A video of an oncoming train, with a caption about going for a walk to clear one's head, surfaced on his feed just days before he died. Nasca's parents had warned him about the dangers of the nearby train tracks, but he ultimately succumbed to the pull of the content TikTok's algorithm kept pushing at him. Shortly before his death, he sent a message to a friend: "I'm sorry. I can't take it anymore." His case raises serious concerns about the algorithm's failure to recognize harmful content and prevent its dissemination, especially to vulnerable children and teenagers.

It is impossible to determine with certainty why Nasca decided to end his life; suicide is often the result of multiple complex factors, and he did not leave behind a note. But two weeks after his death, Nasca's mother, Michelle, turned to his social media accounts for answers. When she opened the TikTok app on his iPad, she discovered more than 3,000 videos that Nasca had liked, saved, or tagged as favorites. She could see the topics he had searched for, including Batman, basketball, weightlifting, and motivational speeches. She also noticed that the algorithm had recommended many videos related to depression, hopelessness, and death, and that Nasca had engaged with them.

Concerns Grow Over TikTok's Popularity and Potential for Harm

Since its explosive rise to popularity in 2018, TikTok, a short-form video platform owned by Chinese internet company ByteDance Ltd., has drawn intense scrutiny over its impact on children. The algorithm behind its recommendation engine constantly feeds users captivating user-generated content, keeping them glued to their screens.

With an estimated 150 million American users, TikTok's popularity has prompted Silicon Valley competitors to try to replicate its success, while politicians have raised concerns about its potential use as a disinformation tool by the Chinese government. The Biden administration has threatened to ban TikTok unless ByteDance sells its stake in the app, a move the Trump administration previously proposed.

Child psychologists and researchers are increasingly alarmed as the political debate around TikTok continues. Surveys of teenagers have revealed a troubling correlation between social media use and depression, self-harm, and suicide. Data from the Centers for Disease Control and Prevention indicate that nearly 1 in 4 teens seriously considered suicide in 2021, almost double the rate from a decade earlier. Authorities such as the American Psychological Association have placed partial blame for these trends on social media.

Image credit: The Guardian

TikTok Faces Congressional Scrutiny and a Wrongful Death Lawsuit as Concerns Grow Over Social Media's Impact on Young Users

In a congressional hearing held in March, a representative brought up Nasca's death, showing TikTok CEO Shou Chew some of the clips the app had pushed to the boy and asking whether he would let his own children watch such content. That same month, Nasca's parents filed a wrongful death lawsuit in New York state court against TikTok, ByteDance, and the railroad. The case has further fueled the ongoing conversation about social media's impact on young users, and about TikTok in particular.

TikTok's Trust and Safety team, which operates across multiple locations including the US, Ireland, and Singapore, is responsible for designing the features and policies that keep users safe on the platform. Its work includes moderating the millions of videos uploaded daily and addressing safety issues such as content that sexualizes minors or encourages dangerous challenges. The team removes posts that violate the platform's standards and develops tools to help users filter out harmful material.

However, former employees, who spoke anonymously because of nondisclosure agreements, have expressed concerns about their limited influence over the algorithm that powers the For You feed. They say their requests for information about how the algorithm works were often ignored, leaving them ill-equipped to fully comprehend the underlying mechanisms of the app they were tasked with making safer. In their view, they were placed in a difficult position: responsible for making the app safer while lacking a deep understanding of how it fundamentally works.