TikTok rose to prominence on social media only recently. During the pandemic, its viral challenges became a regular, defining feature of the platform. Some are relatively harmless, like dance moves. Others, like the blackout challenge, in which children choke themselves until they pass out, are far riskier. The outcomes can be devastating when children can readily circumvent age restrictions and find them.
Children worldwide have been choking themselves with everyday objects until they pass out, recording the adrenaline rush they experience upon regaining consciousness, then publishing the videos on social media. It’s a contemporary version of choking dares, which have existed for years but are now spread to youngsters through powerful social media algorithms, reaching kids too young to understand the risk properly.
The game on TikTok is causing deaths
The same tragedy struck Arriani, a 9-year-old. A few days after Arriani had been laid to rest, her nails newly manicured, in a princess dress and tiara, her brother told their parents what had transpired. He said they had been playing a game they had seen on TikTok.
Arriani’s death wasn’t reported
Arriani’s passing went unreported by the media, and it took TikTok months to find out. But the company already knew that young people who weren’t old enough to create profiles on its app were participating in the blackout challenge and dying. TikTok’s trust and safety team, the group tasked with safeguarding users and upholding the company’s reputation, had started looking into a related case in Palermo, Sicily, in the weeks prior. In January, a 10-year-old girl named Antonella Sicomero was discovered with a bathrobe belt around her neck, hanging from a towel rack. She died while playing “an extreme game on TikTok,” her parents told the local media.
The Palermo prosecutor’s office launched an inquiry, and Italy’s privacy authority ordered the social network to block any user nationwide whose age it couldn’t confirm to be over 13, arguing that TikTok was breaking its own policy of keeping preteens off the app.
The firm claims the challenge was never a trend
The team claimed to have found no proof that TikTok’s algorithm had recommended the challenge to Antonella, and team members say senior executives were relieved by that. A crisis management plan was created to separate TikTok from the tragedy and portray it as a problem facing the entire industry: the challenge “had never been a trend” on the network, and users had discovered it “from sources other than TikTok.”
As more youngsters who should not be on social media have died, TikTok has spread the same message, most recently in a statement to Bloomberg Businessweek. According to news reports gathered by Businessweek, the blackout challenge has been connected to the deaths of at least 15 children aged 12 or younger in the past 18 months; at least five children aged 13 or 14 also died in that period. TikTok was prominently mentioned in headlines after the fatalities, but police agencies denied Freedom of Information Act requests for incident reports that might have shown which platform, if any, was at fault.
Moderators keep watch over the platform
US law prohibits social media sites from gathering information on users under 13, and TikTok says it complies: the app redirects underage users to a version where they see only curated content, don’t create a profile, and aren’t shown advertisements.
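TikTok’s actual implementation is not public, but the behavior described above amounts to a simple routing rule. The following is a minimal, purely illustrative sketch; the names and fields are hypothetical.

```python
# Hypothetical sketch of the age-gate routing the article describes.
# Not TikTok's real code: the Experience fields and route_user function
# are assumptions made for illustration only.

from dataclasses import dataclass


@dataclass
class Experience:
    curated_feed_only: bool   # restricted, pre-vetted content
    can_create_profile: bool  # under-13 users get no account or data collection
    show_ads: bool


def route_user(age: int) -> Experience:
    """Route a user to the full app or the under-13 curated version."""
    if age < 13:
        # COPPA-style restriction: curated content, no profile, no ads.
        return Experience(curated_feed_only=True,
                          can_create_profile=False,
                          show_ads=False)
    return Experience(curated_feed_only=False,
                      can_create_profile=True,
                      show_ads=True)


print(route_user(10))
print(route_user(15))
```

The limitation the article points to is visible even in this toy version: the gate depends entirely on the age the user reports, so a child who enters a false birthdate is routed to the full experience.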
Videos on TikTok are reviewed by a global army of around 40,000 moderators, three-quarters of whom are employed under contract. According to former workers, each watches roughly 1,000 videos daily, giving each one about 20 seconds of attention. They say the system is not designed to identify underage users. Artificial intelligence software scans every video uploaded (10 billion in the first quarter of this year alone), automatically removing content that contravenes a community standard, such as nudity or violence, and filtering the remaining flagged posts to moderators.
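The pipeline described by those former workers is a classic two-tier triage: an automated classifier removes clear violations outright and queues ambiguous cases for humans. Here is a minimal sketch of that flow, assuming a hypothetical violation score and thresholds; none of the names or numbers come from TikTok.

```python
# Illustrative two-tier moderation triage, modeled on the flow the article
# describes. The classifier score, thresholds, and queue are all assumptions.

from collections import deque
from dataclasses import dataclass
from typing import Deque


@dataclass
class Video:
    video_id: str
    violation_score: float  # stand-in for an ML model's P(policy violation)


# Assumed thresholds for illustration only.
AUTO_REMOVE = 0.95   # clear violations (e.g. nudity, violence) removed by AI
NEEDS_HUMAN = 0.50   # ambiguous cases filtered to human moderators

human_queue: Deque[Video] = deque()  # drained by the ~40,000 moderators


def triage(video: Video) -> str:
    """Scan one upload and decide: remove, queue for review, or publish."""
    if video.violation_score >= AUTO_REMOVE:
        return "removed"
    if video.violation_score >= NEEDS_HUMAN:
        human_queue.append(video)       # a moderator gets ~20 seconds to decide
        return "queued_for_review"
    return "published"


for v in [Video("a", 0.98), Video("b", 0.70), Video("c", 0.10)]:
    print(v.video_id, triage(v))
```

The 20-second figure follows from the article’s own arithmetic: 1,000 videos a day at 20 seconds each is about 5.5 hours of continuous viewing, roughly a contractor’s reviewing shift. Nothing in this flow inspects who uploaded or is watching a video, which is consistent with the former workers’ claim that the system is not built to catch underage users.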