More than 93,000 Americans died of drug overdoses in 2020, an increase of nearly 30% from 2019.
Amy Neville’s son died last year after taking a fentanyl-laced pill. Alex, 14, used the social media app Snapchat to contact a drug dealer and purchase the pills, she says. She has since protested outside Snap’s headquarters, and in a letter signed by six other parents, she has asked the company to form an external committee of law enforcement officials, parents, and public health experts to assess the company’s progress in addressing drug availability on its platform.
Snap sent the parents a letter stating that it was committed to stopping drug trafficking on its platform. But, Neville says, that’s not enough.
Federal officials agree. The US Drug Enforcement Administration issued a warning this week about the rise of counterfeit pills sold online that contain fentanyl, a synthetic opioid that can be deadly in small doses. In an interview, DEA Administrator Anne Milgram singled out Snapchat and TikTok, two apps popular among teens and young adults, saying the agency will approach social media firms with specific demands to fight sales.
However, it is unclear what those demands will be or how they would help. Illegal drug sales have been a problem on Facebook, Snapchat, TikTok, and other social media platforms for years. The companies have said repeatedly that they are trying to rid their platforms of drug sales by hiring more moderators, using artificial intelligence to detect illicit content, and restricting searches for drug-related terms. Yet prescription and other drugs remain readily available.
Sammy Chapman, 16, died in February after taking fentanyl-laced tablets he bought online. His parents, TV personality and therapist Laura Berman and Sam Chapman, say he met a dealer on Snapchat who advertised his services with a colorful menu. They are trying to persuade Congress to pass legislation requiring social media sites to integrate parental monitoring software, but privacy advocates have expressed concerns that this could harm children who do not want their parents to know about certain aspects of their lives, such as their sexuality.
Marc Berkman, CEO of the Organization for Social Media Safety, said the organization conducted an informal test and found that drug dealers could be identified on several social media platforms in under three minutes. And according to a study released in March by the Digital Citizens Alliance and the Coalition for a Safer Web, Facebook pages, Instagram profiles, and YouTube videos were used to promote narcotics to thousands of followers or viewers.
In a statement, Facebook spokesperson Avra Siegel said the company does not allow anyone to buy or sell drugs on its platforms. On Wednesday, she added, it will launch a series of public service announcements about opioid addiction in collaboration with the nonprofit Partnership to End Addiction.
In an email, Snap spokesperson Rachel Racusen said the company aggressively bans and combats drug-related activity and assists law enforcement in investigations. Teens who use the app are also shown videos about the risks of drugs.
In a statement, TikTok spokesperson Hilary McQuaide said the company also removes accounts that promote illicit drug sales, using both technology and human reviewers to find and assess infringing content. TikTok blocks searches for certain drug-related keywords, instead pointing users to the company’s rules. On Monday, after a Washington Post query about drug-related material, it redirected one such search term.