Detecting and combating child sexual abuse material (CSAM) on OnlyFans presents a formidable challenge, according to investigators and experts. The platform, built on user-generated content, voluntarily reports suspected CSAM to the National Center for Missing & Exploited Children (NCMEC); as a UK-based company, it is not legally required to do so.
OnlyFans’ Efforts in Reporting and Safety
OnlyFans says it takes rigorous measures against CSAM, promptly removing detected content and notifying NCMEC’s CyberTipline. In 2023 alone, the platform reported 347 instances of suspected CSAM out of “hundreds of millions of posts.” A spokesperson cited these figures as evidence of the platform’s stringent safety controls, noting that much of the flagged material turns out to be duplicates or unrelated to CSAM. Independently verifying these claims is difficult, however, because of the platform’s decentralized structure, in which each creator’s content sits behind an individual paywall.
Challenges Faced by Law Enforcement
These paywalls pose significant hurdles for law enforcement agencies. Trey Amick of Magnet Forensics Inc. explains that, without a subscription, investigators can typically see only basic account details. Full access requires a formal request to OnlyFans, which then provides comprehensive information, including account content and messages. NCMEC can see behind these paywalls only for cases reported to its CyberTipline or linked to missing children; it has no ability to proactively monitor the platform.
In an effort to enhance transparency, OnlyFans appointed Michael Ward, a former U.S. Justice Department prosecutor, to monitor and evaluate its safety measures. The scope of Ward’s work remains unclear, however; he declined to confirm his involvement or discuss his findings.
Despite OnlyFans’ assurances, independent investigations, including one by Reuters based on documents from more than 250 major U.S. law enforcement agencies, have uncovered 30 confirmed instances of CSAM. This suggests the platform’s reported figures may understate the actual prevalence. While OnlyFans’ transparency reports show a decline in suspected CSAM incidents, questions persist about the efficacy of its detection and prevention measures.
OnlyFans requires prospective creators to complete a stringent identity verification process, including submitting a government-issued photo ID and bank details, which are checked through human review and age-estimation technology. The platform also scans content continuously and trains its moderators to swiftly identify and report any suspected CSAM.
Challenges in Prevention and Detection
Despite these measures, minors have evaded age verification. Reuters documented cases in which minors used an adult’s identification or took over existing accounts to circumvent detection. An OnlyFans spokesperson defended the company’s safety protocols, citing the low number of CSAM reports as evidence of their effectiveness. Because the platform does not allow anonymous posting and does not use end-to-end encryption, investigators and prosecutors can more readily obtain evidence.
OnlyFans invests significantly in CSAM detection, paying the Internet Watch Foundation (IWF) roughly $114,000 a year for its services. The partnership helps the platform identify known, previously catalogued CSAM, although detecting new, uncatalogued material remains a persistent challenge.
Impact on Victims and Calls for Accountability
The presence of CSAM on OnlyFans profoundly affects victims, as highlighted by the father of a 16-year-old victim, who spoke of the enduring trauma and called for greater accountability from platforms like OnlyFans. Despite these concerns, OnlyFans has yet to face legal consequences over CSAM incidents. Questions also remain about whether the platform puts financial gain ahead of CSAM detection, as it declined to comment on revenue from accounts involving minors or on the efficacy of its age-verification systems.