Last October, 14-year-old Elliston Berry experienced a nightmare no teenager should face. She woke up to an avalanche of calls and messages informing her that fake nude images of her were circulating on Snapchat and other social media platforms. These images were not real—they were AI-generated deepfakes, an unsettling form of digital harassment.
“I was told it went around the whole school,” Berry, now 15, told Fox News. “It was terrifying going through classes and attending school because just the fear of everyone seeing these images created so much anxiety.”
The Deepening Crisis of Deepfakes
Deepfakes, hyper-realistic AI-generated images and videos, have become increasingly common. Initially used to impersonate public figures or create fake pornography, these images now threaten ordinary people. Berry’s ordeal is a grim illustration of this growing menace.
After discovering the deepfakes, Berry immediately informed her parents. Her mother, Anna McAdams, recognized the images as fake and contacted Snapchat repeatedly over an eight-month period to have them removed. While the images were eventually taken down, the classmate who distributed them faced minimal consequences. “This kid is only getting a bit of probation, and when he turns 18, his record will be expunged,” McAdams told CNN.
This incident exposes a significant gap in the legal system’s ability to address the damage caused by deepfakes. Berry and her mother are now urging lawmakers to impose stricter penalties on perpetrators of such digital abuse.
Legislative Efforts to Curb Deepfake Abuse
Recently, Senators Ted Cruz (R-TX) and Amy Klobuchar (D-MN), along with several colleagues, introduced the Take It Down Act. This legislation would require social media companies to remove deepfake pornography within two days of receiving a report and make it a felony to distribute such images. Offenders targeting adults could face up to two years in prison, while those targeting minors could face three years.
“What happened to Berry is a sick and twisted pattern that is getting more common,” Cruz told Fox News. He emphasized that the bill places a legal obligation on tech companies to act swiftly when victims or their families report such abuses. Cruz criticized Snapchat’s handling of Berry’s case, pointing out the platform’s prolonged inaction despite multiple requests for image removal.
“If this law had been in place, those pictures would have been taken down within 48 hours, and the perpetrator could be looking at three years in jail,” McAdams said to CNN. Although the photos are now off Snapchat, Berry remains fearful of their potential reappearance. “I still live in fear that these photos might resurface,” Berry said. She hopes the bill’s passage will bring her peace of mind by ensuring serious consequences for those who spread deepfakes.
The Broader Implications and Future of Deepfake Legislation
Berry’s case is not isolated. At Issaquah High School in Washington, boys used an app to create fake nudes of girls who attended a school dance, according to the New York Times. Similar incidents occurred at Westfield High School in New Jersey and in schools across California and Illinois.
Advocates, including some teens, are pushing for laws that penalize the creation and distribution of deepfake nudes. States like Washington, South Dakota, and Louisiana have enacted laws criminalizing such images, with more efforts underway in California and other states. At the federal level, Rep. Joseph Morelle (D-NY) has reintroduced a bill that would make sharing these images a federal crime.
Francesca Mani, a 15-year-old Westfield student whose deepfake image was shared, has been advocating for legislative change. “I got super angry, and enough was enough,” she told Vox. Mani’s advocacy reflects a growing movement among young people to combat digital abuse.
While supporters argue these laws are crucial for student safety, some experts question whether they go far enough. Amy Hasinoff, a communications professor at the University of Colorado Denver, expressed skepticism about the effectiveness of such legislation, given the criminal justice system's poor track record in addressing other sex crimes. Hasinoff and others recommend stricter regulation of the apps used to create deepfakes, including requirements that they incorporate consent verification mechanisms.
Revenge porn has long been a problem, but deepfake technology exacerbates it by making it easier to create fake explicit images of anyone. Britt Paris, an assistant professor at Rutgers, highlighted the lasting impacts of deepfakes, noting they can have real and enduring consequences on victims’ lives.
A 2021 study of respondents in the UK, New Zealand, and Australia found that 14 percent of those aged 16 to 64 had been victimized by deepfake imagery. The psychological impact of such experiences is profound, often causing emotional distress and long-term anxiety.
As lawmakers and advocates push for stronger protections against deepfake abuse, the hope is that future victims will have better recourse and perpetrators will face meaningful consequences for their actions.