The U.S. House of Representatives voted 409-2 this week to pass the Take It Down Act, a bill that targets the growing threat of nonconsensual intimate images online — whether real or artificially generated. With the Senate already on board, the legislation is now heading to President Donald Trump’s desk, and he has signaled his full support.
“This is a bill I’ll gladly sign,” Trump said during a recent speech. “Frankly, I might even need it myself — no one gets treated worse online than I do.”
What the Bill Does
At its core, the Take It Down Act makes it a federal crime to share intimate images of someone without their consent — including those created with artificial intelligence, often referred to as deepfakes. It also requires social media platforms and websites to act fast: once such content is flagged, they must remove it within 48 hours.
This bill comes at a time when the internet is flooded with tools that make it easier than ever to create fake yet disturbingly realistic images. The dangers go beyond celebrity impersonations — young people, especially teenagers, have found themselves victims of AI-generated nudes and other forms of harassment, often with devastating consequences.
A Rare Bipartisan Victory
The strong vote in the House reflects growing alarm across party lines about how quickly AI and deepfake technology have outpaced current laws. The bill has garnered widespread support from Democrats and Republicans alike, along with advocacy groups and even some major tech companies.
First Lady Melania Trump has been an outspoken advocate for the bill, lending her voice to a broader movement to protect children and vulnerable individuals online.
Major tech players such as Google and Snap have applauded the legislation. “It’s a big step forward in protecting people from having their private images exploited online,” said Kent Walker, Google’s president of global affairs.
Digital Rights Groups Raise Red Flags
Despite its good intentions, not everyone is convinced the bill will help more than it harms. Several digital rights organizations worry that the law could open the door to abuse — not just by individuals, but by governments and platforms acting in bad faith.
The Cyber Civil Rights Initiative (CCRI), an organization founded to support victims of image-based sexual abuse, said it supports criminalizing the act of distributing nonconsensual images — but it cannot endorse the bill as a whole.
Their main concern? The takedown system could be manipulated. According to CCRI, the requirement to remove content within 48 hours may overwhelm platforms, pushing them to rely on faulty automated systems that bad actors can exploit.
Even more worrying to the group is the political angle. With Trump recently firing several Democratic members of the Federal Trade Commission, CCRI fears the bill could be selectively enforced, with politically aligned platforms receiving leniency while others are held to stricter standards.
Privacy May Be the Price
Another significant issue is the bill’s impact on privacy, particularly for platforms that use end-to-end encryption — like messaging apps and secure cloud services. These platforms can’t access user content by design, making it nearly impossible to comply with takedown orders without compromising user privacy.
The Electronic Frontier Foundation (EFF) issued a stark warning: platforms might respond by dropping encryption altogether just to stay legally compliant. "That would turn private, secure communications into monitored spaces," the group said. Such a shift could have chilling effects on users — especially survivors of abuse who rely on encrypted messaging to stay safe.
Smaller Platforms Could Struggle
While companies like Google and Snap have the infrastructure to handle takedown requests at scale, smaller or mid-sized platforms may face serious challenges.
Peter Chandler, executive director of the Internet Works coalition — which represents platforms like Reddit, Discord, and Etsy — welcomed the bill but acknowledged the strain it could place on smaller companies. “It empowers victims, but the technical burden is real,” he said.