In a move aimed at tackling the growing threat of online abuse, President Donald Trump has signed the Take It Down Act into law. The legislation requires online platforms to remove nonconsensual intimate images—such as revenge porn or AI-generated deepfakes—within 48 hours of receiving a valid request from the victim. If companies fail to comply, they could face fines of up to $50,000 per violation.
Supporters hail the law as a necessary step to protect victims whose private images are shared without permission, often leading to trauma, public shame, and long-term emotional distress. Major tech firms like Google, Meta, and Microsoft have backed the legislation, which is expected to go into effect within the coming year. Enforcement will be led by the Federal Trade Commission (FTC), which will monitor companies for compliance under its authority to address deceptive or unfair practices.
While the goal is clear—stopping harmful, explicit content from spreading online—critics worry the law could be misused or over-applied, leading to unintended censorship.
Global Momentum for Tackling Nonconsensual Content
The United States now joins a growing list of countries—including India—that have passed laws to address the rapid spread of sexually explicit or manipulated images online. The global push for faster removals is rooted in the reality that once harmful content goes live, it can quickly spread beyond control. Victims often struggle to get it taken down before irreversible damage is done.
Microsoft’s delayed response to a major deepfake case illustrates just how damaging slow action can be. The Take It Down Act aims to cut down on such delays by making prompt removal not just recommended but legally mandatory.
Concerns About Overreach and Abuse of Power
Despite the good intentions behind the bill, civil liberties advocates have raised serious concerns. The new law doesn’t include a clear appeals process for individuals whose content may be wrongfully taken down. Nor does it impose significant penalties for those who submit false or malicious takedown requests.
This has sparked fears that the system could be exploited by bad actors hoping to silence critics or competitors. Without a strong deterrent, experts say fraudsters who once abused copyright laws for these purposes may now turn to this new legal pathway.
The Take It Down Act is modeled loosely on the Digital Millennium Copyright Act (DMCA), which has long faced criticism for being easily manipulated. Under the DMCA, companies often remove content first and ask questions later to avoid legal liability—sometimes leading to the removal of legitimate material. Critics believe the new law could lead to a similar outcome.
Speed vs. Accuracy: A Risky Trade-Off
One of the law’s most debated elements is the 48-hour removal deadline. With such a short window to respond, many platforms may feel pressured to comply with every request—without doing the due diligence needed to confirm its legitimacy.
Google, for example, handles millions of copyright claims every year and has acknowledged that it often relies solely on the information provided by the requestor. Experts warn that this same approach could result in the over-removal of content under the new law.
Becca Branum, deputy director at the Center for Democracy and Technology, warns that platforms “have no incentive to make sure a takedown request is legitimate. It’s easier and cheaper to take the content down, even if it doesn’t actually violate the law.”
Privacy vs. Accessibility: A Delicate Balance
Another point of contention is identity verification. Some tech companies currently require victims to show government-issued ID to confirm they are the person depicted in the image. While that step helps prevent fraud, it can be burdensome—and even dangerous—for vulnerable individuals such as minors or abuse survivors.
The Take It Down Act does not mandate such verification, which makes the process more accessible but also opens the door for misuse. If companies make it too difficult to file a request, they risk FTC scrutiny for discouraging legitimate claims. If they make it too easy, they risk being exploited by those looking to take down content for illegitimate reasons.
Lawmakers Silent on Legal Gaps
The bipartisan effort behind the legislation was led by Senators Ted Cruz and Amy Klobuchar. Despite their leadership in pushing the bill through Congress with little resistance, both have remained quiet in the face of criticism that the law lacks essential safeguards.
The urgency to act stemmed from the real stories of victims, particularly teenagers, whose intimate images were shared online without their consent. Lawmakers say they hope the law will give future victims a fast and reliable way to reclaim their privacy.