Navigating the TAKE IT DOWN Act in Litigation

The recently enacted TAKE IT DOWN Act is a major step against “revenge porn” and AI-generated sexual imagery online. Formally titled the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, the Act was passed with near-unanimous bipartisan support (a rare feat these days) and signed into law in May 2025. The law’s primary goal is to curb the spread of non-consensual intimate images (including realistic deepfakes) and to require swift removal of such content from online platforms. This post summarizes what the Act does and, importantly, how lawyers can strategically use it in litigation – even though the Act itself provides no private right of action for victims.

What the TAKE IT DOWN Act Does (in a Nutshell)

Criminalizes Non-Consensual Intimate Images:

The Act makes it a federal crime to knowingly publish or share intimate visual depictions of someone without their consent, if the person depicted had a reasonable expectation of privacy. This targets so-called “revenge porn” – sharing private nude or sexual images without permission – as well as AI-manipulated nudes or deepfake pornography of real people. For adults, violators can face up to 2 years in prison, and if the images involve a minor, up to 3 years. The law also covers threats to disclose such images, not just actual publication. Importantly, consent to create or share an image privately is not consent to publish it publicly – the Act explicitly says prior consent or voluntary sharing with someone does not excuse a subsequent public posting.

“Covered Platforms” Must Take It Down Quickly:

The Act imposes a new notice-and-takedown regime for online services. Any website, app, or online service that hosts user-generated content and serves the public is likely a “covered platform.” This includes social media sites, forums, image hosts, etc., but excludes services like ISPs, email, or platforms that mainly host their own content with only incidental user interaction. By May 2026 (one year from enactment), every covered platform must establish a clear and easy process for people to report NCII (non-consensual intimate images) and request removal. The law requires platforms to provide a “clear and conspicuous” notice on their site explaining how to report intimate images and what the platform’s duties are in plain language.

Once a valid report is made by a victim or their agent (it must be in writing, signed, identify the content, and include a statement that the depiction is non-consensual), the platform has 48 hours to remove the content and must make reasonable efforts to remove any duplicate copies of that image on the service. In other words, if the same picture has been reposted by others or appears elsewhere on the site, those copies should come down too.
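
For platforms building intake tooling, the statutory elements of a valid request map naturally onto a simple structured record. The Python sketch below is illustrative only: the field names, and the inclusion of contact details (which a platform would need in practice but which the paragraph above does not list), are our assumptions, not terms prescribed by the Act.

```python
# Minimal sketch of capturing a removal request's required elements.
# Field names are hypothetical; the Act does not prescribe a data format.
from dataclasses import dataclass

@dataclass
class RemovalRequest:
    requester_name: str         # victim or authorized agent
    signature: str              # physical or electronic signature
    content_url: str            # identifies the intimate visual depiction
    non_consent_statement: str  # statement that the depiction is non-consensual
    contact_info: str           # assumed field so the platform can follow up

    def is_complete(self) -> bool:
        # A request missing a required element should be bounced back to the
        # requester for completion, not silently ignored.
        return all([self.requester_name, self.signature, self.content_url,
                    self.non_consent_statement, self.contact_info])

# Example use
req = RemovalRequest("Jane Doe", "/s/ Jane Doe", "https://example.com/post/123",
                     "I did not consent to this image being published.",
                     "jane@example.com")
assert req.is_complete()
```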

Liability Safe Harbor for Good Actors:

To encourage cooperation, the Act grants platforms a safe harbor from liability for removing reported content in good faith. This means if a website takes down a user’s post because it was an intimate image reported under the Act, the poster can’t sue the site for taking it down. (This is akin to the existing protections in Section 230 of the Communications Decency Act for voluntary content removal, now made explicit for intimate images.) Essentially, the law wants platforms to err on the side of removal without fear of being sued by the person who uploaded the content.

Enforcement by the FTC, Not by Private Lawsuits:

Notably, the TAKE IT DOWN Act does not give victims the right to sue websites or perpetrators directly under this law. Instead, it relies on government enforcement. Criminal violations (someone posting or threatening intimate images without consent) will be prosecuted by the Department of Justice. And for the platform takedown requirements, the Act empowers the Federal Trade Commission to enforce compliance, treating a platform’s failure to maintain the required removal process or timeline as an “unfair or deceptive act or practice” under the FTC Act. Put simply, a company that doesn’t set up the reporting system or ignores valid takedown requests could face an FTC investigation, fines, and penalties – but an individual victim cannot directly sue the company under the Act. This enforcement design is similar to other consumer protection laws (like data privacy rules) where the FTC steps in, rather than private lawsuits, to ensure companies follow the rules.

No Private Right of Action – How Can Victims Leverage This Law?

The TAKE IT DOWN Act provides no private cause of action. Unfortunately, you cannot file a lawsuit “under the TAKE IT DOWN Act” against a website or an ex-partner, because Congress chose to make enforcement the job of prosecutors and the FTC. This is similar to other laws like certain federal privacy statutes or HIPAA – they set standards, but they don’t explicitly allow a private lawsuit for violations.

However, the absence of a direct cause of action doesn’t mean the Act is useless in civil litigation. Creative attorneys can still wield this law as a tool by incorporating it into other legal claims. Think of the TAKE IT DOWN Act as setting a new benchmark for lawful behavior and industry responsibility. Even if a victim can’t say “the Act gives me a right to sue you,” they can say “you violated a law that was designed to protect people like me – and I’m using that fact to strengthen my case under other legal theories.”

In practice, we foresee a few strategic ways to leverage the Act for victims’ justice:

  • Negligence Per Se: This is a doctrine in tort law under which violating a statute or regulation meant to protect certain people can be treated as automatically negligent. In plainer terms, negligence per se means the court uses the statute as the standard of care – if the defendant broke that law, they are presumed to have breached their duty of care to the plaintiff. Here, the TAKE IT DOWN Act could serve as the basis for a negligence per se argument. For example, suppose an ex-boyfriend posts his former girlfriend’s private explicit photo online. He has clearly violated the Act’s criminal prohibition. A plaintiff’s lawyer could sue him for civil damages (emotional distress, etc.) under common law tort theories like invasion of privacy or intentional infliction of emotional distress – and argue that his violation of the federal law constitutes negligence per se. The Act was expressly designed to prevent the harm the victim suffered (psychological, reputational, and financial harm from non-consensual exposure), and the victim is exactly in the class of people the law aims to protect. Many courts would allow a jury to be instructed that breaking this law is evidence of negligence; in some jurisdictions, the violation might even be treated as conclusive proof of negligence unless the defendant can excuse it.

  • Evidence of Standard of Care: Even if a court is hesitant to apply negligence per se (some courts are cautious when the statute itself provides no private right, reading that as legislative intent that it not be enforced through private litigation), the Act still serves as strong evidence of what a reasonable standard of care is in situations involving intimate content. For instance, consider a website or platform that dragged its feet in removing an intimate video after the victim’s request, and the victim sues the platform under some legal theory (perhaps a state law claim, or maybe the platform had made contractual promises to remove such content). Normally, Section 230 of the Communications Decency Act shields major platforms from liability for user-posted content, but there are edge cases, smaller sites, or specific factual scenarios where a platform might still be on the hook. In those cases, the 48-hour removal rule in the Act sets a benchmark: if a platform took weeks to address a known intimate image posting, a lawyer will highlight that Congress expects such content to be gone in two days or less. That can be used to argue the platform breached its duty of care by acting unreasonably slowly. Similarly, if a platform had no clear reporting mechanism at all, that violates the Act’s requirements – a compelling indication that the platform fell below the standard of care that a prudent company should meet in 2025. In essence, the Act’s provisions can guide courts in understanding what “reasonable” behavior looks like in the context of online sexual privacy. We have seen this dynamic with other laws: for example, while there is no private lawsuit under HIPAA, courts have accepted HIPAA rules as the standard by which to measure a healthcare provider’s negligence in protecting patient privacy. We can expect the TAKE IT DOWN Act to play a similar role: a statutory yardstick for measuring negligence in the digital privacy realm.

  • Bolstering Other Claims and Statutes: If a victim is suing under other legal theories or statutes, the Act can be cited to bolster those claims. One key example is the Violence Against Women Act (VAWA). Unbeknownst to many, the 2022 reauthorization of VAWA quietly created a federal civil cause of action for victims of non-consensual pornography, now codified at 15 U.S.C. § 6851. That VAWA provision (which took effect in late 2022) allows victims to sue perpetrators in federal court for disclosing intimate images without consent, provided there’s an interstate commerce nexus (which online postings usually satisfy). Under that VAWA civil provision, victims can recover actual damages or liquidated damages of $150,000 and obtain injunctions to stop further distribution. If you bring a claim under 15 U.S.C. § 6851 against a perpetrator, referencing the TAKE IT DOWN Act can strengthen your case: it shows that Congress (again, in 2025) affirmed the wrongfulness of the same conduct by criminalizing it. It’s a one-two punch – VAWA gives the private right to sue the individual, and the TAKE IT DOWN Act underscores that the behavior violates public policy and even criminal law.

  • Enhancing Claims of Emotional Distress or Privacy Torts: In state courts, victims rely on common-law causes of action like intrusion upon seclusion, public disclosure of private facts, and intentional infliction of emotional distress (IIED), or on specific state revenge porn statutes where they exist. The existence of the TAKE IT DOWN Act can be cited to emphasize the egregiousness of the defendant’s conduct. For example, in an IIED claim, one element is often that the conduct was “outrageous” or intolerable in a civilized community. Pointing out that federal law now treats this conduct as a serious crime helps convince the court that yes – this is outrageous conduct by definition. It can also help in arguing for punitive damages: the defendant’s violation of a federal criminal statute shows a degree of willfulness or recklessness that justifies punishing them civilly. In addition, lawyers can borrow the definitions from the Act (such as how the Act defines “consent” or “intimate visual depiction”) to frame the issues clearly and show the court that these definitions have legislative recognition.

  • Negligence Per Se for Platforms? A question we often hear is whether an online platform’s failure to meet the Act’s requirements could be negligence per se in a suit by a victim. This is tricky because of Section 230 immunity – in general, you cannot sue a platform for damages for leaving someone else’s harmful content up (that’s treating the platform as the “publisher,” which 47 U.S.C. § 230 shields). The TAKE IT DOWN Act does not create a Section 230 exception for civil suits by victims; it leaves enforcement to the FTC. So a direct negligence claim against, say, a big social media company for not removing your photos in 48 hours will likely be barred by federal immunity. That said, there may be scenarios involving platforms not covered by Section 230 (perhaps because they in some way developed or encouraged the content, or smaller entities that aren’t considered “interactive computer services”) where a negligence per se theory might be attempted. Even if such cases are rare, the Act’s standards could still come into play indirectly.

Connecting the Act to VAWA and Digital Abuse Claims

The Violence Against Women Act (VAWA) has long been a centerpiece in the fight against gender-based violence, and it has increasingly recognized technologically-facilitated abuse as part of that fight. The TAKE IT DOWN Act complements this trend and offers new avenues to assist victims of coercion, emotional abuse, or digital exploitation in intimate contexts.

Consider scenarios of coercive control: an abusive partner threatens, “If you leave me, I will post your nudes online,” or actually shares intimate images to shame and control the victim. This is both a form of psychological abuse and sexual exploitation. Under VAWA and many state domestic violence laws, such conduct can be domestic abuse, even if it’s not physical. Courts and legislatures are increasingly recognizing “image-based sexual abuse” as a tactic of domestic violence. Now, with the TAKE IT DOWN Act in place, survivors and their attorneys have additional support to address these situations:

  • Civil Remedy via VAWA’s NCII Provision: VAWA 2022 created a direct civil claim for non-consensual intimate image (NCII) cases. A creative litigator dealing with a revenge porn scenario in a domestic violence context should invoke this VAWA provision. When doing so, the TAKE IT DOWN Act can be cited in the complaint to demonstrate that Congress has reinforced the policy against NCII. It sends a signal that the defendant’s actions aren’t just harmful to the victim; they are broadly condemned by society and lawmakers. If the case involves interstate elements (most online cases do), one could even invoke VAWA’s other provisions – for example, if the abuse involved stalking or threats across state lines, there are federal stalking laws (18 U.S.C. § 2261A) that criminalize using electronic means to cause substantial emotional distress.

  • Coercion as Extreme Cruelty: In immigration law or certain family law contexts, “coercion and extreme emotional abuse” are grounds for relief (for instance, VAWA self-petitions for immigrant survivors require showing battery or “extreme cruelty”). An abuser distributing or threatening to distribute intimate images is a textbook example of extreme emotional cruelty. Lawyers can use the TAKE IT DOWN Act’s existence to argue that such behavior is so beyond the pale that it’s literally criminal. This could support, say, a VAWA cancellation of removal case or a custody case where one parent engaged in this abuse and the other is seeking sole custody for the child’s safety.

  • Emotional Distress and Evidence of Harm: The TAKE IT DOWN Act recognizes harm beyond physical – including psychological trauma and reputational damage. In a lawsuit under VAWA or any related claim, one must often prove the harm suffered. Here, we can use the legislative findings or context of the Act to support our damages arguments. Congressional discussion around the Act (and studies cited in its support) document severe emotional and social consequences for victims of intimate image abuse – such as depression, PTSD, lost jobs, humiliation, etc.

  • Integrated Claims Strategy: A creative litigator might plead multiple causes: for instance, a count under VAWA’s civil NCII statute against the perpetrator, a count for intentional infliction of emotional distress, and even a negligence per se count as discussed earlier. The negligence per se could be predicated on the criminal law violation (the TAKE IT DOWN Act’s criminal part) or perhaps on violating a state statute if one exists. Meanwhile, if any third parties were involved – say a website that refused to remove the content or an accomplice who helped spread it – one could bring state law claims against them and use the Act’s requirements to show they breached duties. In doing all this, you tie in VAWA concepts: for example, label the conduct as “sexual abuse and domestic violence” in the pleading, not just “privacy invasion,” to frame it within VAWA’s purpose. Many jurors (and judges) respond more strongly when they see the behavior for what it is: a form of violence and control, not a mere prank or invasion of privacy. The Act’s language about intent to harm, humiliate, or degrade aligns with domestic abuse – it’s often exactly about humiliation and control.

Compliance and Enforcement: Practical Points for Companies

It’s worth noting what the TAKE IT DOWN Act means for online platforms and businesses, since proactive compliance can both prevent harm and reduce legal exposure. Here’s a streamlined cheat-sheet for practitioners advising companies (or in-house counsel at those companies):

  • Determine if you’re a “Covered Platform”: If your service allows users to post content – be it images, videos, messages, etc. – open to the public, you’re likely covered. Exceptions are narrow (pure ISP services, one-to-one private communications like email, or platforms with only incidental user posts). Most social apps, forums, and content-sharing sites must comply.

  • Implement a Takedown Process (Deadline: May 19, 2026): Create a mechanism (web form, email, etc.) for people to report non-consensual intimate images. Importantly, train staff and build procedures to act on these requests within 48 hours of receipt. The 48-hour clock runs in real time, not business days, so there must be an on-call system to handle requests even on weekends and holidays. Document everything – keep records of requests and your removals.

  • Remove and Stay Down: When a report comes in, remove the specific image quickly and use tools (like hash matching or image recognition) to find any identical copies on your service and remove those too. This “take down and stay down” approach may require hashing reported images and automatically blocking re-uploads of the same file (see the sketch after this list).

  • Post Clear Notices for Users: The Act mandates that platforms publish an easy-to-understand notice about the removal process. In practice, this could be a dedicated page or FAQ explaining “If you see an intimate image of yourself posted without consent, here’s how to report it and get it removed.”

  • Good Faith Safe Harbor: The silver lining for companies is that removing content in compliance with a takedown request will shield you from lawsuits by disgruntled users who posted the content. Nonetheless, be prepared for users to complain or appeal – it’s wise to have an internal appeals process for takedowns in case someone claims a mistake (similar to a DMCA counternotice system). Although the Act doesn’t explicitly require an appeals process for the poster, being fair and double-checking claims can help avoid bad press or user outrage. Just remember: the priority is protecting victims, and the law strongly favors removal (even at some risk of “over-removal”).

  • FTC Enforcement and Penalties: Non-compliance can be treated by the FTC as an unfair or deceptive act or practice. For companies, that could mean hefty fines, consent decrees, injunctions, and ongoing monitoring. The FTC will likely expect to see that you took “reasonable steps” to comply – so if you can show you’ve built the required system and acted in good faith, you’ll be far better off if there’s ever an inquiry.
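
To make “take down and stay down” concrete, here is a minimal Python sketch of one way a platform might start the 48-hour clock on a valid request and screen future uploads against previously reported images. All names (TakedownRegistry, record_request, and so on) are hypothetical; the Act does not prescribe any implementation, and a production system would likely add perceptual hashing to catch re-encoded or edited copies rather than only exact duplicates.

```python
# Illustrative "take down and stay down" workflow (in-memory, exact-match only).
import hashlib
from datetime import datetime, timedelta, timezone

REMOVAL_WINDOW = timedelta(hours=48)  # statutory deadline after a valid request

class TakedownRegistry:
    """Tracks valid removal requests and blocks exact re-uploads of reported files."""

    def __init__(self) -> None:
        self.blocked_hashes: set[str] = set()        # fingerprints of reported images
        self.received_at: dict[str, datetime] = {}   # fingerprint -> time request received

    @staticmethod
    def fingerprint(image_bytes: bytes) -> str:
        # Exact-match fingerprint; a perceptual hash would also catch lightly edited copies.
        return hashlib.sha256(image_bytes).hexdigest()

    def record_request(self, image_bytes: bytes) -> str:
        """Log a valid request and start the 48-hour removal clock."""
        h = self.fingerprint(image_bytes)
        self.blocked_hashes.add(h)
        self.received_at[h] = datetime.now(timezone.utc)
        return h

    def removal_deadline(self, h: str) -> datetime:
        return self.received_at[h] + REMOVAL_WINDOW

    def is_blocked_upload(self, image_bytes: bytes) -> bool:
        """Screen new uploads against previously reported images ('stay down')."""
        return self.fingerprint(image_bytes) in self.blocked_hashes

# Example use
registry = TakedownRegistry()
reported = b"raw bytes of the reported image"  # placeholder content
h = registry.record_request(reported)
print("Remove by:", registry.removal_deadline(h).isoformat())
print("Re-upload blocked?", registry.is_blocked_upload(reported))
```

Exact hashing is only a baseline; in practice platforms tend to pair it with perceptual matching and a human review queue, since the law’s design tolerates some over-removal far more readily than it tolerates leaving reported content up.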

In practice, implementing the TAKE IT DOWN Act’s mandates will require some investment (training, possibly new software for image tracking, legal counseling to set up policies). Practitioners should help their clients see the big picture: compliance protects users, keeps the company out of trouble, and reduces the likelihood of being the next headline or test case.

Conclusion: Turning a “Swordless” Law into a Shield (and a Sword)

In summary:

  • The Act makes clear that those who engage in non-consensual sharing of intimate images (or create vile deepfakes) are committing a federal crime (and face civil exposure under companion statutes like VAWA), and that online platforms are now obligated to respond rapidly to protect victims.

  • Litigation strategy can incorporate this new law by using its violation as evidence of negligence or wrongdoing (negligence per se, standard of care, etc.), by piggybacking on VAWA’s civil remedies and other statutes, and by framing perpetrators’ actions as part of a broader pattern of coercion and abuse that the law recognizes.

  • VAWA connections mean victims of domestic or intimate partner violence have another facet of abuse they can name and address.

  • Compliance for businesses isn’t just a legal checkbox – it’s part of the new standard of care in the industry. Companies that lag behind may not only face FTC action but also lose the trust of users and find themselves indirectly embroiled in litigation when plaintiffs highlight their failures.

Dynamis is an elite litigation boutique with offices in Boston, New York, and Miami. Dynamis can help those who have been victims of revenge porn or other humiliating and unlawful practices. Contact Eric Rosen to schedule a consultation.
