Fabricated Images, Real Harm: The DEFIANCE Act and Federal Civil Remedies for Deepfake Pornography

by Kennedy Aikey, Associate Member, University of Cincinnati Law Review Vol. 94

I. Introduction

Advances in Artificial Intelligence (“AI”) have transformed the way digital images and videos are created, edited, and shared.1Zilana Lee, Note & Comment, Unveiling the Underbelly of Artificial Intelligence: The Inadequacies of the Legal System with Regard to Victims of Nonconsensual Sexual Deepfakes, 33 J.L. & Pol’y 182, 187 (2025). Among the most troubling consequences of these developments is the rise of nonconsensual synthetic intimate imagery, commonly referred to as deepfake pornography.2Grace Mohlin, Comment, Synthetic Images, Authentic Harms: A Definitional Approach to Criminal NSII “Deepfake” Statutes, 60 Wake Forest L. Rev. 921, 923-24 (2025). What once required specialized skill can now be accomplished in minutes with widely available AI tools, allowing users to generate highly realistic, explicit images of real people without their consent.3Lee, supra note 1, at 188. This results in harm that ranges from reputational damage and emotional distress to long-term professional and personal consequences.

Recent high-profile incidents have drawn national attention to the issue, highlighting both the problem and the inadequacy of existing legal remedies.4Mohlin, supra note 2, at 926-27. While some states have enacted laws addressing nonconsensual intimate imagery, the legal landscape remains inconsistent, leaving many survivors without a meaningful remedy.5Id. at 927-28. In response to these challenges, Congress has begun developing a federal framework to address deepfake abuse.6Id. First, Congress passed the TAKE IT DOWN Act, which federally criminalizes the knowing distribution of nonconsensual intimate imagery and requires platforms to remove such content.7Take It Down Act, RAINN (Jan. 12, 2026), https://rainn.org/federal-legislation/take-it-down-act/. However, removal alone is not enough. Congress is therefore now attempting to pass the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2025 (the “DEFIANCE Act”), which seeks to fill this gap by creating a federal civil cause of action for survivors of explicit deepfake imagery, ensuring they receive the recourse they deserve through monetary damages.8Ben Colman, The DEFIANCE Act Just Passed the Senate. Here’s What It Means for Survivors of Deepfake Abuse, Reality Defender (Jan. 14, 2026), https://www.realitydefender.com/insights/senate-passes-the-defiance-act [https://perma.cc/RQ65-R9ZQ].

This Article explains the importance of enacting the DEFIANCE Act and the issues survivors may still face after its enactment. Part II provides background information on AI deepfake explicit images and how the DEFIANCE Act seeks to provide a remedy to victims of such images. Part III advocates for the passage of the DEFIANCE Act and addresses potential solutions to the problems survivors may face regarding the anonymity of the creators of these explicit deepfake images. Part IV briefly concludes and foreshadows Congress’s next steps regarding the DEFIANCE Act.

II. Background

Deepfake pornography is an increasingly prevalent and concerning phenomenon. This section will provide a brief background on this issue. Part A defines what constitutes a deepfake and explains its historical development. Part B examines some of the individuals and communities impacted by deepfake explicit imagery. Part C introduces the DEFIANCE Act, explaining its purpose, scope, and current status in Congress. Finally, Part D analyzes the potential challenges survivors may encounter when attempting to use the DEFIANCE Act to bring a cause of action against their perpetrators.

A. Defining Deepfake Pornography and Non-Consensual Synthetic Intimate Imagery

Non-consensual Intimate Imagery (“NCII”) refers to any situation in which intimate images are produced, published, or reproduced without consent.9Mohlin, supra note 2, at 923. However, when NCII is altered with technology to no longer represent actual events, it becomes non-consensual synthetic intimate imagery, which is known as “deepfake pornography.”10Id. Although synthetic media has been around for decades, the emergence of AI has increased the prevalence of deepfake pornography.11Id. at 924.

Deepfakes are a specific type of advanced synthetic media that utilizes deep learning.12Id. at 925. The term originated from combining the two phrases “deep learning” and “fake.”13Id. The term became widely used after a Reddit user in 2017 introduced the idea of using AI software to superimpose celebrities’ faces onto the bodies of people in preexisting explicit videos.14Id. Although the term deepfake includes both explicit and non-explicit videos, the reality is that 98% of all deepfakes online are explicit.15Id. at 927. At least 90% of these explicit videos are non-consensual.16Id.

Deepfakes come in three different forms: face-swapping, voice cloning, and synthetic media generation.17Lee, supra note 1, at 187. Face-swapping superimposes one person’s face onto another person’s body.18Id. Voice cloning uses AI to analyze patterns in a person’s speech and then replicates those patterns to make the person appear to say anything the user wants.19Id. at 188. Software for face-swapping and voice cloning is becoming readily accessible to the public online.20Id. The final form of deepfakes, synthetic media generation, produces entirely artificial content, as the entire image or video is generated by AI.21Id.

B. From Celebrities to Classrooms: The Victims of Deepfake Pornography

Deepfakes and NCII were pushed to the forefront of pop culture when explicit images of Taylor Swift flooded the social media platform X in 2024.22Mohlin, supra note 2, at 926. The pictures were likely created with DALL-E, a free image generator available through Microsoft Designer.23Id. Once these images were posted online, they spread quickly, garnering over forty-seven million views before X intervened and suspended the account responsible for posting the content.24Lee, supra note 1, at 183. Despite the images being fabricated, Taylor Swift suffered reputational damage and career setbacks, particularly because a large portion of her audience consists of young girls and families.25Id. at 183–84. The creation of explicit deepfake images and videos is not only happening to public figures; the tragedy is affecting people of all ages and walks of life, with a disproportionate impact on women and young girls.26Id. at 185. A staggering 96% of deepfake videos are of women, with many of these videos depicting nonconsensual nude images of underage girls.27Id.

AI platforms are so widely used and available today that creating a deepfake does not require any specialized technical training.28Id. at 188. Unfortunately, teenagers across the country are accessing deepfake AI image-creation tools and using them to generate nonconsensual nude images of their peers and classmates.29Id. In 2024, the Center for Democracy and Technology found that 15% of high schoolers had heard about a deepfake depicting someone associated with their school in a sexually explicit manner.30Id. at 190. As AI has become more integrated into everyday life, the number of teenagers affected by deepfake pornography has only grown; a 2025 study found that one in eight teens know someone who has been targeted by deepfake nudes.31Deepfake Nudes Are a Harmful Reality for Youth: New Research from Thorn, Thorn (Mar. 3, 2025), https://www.thorn.org/blog/deepfake-nudes-are-a-harmful-reality-for-youth-new-research-from-thorn/ [https://perma.cc/YXV5-GY8E].

The effects of nonconsensual explicit deepfake images can be detrimental for all victims, from celebrities to high school girls. Using deepfakes to create nonconsensual explicit images of someone is a form of image-based sexual abuse.32Lee, supra note 1, at 190. Victims of deepfakes suffer both psychological and physical harm, and the distribution of these images and videos can cause ongoing humiliation and harm to their reputation and future.33Id. at 191.

C. The DEFIANCE Act

In early January 2026, the DEFIANCE Act passed unanimously in the U.S. Senate; however, it has not yet passed in the House of Representatives.34Colman, supra note 8. If the DEFIANCE Act is passed in the House, it would establish a federal civil right of action for survivors of explicit deepfake images and videos.35Id. Without the DEFIANCE Act, the legal framework for victims to sue deepfake creators is currently inconsistent across states and varies in effectiveness.36Id.

This Act would codify the term “intimate digital forgery,” which would cover visual depictions created by software, AI, or machine learning that are “indistinguishable from an authentic visual depiction” to a reasonable person.37Id. The Act would allow survivors to sue for liquidated damages of up to $150,000, or up to $250,000 if the deepfake is linked to sexual assault, stalking, or harassment.38Id. Further, the Act would allow all plaintiffs to use pseudonyms throughout their cases, given the sensitive nature of the proceedings and the need to protect the victim’s identity.39Id. The Act includes a ten-year statute of limitations that begins when the victim discovers the violation or turns eighteen.40Id.

The DEFIANCE Act is complemented by the recent passage of the TAKE IT DOWN Act.41RAINN, Take It Down Act, supra note 7. The Tools to Address Known Exploitation by Immobilizing Technological Deepfakes On Websites and Networks (“TAKE IT DOWN”) Act criminalizes NCII and makes it a federal crime to knowingly share or threaten to share NCII, including AI-generated images.42Id. The TAKE IT DOWN Act was passed by Congress in May of 2025 with bipartisan support.43Id. It requires websites and online platforms to remove NCII within 48 hours of a survivor’s verified request.44Id. The TAKE IT DOWN Act is also designed to respect the First Amendment, as it targets only knowing publications of NCII and includes protections for journalistic, artistic, and otherwise lawful speech.45Id.

The DEFIANCE Act was originally introduced in January of 2024 and passed in the Senate in July of 2024.46Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act), S. 3696, 118th Cong. (2024), https://www.congress.gov/bill/118th-congress/senate-bill/3696/all-info?s=6&r=1. However, the bill never passed in the House of Representatives and subsequently died.47Id. Representative Alexandria Ocasio-Cortez, Representative Laurel Lee, Senator Richard J. Durbin, and Senator Lindsey Graham reintroduced the DEFIANCE Act in May of 2025.48Press Release, Rep. Alexandria Ocasio-Cortez, Ocasio-Cortez, Lee, Durbin, Graham Introduce Bipartisan, Bicameral Legislation to Combat Non-Consensual, Sexually Explicit Deepfake Imagery (May 21, 2025), https://ocasio-cortez.house.gov/media/press-releases/ocasio-cortez-lee-durbin-graham-introduce-bipartisan-bicameral-legislation [https://perma.cc/73MY-N9ML]. The Senate unanimously passed the Act in January of 2026.49Id. Supporters of the Act and survivors of deepfake explicit images have advocated strongly for the House to pass it.50Press Release, Rep. Alexandria Ocasio-Cortez, Ocasio-Cortez, Lee Join House Members and Advocates in Calling to Pass DEFIANCE Act (Jan. 22, 2026), https://ocasio-cortez.house.gov/media/press-releases/ocasio-cortez-lee-join-house-members-and-advocates-calling-pass-defiance-act [https://perma.cc/T97C-2TPE]. Representative Ocasio-Cortez has stressed that the TAKE IT DOWN Act provides for the removal of the images, but the DEFIANCE Act will provide survivors recourse and restitution.51Id. The House has yet to schedule a vote on the Act; a similar failure to hold a vote is the reason the bill died in 2024.52Grace Panetta, ‘They Sold My Pain for Clicks’: Paris Hilton Urges Lawmakers to Act on Nonconsensual Deepfakes, 19th News (Jan. 22, 2026), https://19thnews.org/2026/01/paris-hilton-aoc-deepfakes/ [https://perma.cc/66GT-RDS3]. However, House Speaker Mike Johnson has spoken favorably of the bill, giving supporters hope that it will pass in the House.53Id.

D. Challenges in Identifying Anonymous Deepfake Perpetrators

One of the major logistical issues with the DEFIANCE Act is that AI deepfake creators often operate anonymously online.54Id. The Act would give survivors subpoena power to compel social media platforms and internet service providers to turn over information about the anonymous accounts posting deepfake explicit images.55Id. However, even with subpoenas, there is no guarantee that a user’s identity can be uncovered.56Id. If the IP address traces overseas, or if the creator used public Wi-Fi, it may be impossible to determine who created the images.57Id. Often, unmasking anonymous internet users comes down to technical luck.58Id. These technological setbacks may have a chilling effect on survivors considering litigation, as the process can become very costly without any guarantee of justice.59Id.

III. Discussion

Moving forward, Congress should pass the DEFIANCE Act and continue to seek ways to help victims identify the perpetrators of deepfake images. Part A of this section will advocate for the passing of the DEFIANCE Act. Part B of this section will offer possible solutions to the issue of survivors being unable to identify the creators of the deepfake images and videos.

A. Why the House Should Pass the DEFIANCE Act

The House of Representatives should pass the DEFIANCE Act to ensure that survivors of deepfake pornography are not left without meaningful recourse. While recent legislation like the TAKE IT DOWN Act provides critical mechanisms for removing nonconsensual intimate imagery from online platforms, removal alone does not remedy the personal, reputational, and psychological harms inflicted on victims. The DEFIANCE Act fills this gap by empowering survivors to seek civil damages, protecting their identities through pseudonymous litigation, and holding perpetrators accountable in a rapidly evolving technological landscape. As generative AI tools continue to become more accessible and misuse of these tools becomes increasingly common, delaying passage of the Act risks allowing legal protections to lag behind technological reality. The Act has gained bipartisan support, a rarity in today’s political climate, reflecting that nonconsensual deepfake abuse is not a partisan issue but a fundamental threat to personal autonomy and dignity.

B. Addressing Identification and Enforcement Gaps

Even with the passage of the DEFIANCE Act, which provides a civil cause of action, survivors must still identify a defendant. That task is increasingly difficult when much of the internet is anonymous and individuals can hide behind screen names. Without mechanisms to trace deepfake imagery back to its source, the Act risks promising recourse for survivors without practical enforcement.

One potential improvement would be to require generative AI platforms to embed provenance metadata in AI-generated images and videos.60Tim Mucci, What Is Data Provenance?, IBM Think, https://www.ibm.com/think/topics/data-provenance [https://perma.cc/WS66-CAHF] (Mar. 7, 2026). Provenance metadata is information regarding the origin or history of data.61Id. It can provide details such as who created the data, when it was created, any modifications, and who made the changes.62Id. Requiring AI platforms to embed this metadata could enable identification of the generating platform, the model, and a unique generation identifier tied to a user account. That information would not be publicly visible, but it could be disclosed pursuant to a subpoena in civil litigation, helping survivors identify the creators of the explicit images.
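To make the proposal concrete, the following Python sketch illustrates what such a provenance record might contain and how a platform could sign it so that tampering is detectable. All names here are hypothetical illustrations, not an existing standard or any platform’s actual implementation; a real system would likely follow an industry specification such as C2PA, and the signing key would be held privately by the platform.

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

# Hypothetical signing key. In practice the platform would keep this
# secret; it is hard-coded here only to make the sketch self-contained.
PLATFORM_KEY = b"example-platform-secret"


def make_provenance_record(platform: str, model: str,
                           generation_id: str, account_id: str) -> dict:
    """Build a provenance record of the kind the proposal envisions:
    the generating platform, the model, a unique generation identifier,
    and a hashed (not publicly readable) link to the user account."""
    record = {
        "platform": platform,
        "model": model,
        "generation_id": generation_id,
        # Hashing keeps the account ID out of public view; the platform
        # could still resolve it internally in response to a subpoena.
        "account_hash": hashlib.sha256(account_id.encode()).hexdigest(),
        "created_utc": datetime.now(timezone.utc).isoformat(),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(
        PLATFORM_KEY, payload, hashlib.sha256).hexdigest()
    return record


def verify_provenance_record(record: dict) -> bool:
    """Recompute the signature over the record's fields and compare;
    any alteration of the metadata makes verification fail."""
    unsigned = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(PLATFORM_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record.get("signature", ""))


if __name__ == "__main__":
    rec = make_provenance_record("ExampleGen", "image-model-v2",
                                 "gen-0001", "user-12345")
    print(verify_provenance_record(rec))  # True: record is intact
    rec["account_hash"] = "tampered"
    print(verify_provenance_record(rec))  # False: alteration detected
```

The design choice the sketch highlights is that the record identifies the platform and generation publicly while keeping the user-linking field hashed, so anonymity is preserved against casual viewers but not against a court-ordered disclosure.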

Another approach Congress could consider is identity verification requirements for high-risk generative AI tools. Rather than mandating identity verification across all online platforms, which would raise serious privacy and First Amendment concerns, such requirements could be narrowly tailored to apply only to tools capable of producing highly realistic images or videos, particularly sexual content. Platforms could verify users through government identification or payment credentials and securely retain that information without making it public. Disclosure would occur only pursuant to a court order or subpoena in a civil action under the DEFIANCE Act. By limiting anonymous access to the most powerful generative tools, this model would balance the protection of free expression with the prevention of abuse.

These options would allow for easier identification of the perpetrators of AI deepfake pornography, without eliminating anonymity online. Anonymity remains a vital component of lawful speech and political expression. However, as generative AI increasingly enables image-based sexual abuse, modest constraints on anonymity may be necessary to ensure accountability. Without such measures, the civil remedies promised by the DEFIANCE Act may remain largely symbolic, offering recognition of harm without a realistic path to justice.

IV. Conclusion

The rise of deepfake pornography has exposed significant gaps in existing legal protections for survivors of nonconsensual synthetic intimate imagery. As AI advances and becomes more accessible, creating explicit deepfake images has become easier, more widespread, and increasingly difficult to trace. While recent legislation, such as the TAKE IT DOWN Act, addresses the removal of nonconsensual intimate imagery, removal alone does not provide survivors with meaningful redress or accountability.

The DEFIANCE Act seeks to establish a federal civil cause of action for survivors of explicit deepfake imagery, offering uniform protections across states and allowing victims to pursue damages while protecting their identities through pseudonymous litigation. Its bipartisan support in the Senate reflects broad recognition of the serious harms caused by deepfake abuse and the need for a consistent federal response. However, as discussed, the effectiveness of the Act depends in large part on survivors’ ability to identify perpetrators who often operate anonymously online. Without additional measures to assist in identifying defendants, such as requiring platforms to use metadata and track identification information to trace AI-generated content, the civil remedies provided by the DEFIANCE Act risk remaining difficult to enforce. For the Act to fulfill its intended purpose, Congress must not only ensure its passage in the House but also remain attentive to the practical barriers survivors face in pursuing justice. As deepfake technology continues to evolve, meaningful accountability will require legal solutions that evolve alongside it.


Cover Photo by Zulfugar Karimov on Unsplash
