In the wake of the spread of explicit AI-generated photos of Taylor Swift, US lawmakers have proposed letting people sue over fake pornographic images of themselves. The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would add a civil right of action for intimate “digital forgeries” depicting an identifiable person without that person’s consent, letting victims recover financial damages from anyone who “knowingly produced or possessed” the image with the intent to spread it.
The bill was introduced by Senate Majority Whip Dick Durbin (D-IL) and co-sponsored by Sens. Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO). It builds on provisions in the Violence Against Women Act Reauthorization Act of 2022, which added a similar right of action for non-faked explicit images. In a summary, the sponsors described the bill as a response to an “exponentially” growing volume of digitally manipulated explicit AI images, saying such fakes “exploit women, particularly public figures, politicians, and celebrities,” and cited Swift’s case as an example of how they can be used to harass people.
AI-manipulated pornographic images, often referred to as deepfakes, have grown in popularity and sophistication since the term was coined in 2017. Off-the-shelf generative AI tools have made them far easier to create, even when the systems include guardrails against explicit imagery, and they have been used to impersonate, harass, and intimidate people. But so far, clear legal remedies are lacking in much of the United States. Nearly every state has passed laws banning nonconsensual pornography, though the process has been slow, and far fewer states have enacted laws addressing simulated imagery. (No federal criminal law directly bans either type.) Regulating AI is part of President Joe Biden’s agenda, however, and White House press secretary Karine Jean-Pierre said last week that the administration was calling on Congress to pass new legislation in response.
The DEFIANCE Act was introduced in response to AI-generated images, but it is not limited to them. It treats as a forgery any “intimate” sexual image (as the term is defined in the underlying statute) created by “software, machine learning, artificial intelligence, or any other computer-generated or technological means … to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual.” That includes real photos that have been altered to appear sexually explicit. The language would seem to apply to older tools like Photoshop, as long as the results are sufficiently realistic, and adding a label indicating that an image is inauthentic does not remove liability.
Lawmakers have introduced a number of bills addressing AI and nonconsensual pornography, but most have yet to pass. Earlier this month, lawmakers introduced the No AI FRAUD Act, an extremely broad ban on using technology to imitate someone without permission. Blanket impersonation rules, however, raise big questions about artistic expression: they could let powerful figures sue over political parody, reenactment, or creative fiction. The DEFIANCE Act may raise some of the same questions, but its scope is significantly narrower, though it still faces an uphill battle to passage.