This week, a sexually explicit fake image of global superstar and singer Taylor Swift went viral across the internet, sparking outrage and highlighting how quickly explicit images manipulated by artificial intelligence can spread.
The developing controversy shows how little clear legal protection exists for victims in a world where AI has proliferated rapidly in just a few years and can generate images of almost anything without the consent of the people depicted.
USA TODAY was able to identify only 10 states that have passed laws banning exploitative deepfake pornography or AI-generated images, audio files, or videos containing sexual content. There is no federal law regulating it.
That means the question of whether the depiction actually violates the law is thorny, and victims like Swift are left with a number of confusing options.
Although faked images can result in criminal charges, victims are more likely to get justice by suing the companies involved in creating and disseminating the images, said victims’ rights attorney Carrie Goldberg, who works with tech companies and represents clients who have been victims of nonconsensual pornography, stalking, harassment, and now deepfake pornography.
Goldberg also pointed out that litigation is a far more viable option for wealthy celebrities than it is for the less powerful people who are more often the victims of deepfake porn.
Because the technology to create deepfakes only became widely available in 2017, legal remedies are still being developed, and little has been settled about what exactly is illegal.

In which states is deepfake porn illegal?
On Friday, USA TODAY could find only 10 states that appeared to have enacted laws specifically addressing pornographic deepfakes. The oldest, Virginia’s, went into effect in 2019.
A small number of states have existing laws regarding the nonconsensual distribution of pornography, or “revenge porn,” which could also cover AI-generated pornography, Goldberg said.
However, in many states those laws are written in a way that implies the images must show the victim’s own private body parts, not body parts generated by an AI.
That means the following states currently provide specific legal remedies for deepfake victims:
- California: In 2020, California passed a law allowing victims of deepfake pornography to sue people who create and distribute sexually explicit deepfake material without their consent. If the deepfake was made “with malicious intent,” victims can sue for up to $150,000.
- Florida: In 2022, Florida passed a law banning the distribution of sexually explicit deepfake images without the victim’s consent. This is a third-degree felony, punishable by up to five years in prison, a $5,000 fine, and five years’ probation.
- Georgia: A 2020 Georgia law prohibited the online distribution of falsely created pornographic images and videos.
- Hawaii: In 2021, the state of Hawaii prohibited the intentional creation, publication, or threat of publication of non-consensual sexually explicit deepfake images or videos. This is a Class C felony, punishable by up to five years in prison and a fine of up to $10,000.
- Illinois: On January 1, 2024, Illinois added new protections for victims of deepfake porn. The law allows people to sue whoever created sexually explicit images or videos that falsely depict them. It amends existing protections for revenge porn victims passed in 2015. Victims can sue for damages and can use pseudonyms in court to protect their privacy.
- Minnesota: A 2023 Minnesota law makes it illegal to create sexually explicit deepfakes or to use deepfakes to influence elections. Distributing such images or videos can carry a penalty of up to five years in prison and a $10,000 fine.
- New York: In 2023, the state banned the distribution of pornographic images created using artificial intelligence without the consent of the people depicted. Violators could face up to one year in prison and a $1,000 fine. Victims also have the right to sue.
- Texas: In 2023, Texas made the creation of sexually explicit, nonconsensual deepfake videos a Class A misdemeanor, punishable by up to one year in prison, a fine of up to $4,000, or both.
- South Dakota: A 2022 law makes it a first-degree misdemeanor to create deepfake pornography of an unwilling victim. If the victim is under the age of 17 and the perpetrator is over the age of 21, it is a Class 6 felony, punishable by two years in prison, a fine of up to $4,000, or both.
- Virginia: A state law passed in 2019 expanded existing laws on revenge porn to cover “falsely created” videos and still images. The penalty is up to one year in prison, a $2,500 fine, or both.
Was Taylor Swift’s fake image a crime? Can she sue?
Even though 10 states have enacted laws targeting deepfake porn, criminal law may not be the most practical solution for victims, Goldberg said.
First, law enforcement must prioritize investigating the incidents, and second, tracing content from its creator to everyone who shared it can involve a vast web of people.
Her focus as a lawyer would be to pursue the AI products (the companies or platforms used to create deepfake porn) and the technology platforms that make their use possible, such as the app stores where the products can be downloaded and the social media companies where the images are shared.
Goldberg said Taylor Swift could sue such companies and platforms.
The state of Tennessee, where Swift lives, has no law explicitly banning deepfake porn. Tennessee Governor Bill Lee proposed one earlier this month: the Ensuring Likeness Voice and Image Security (ELVIS) Act, which would update the state’s privacy laws to protect the voices of songwriters, performers, and music industry professionals from the misuse of artificial intelligence, including pornographic deepfakes.
In Swift’s case, the star spends much of her time in New York, which offers victims both criminal and civil options. Even without a criminal case, she could file a civil lawsuit focused on the unauthorized use of her likeness.
Even if it turns out the culprit is not in the United States, Swift’s enormous power and influence could help her there as well. Goldberg said that several years ago she represented a celebrity whose image was superimposed onto a pornographic scene and fought to have it removed from overseas sites.
“When you have Taylor Swift, you always have something to fall back on,” Goldberg said. “For people like her who have the resources, there are many options to have law enforcement in other countries take care of them. Most people don’t have that available.”
As AI image generation becomes mainstream, what happens next?
More laws regulating deepfake pornography are expected to be enacted at the state level in the coming years.
A federal bill targeting AI fraud was drafted in 2023, but many in the technology industry believe it is too broad and not specific enough to be enforceable.
Goldberg said future legislation at the federal and state levels will require consensus among lawmakers and more prominent voices drawing attention to the issue.
“As with nonconsensual pornography, we as a society need to turn the tables on what we tolerate and deter those who might share, like, relink and post this kind of content,” Goldberg said.
Contact Kayla Jimenez at kjimenez@usatoday.com. Follow her on X, formerly Twitter, at @kaylajjimenez. Contributing: Elizabeth Weise, eweise@usatoday.com.