Millions of people on social media this week saw AI-generated, sexually explicit fake images of Taylor Swift, and for many the incident underscored the need to regulate the potential misuse of AI technology.
A White House spokesperson told ABC News on Friday that the administration was “alarmed” by what happened to Swift online and said Congress “should take legislative action.”
“We are alarmed by the reports of the circulation of those images,” White House Press Secretary Karine Jean-Pierre told ABC News White House Correspondent Karen Travers. “It is concerning.”
“While social media companies make their own independent decisions about content moderation, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and non-consensual intimate images of real people,” she added.
Jean-Pierre highlighted some of the recent actions the administration has taken on these issues, including the creation of a task force to combat online harassment and abuse, and the Department of Justice’s launch of the nation’s first 24/7 helpline for victims of image-based sexual abuse.
It wasn’t just the White House: outraged fans were surprised to learn that there is no federal law in the United States that prohibits or deters someone from creating and sharing non-consensual deepfake images.
But just last week, Representative Joe Morelle renewed his push to pass a bill that would make sharing explicit, digitally altered images without consent a federal crime, punishable by prison time and fines.
“We hope the news about Taylor Swift will galvanize momentum and support for our bill, which would address both criminal and civil penalties in these situations,” a spokesperson for Morelle told ABC News.
The New York Democrat authored the bipartisan Preventing Deepfakes of Intimate Images Act, which has been referred to the House Judiciary Committee.
Deepfake pornography is often described as image-based sexual abuse, a term that also covers the non-consensual creation and sharing of real, non-fabricated intimate images.
AI technology has advanced rapidly: until a few years ago, creating AI-generated content required a certain level of technical skill, but now users need only download an app or click a few buttons.
Experts say there is now an entire commercial industry that thrives on creating and sharing digitally fabricated sexually explicit content. Some of the websites distributing these fakes have thousands of paying members.
Last year, a town in Spain made international headlines after a number of young schoolgirls said they had received fabricated nude images of themselves, created using an easily accessible “undressing app” powered by artificial intelligence, sparking a wide debate about the harm such apps cause.
The sexually explicit images of Swift were likely fabricated using an AI text-to-image tool. Some of the images were shared on the social media platform X, formerly known as Twitter.
One post sharing a screenshot of the fabricated images was reportedly viewed more than 45 million times before the account was suspended on Thursday.
Early Friday morning, X’s safety team said it was “actively removing all identified images” and “taking appropriate action against the accounts that posted them.”
“Posting non-consensual nude (NCN) images is strictly prohibited on X and we have a zero-tolerance policy toward such content,” the team said in a statement. “We are closely monitoring the situation to ensure that any further violations are immediately addressed and the content is removed. We are committed to maintaining a safe and respectful environment for all of our users.”
“More than 100,000 images and videos like this are spread across the web every day, and they are viral in and of themselves,” said Stefan Turkheimer, vice president of public policy at RAINN, a nonprofit anti-sexual-assault organization. “This is an outrage on behalf of Taylor Swift, but it’s even more outrageous for the millions of people who don’t have the resources to regain autonomy over their own image.”