Nearly a year after AI-generated nude images of high school girls roiled a community in southern Spain, a juvenile court this summer sentenced 15 of their classmates to a year of probation.
But the artificial intelligence tools used to create the harmful deepfakes remain easily accessible on the internet, with one site promising to “undress” any photo uploaded to it within seconds.
Now a new effort to shut down that site and others like it is underway in California, where the city of San Francisco this week filed a first-of-its-kind lawsuit that experts say could set a precedent but will also face many hurdles.
“The prevalence of these images has led to the exploitation of an astounding number of women and girls around the world,” said David Chiu, San Francisco’s city attorney, who filed the lawsuit against a group of widely visited websites based in Estonia, Serbia, Britain and elsewhere.
“These images are used to bully, humiliate and intimidate women and girls,” he told The Associated Press in an interview. “The impact on victims is devastating in terms of their reputations, their mental health and their loss of autonomy, in some cases driving them to suicide.”
The lawsuit, brought on behalf of the people of California, alleges that the services violate numerous state laws against deceptive business practices, nonconsensual pornography and child sexual abuse. But a central challenge will be determining who runs the apps, which are not available in mobile phone app stores yet are easily found on the internet.
One service contacted by The Associated Press late last year claimed in an email that its “CEO is based in the United States and travels throughout the country” but declined to provide evidence or answer other questions. The AP is not naming the specific apps so as not to promote them.
“At this point, there are many sites where we don’t know exactly who is operating them or where they are operating from, but we have the investigative tools and subpoena power to look into that,” Chiu said. “And we will certainly use those powers during the course of this litigation.”
Many of these tools are used to create realistic fakes that “nudify” photos of clothed adult women, including celebrities, without their consent. But such images have also appeared in schools around the world, from Australia to Beverly Hills, California, where typically male students create images of female classmates that then circulate widely on social media.
A physician whose daughter was among the girls victimized last September in Almendralejo, the Spanish city at the center of one of the first widely publicized cases, and who helped bring the abuse to public attention, said she was satisfied with the severity of the sentences their classmates received after a court ruling earlier this summer.
But this is “not only the responsibility of society, of education, of parents and schools, but also the responsibility of the digital giants that profit from all this garbage,” Dr. Miriam Al Adib Mendiri said in an interview on Friday.
She praised San Francisco’s action but said more effort is needed from big companies like California-based Meta Platforms and its subsidiary WhatsApp, which was used to circulate the images in Spain.
While schools and law enforcement agencies are trying to punish those who create and share deepfakes, authorities are struggling with what to do about the tools themselves.
In a letter to Spanish MEPs in January, the European Union’s executive body said the app used in Almendralejo “does not appear to fall under” the bloc’s sweeping new rules aimed at strengthening online safety because it is not a large enough platform.
Organizations that have been tracking the rise of AI-generated child sexual abuse material will be following the San Francisco case closely.
Emily Slifer, policy director at Thorn, an organization that fights child sexual exploitation, said the case “has the potential to set legal precedent in this area.”
Stanford University researchers said it would be difficult to bring the defendants to justice because many of them are based outside the United States.
“Chiu will face an uphill battle with this lawsuit, but he may be able to get some of the sites taken offline if the defendants who run them ignore it,” said Riana Pfefferkorn of Stanford University.
She said that could happen if the city wins by default in the defendants’ absence and obtains orders affecting domain name registrars, web hosts and payment processors: “That would effectively shut down those sites even if their owners never appear in court.”