If a California eighth-grader shares a nude photo of a classmate with a friend without their consent, the student could be charged under state laws dealing with child pornography and disorderly conduct.
But if the photo is an AI-generated deepfake, it is not clear that any state law applies.
That’s the dilemma facing the Beverly Hills Police Department as it investigates a group of Beverly Vista Middle School students who allegedly shared photos of classmates that had been altered with an artificial-intelligence-powered app. According to the school district, the images superimposed students’ real faces onto AI-generated nude bodies.
Lt. Andrew Myers, a spokesman for the Beverly Hills Police Department, said no arrests have been made and the investigation is ongoing.
A security guard stands outside Beverly Vista Middle School in Beverly Hills on February 26.
(Jason Almond/Los Angeles Times)
Beverly Hills Unified School District Assistant Superintendent Michael Bregy said the district’s investigation into the episode is in its final stages.
“I am pleased that disciplinary action was taken immediately and that this was a contained and isolated incident,” Bregy said in a statement, though he did not disclose the nature of the action, the number of students involved or their grade levels.
He called on Congress to prioritize the safety of America’s children, saying: “Technology, including AI and social media, can be used in incredibly positive ways, but, just like cars and cigarettes, left unregulated it can be terribly destructive.”
But whether the faked nudes amount to a criminal offense is complicated by the technology involved.
Under federal law, the prohibition on child pornography extends to computer-generated images of identifiable people, but legal experts caution that the provision has yet to be tested in court.
California’s child pornography law does not address artificially generated images. Instead, it applies to any image that “depicts a person under 18 years of age personally engaging in or simulating sexual conduct.”
Santa Ana criminal defense attorney Joseph Abrams said AI-generated nudes “don’t depict real people.” They might be defined as child erotica, he said, but they are not child pornography. Speaking as a defense attorney, he said, “I don’t think it crosses a line with this particular statute or any other statute.”
“As we move into this age of AI, these kinds of issues are going to be litigated,” Abrams said.
Kate Ruane, director of the Free Expression Project at the Center for Democracy & Technology, said early versions of digitally altered child sexual abuse material superimposed children’s faces onto pornographic images of other people’s bodies. But now, freely available “undressing” apps and other programs generate fake bodies to match real faces, raising legal questions that have yet to be squarely addressed, she said.
Still, she said, it is hard to see why the law would not cover sexual images simply because they are artificially generated. “The evil we were trying to address [with the prohibition] is the harm to children that’s associated with the existence of the image. That’s exactly the same here,” Ruane said.
But there are other obstacles to a criminal prosecution. In both the state and federal cases, the prohibition applies only to “explicit sexual conduct,” which boils down to sexual intercourse, other sex acts and “lascivious” exhibition of a child’s private parts.
Courts use a six-factor test to decide whether something is a lascivious exhibition, considering such things as what the image focuses on, whether the pose is natural and whether the image is intended to arouse the viewer. Courts would have to weigh those factors in evaluating images that were not sexual in nature before being “undressed” by AI.
“It’s really going to depend on what the final picture looks like,” said Sandy Johnson, senior legislative policy adviser for the Rape, Abuse & Incest National Network, the nation’s largest anti-sexual-violence organization. “It’s not just a nude photo.”
The ages of the children involved would be no defense against a conviction, Abrams said, because minors have no more right to possess child pornography than adults do. But, like Johnson, he noted that “nude photographs of children are not necessarily child pornography.”
Neither the Los Angeles County District Attorney’s Office nor the state Department of Justice immediately responded to requests for comment.
State lawmakers have proposed several bills to fill the gaps in the law around generative AI. They include proposals to extend the criminal prohibitions on possessing child pornography and on the nonconsensual distribution of intimate images (also known as “revenge porn”) to computer-generated images, as well as a proposal to convene a working group of academics to advise lawmakers on “relevant issues and the impacts of artificial intelligence and deepfakes.”
In Congress, lawmakers have introduced competing proposals that would expand federal criminal and civil penalties for the nonconsensual distribution of intimate AI-generated images.
At Tuesday’s district school board meeting, Dr. Jane Tabiev Asher, chair of child neurology at Cedars-Sinai, urged the board to consider the consequences of giving children access to so much technology in and out of the classroom.
Feb. 26 at Beverly Vista Middle School in Beverly Hills.
(Jason Almond/Los Angeles Times)
Instead of interacting and socializing with other students, Asher said, students can spend their free time at school on their own devices. “If they’re on a screen all day, what do you think they want to do at night?”
Research shows that children under 16 should not be on social media, she said. Pointing to how the district had been blindsided by the AI-generated nudes, she warned that “we have to protect our children from it.”
Board members and Bregy all expressed outrage over the images at the meeting. “This challenges the foundation of trust and safety that we strive for every day for all of our students,” Bregy said.
He urged parents to keep checking their [children’s] phones: what apps are on them, what they send and what social media sites they use. These devices, he said, “open the door to a lot of new technologies that are coming out completely unregulated.”
Board member Rachel Marcus noted that the district already prohibits students from using cellphones at school. “I think we as parents need to have more control over what our students are doing on their phones, and we are completely failing in that regard,” she said.
“From my perspective, what’s missing right now is the partnership with parents and families,” board member Judy Manucelli said. “There are a lot of programs to get kids off their phones in the afternoons.”


