Federal prosecutors announced a criminal case this week alleging that a U.S. Army soldier stationed in Alaska used artificial intelligence to create child sexual abuse material, highlighting the lengths to which online predators will go to exploit children.
The Justice Department accused Seth Herrera, 34, of using an AI chatbot to create pornography of minors he knew, and of viewing tens of thousands of images depicting violent sexual abuse of children, including babies, according to court records.
“Criminals considering using AI to further their crimes should think twice, because the Department of Justice will prosecute AI-enabled criminal activity to the fullest extent of the law and will seek increased sentences when warranted,” said Deputy Attorney General Lisa Monaco.
Earlier this year, the FBI released a public service announcement on child sexual abuse, noting that all such images and videos, including those created by AI, are illegal.
The arrest comes as federal officials warn of a rise in AI-generated child sexual abuse material, which the Department of Homeland Security says has allowed criminals to produce images and videos on an exponentially larger scale. The technology poses new challenges for law enforcement agencies targeting the content, but it could also serve as a tool to quickly and accurately identify offenders and victims, the department said.
Court documents detail child pornography chat group
According to pretrial detention documents filed in the U.S. District Court for the District of Alaska, Herrera participated in online messaging groups aimed at buying and selling abusive content. Federal court documents allege that the soldier, who was stationed at Joint Base Elmendorf-Richardson in Anchorage, Alaska, would “surreptitiously record” minors undressing at his home and use an AI chatbot to generate the exploitative content.
He also used images and videos of children posted on social media to create sexual abuse material, according to the detention memo.
According to court records, agents with the Department of Homeland Security Investigations executed a search warrant at the home Herrera shares with his wife and daughter. According to the memo, three Samsung Galaxy smartphones contained tens of thousands of videos and images depicting the rape and other sexual abuse of young children, including infants, dating back to at least March 2021. Prosecutors said Herrera stored the material in a password-protected app disguised as a calculator on his phone.
The memo also says Herrera sought out sexual abuse material depicting children roughly the same age as his daughter, and that six children lived in the same four-story home as him on the military base.
During an interview, he admitted to viewing child sexual abuse material online for the past year and a half, according to court records.
“No child should suffer such a tragedy, and no one should escape the detection and prosecution of these crimes by HSI and our law enforcement partners,” said Katrina W. Berger, Deputy Director of Homeland Security Investigations.
Herrera was arrested Friday and charged with transportation, receipt and possession of child pornography, which carries a maximum sentence of 20 years in prison. His initial court appearance is scheduled for Tuesday.
A public defender listed in Herrera’s court records did not immediately respond to USA Today’s request for comment Monday.
Combating sexual predators in the age of AI
The case, announced Monday, is the latest across the country as federal law enforcement agencies grapple with sex offenders' use of new technology.
According to an FBI public service announcement, “Federal law prohibits the production, promotion, transportation, distribution, receipt, sale, access to, or possession of any CSAM, including realistic computer-generated imagery.”
Officials say they’ve also been able to use the new technology to catch criminals. In 2023, Homeland Security Investigations used machine learning models to identify 311 cases of online sexual exploitation. The three-week operation, dubbed “Operation New Hope,” led to the identification or rescue of more than 100 victims and the arrest of several suspects, according to Homeland Security Investigations.
If you suspect the production of child sexual abuse material, including AI-generated content, you can report it to the National Center for Missing and Exploited Children by calling 1-800-THE-LOST or online at www.cybertipline.org. You can also report it to the FBI Internet Crime Complaint Center at www.ic3.gov.