A suburban Chicago school district is working with police and prosecutors to investigate dozens of artificial intelligence-generated nude images of students, officials told parents on Monday.

An estimated 30 “sensitive images that appear to have been created and circulated on a computer” were discovered by officials at Richmond-Burton High School on Monday, Principal Mike Baird wrote in an email to parents cited by Shaw Local.


“He just took random photos from my daughter’s prom and turned them into nude images and started distributing them among the student body,” Stephanie Essex, mother of sophomore Stevie Hyder, told WGN.

Hyder is among dozens who were featured in the images, which were allegedly created by a student.

“I know there are at least 22,” Hyder said. “They said I was the 22nd person whose mom they called, but we think there’s around 30 girls that are surfaced now.”

The incident is part of what has become a near-daily occurrence in districts across the country: students creating nude images of their classmates using artificial intelligence. In many states, including Illinois, the incidents fall into a legal gray area, as a fake body morphed onto a real image of a minor doesn’t fit neatly into existing laws.


“These technologies allow for bad actors to try to use them in ways that create misinformation, deepfakes,” DePaul University computer science professor Bamshad Mobasher told WGN. “In this context there is definitely a need for quick action. The potential for harm is great.”

“I don’t know where it falls in the law. In my discussion with Richmond police last night, they were still working with the McHenry County State’s Attorney’s Office to figure out exactly how to lay out the charges for this,” Essex said. “In my opinion, this basically should be considered child pornography; it’s unacceptable.”

Officials with Richmond Police and Richmond-Burton High School District 157 declined to discuss the case with the media, though they are soliciting information from parents of students involved.

“I’m reaching out to all of the parents who are involved with these victims so that they are aware,” Richmond Police Sgt. Jennifer Fillicaro told Shaw Local. “I am calling each and every parent and discussing with them directly. We are taking it very seriously.”

Students and parents are, as well.

“Obviously we know it’s fake, but a lot of people from towns over, they won’t know it’s fake,” Hyder told WGN. “We know he was sending them around; we don’t know who all has it.”

“She feels so violated, it is so gross,” Essex told Shaw Local. “I am furious that I am finding out because my daughter’s images are circulating around the school.”

The incident in Richmond comes about six weeks after U.S. Senate Majority Whip Dick Durbin, D-Ill., chair of the Senate Judiciary Committee, partnered with Democratic and Republican cosponsors to introduce the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024. The bill is described as “legislation that would hold accountable those responsible for the proliferation of nonconsensual, sexually explicit ‘deepfake’ images and videos,” according to a news release.

The legislation would create a federal civil remedy for victims who are identifiable in a “digital forgery,” which is defined as a visual depiction created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means to falsely appear to be authentic.

“Although the imagery may be fake, the harm to the victims from the distribution of sexually explicit ‘deepfakes’ is very real,” Durbin said. “Victims have lost their jobs, and they may suffer ongoing depression or anxiety. By introducing this legislation, we’re giving power back to the victims, cracking down on the distribution of ‘deepfake’ images, and holding those responsible for the images accountable.”

The DEFIANCE Act, S. 3696, remains pending in the Senate Judiciary Committee. Similar bills have been introduced in the U.S. House.

Last week, five eighth-grade students at California’s Beverly Vista Middle School were expelled over more than a dozen AI-generated nude images of classmates uncovered by school officials last month, KTLA reports.

“Sixteen eighth-grade students were identified as being victimized, as well as five egregiously involved eighth-grade students,” the superintendent wrote in a letter to parents.

The Beverly Hills Police Department continues to investigate that case for potential criminal charges, according to the news site.

It was a slightly different outcome at Florida’s Pinecrest Cove Preparatory Academy, where the two boys involved in distributing dozens of AI-generated nude images of students around Thanksgiving were suspended for 10 days in December, the New York Post reports.

Other recent high-profile examples come from Issaquah, Washington, where police investigated a 14-year-old student for sharing AI-generated nude images of several female classmates on Snapchat, as well as a similar case at Westfield High School in New Jersey last year.

The latter prompted student victim Francesca Mani and her mother, Dorota, to campaign for change, spurring legislation in New Jersey and Washington, D.C.

“Many people don’t feel comfortable going public with what happened,” Mani told The Guardian. “Because, just like in my school, they are constantly hearing that nothing can be done.”

Independent researcher Genevieve Oh told The Associated Press in December more than 143,000 new deepfake videos were posted online in 2023, more than all previous years combined. Ten states including Texas, Minnesota, New York, Virginia, Georgia and Hawaii have laws that make creating nonconsensual deepfake pornography a crime, while California and Illinois allow victims to sue perpetrators for damages in civil court. Legislation to address “deepfakes” is also currently pending in numerous states including Michigan, Missouri, New Jersey, Indiana, South Dakota, and others, the AP reports.