Nude images of middle school students created by classmates using artificial intelligence are roiling the Beverly Hills school district, the latest in an emerging trend that’s prompting calls for legislation.
Officials in California’s Beverly Hills Unified School District are working with the Beverly Hills Police Department to identify the victims and perpetrators connected to an undisclosed number of nude images reported by students at Beverly Vista Middle School last week.
“On Wednesday, the BVMS Administration received reports from students about the creation and dissemination by other students of Artificial Intelligence generated images that superimposed the faces of our students onto AI-generated nude bodies,” district officials wrote to parents in an email cited by Fox 11. “As the investigation is progressing … more victims are being identified. We are taking every measure to support those affected and to prevent any further incidents.”
Jackie Kruger, mother of an eighth grader at the middle school, described the situation to KCAL as “frightening and terrible.”
“I can imagine how devastating that would be,” Kruger said.
While her daughter wasn’t a victim, the pictures have had an impact, she said.
“I think it made everybody alarmed and alerted to the fact this could happen to them,” Kruger said. “This is a form of bullying, and bullying should be reprimanded.”
Kruger’s daughter Evelyn told Fox 11 that administrators interviewed the targeted girls last week.
“Girls [were] being called out one by one,” she said. “There’s always that fear, am I going to be next? Am I going to be called in? Sure, it might be animated, it might look unrealistic, but a couple of them look real.”
School officials have vowed to ensure the students responsible face criminal charges, if possible.
“While the law is still catching up with the rapid advancement of technology and such acts may not yet be classified as a crime, we are working closely with the Beverly Hills Police Department throughout this investigation,” the statement to parents read. “We assure you that if any criminal offenses are discovered, they will be addressed to the fullest extent possible.”
“Any student found to be creating, disseminating, or in possession of AI-generated images of this nature will face disciplinary actions, including, but not limited to, a recommendation for expulsion,” according to the district.
Kruger told KCAL that AI companies should be held accountable for the products they create and that lawmakers should step in to ensure those products aren’t accessible to children.
“If the onus is on them, and Congress legislates in a way to protect our children, then we will all be safer,” Kruger said. “Ultimately, it’s like putting a weapon in a child’s hands, right? The child should not use a weapon whether it’s in their hands or not, but don’t put the weapons in their hands.”
Pete Nicoletti, a cybersecurity expert with Check Point, told Fox 11 that the problem of AI-generated nude images is growing quickly across the country, and he stressed the importance of parents setting time limits, download controls and application approvals on their kids’ phones.
“On the kids’ side of things, it’s happening every single day, dozens of times a day,” he said. “We’re seeing news reports of it everywhere.”
The incident in Beverly Hills follows a similar case at Pinecrest Cove Preparatory Academy in Florida, where two dozen students of both sexes were allegedly targeted over the Thanksgiving break by two classmates who used AI to transpose their faces onto naked bodies, WTVJ reports.
The Pinecrest students used photos from the school’s social media accounts to create the images and were ultimately suspended for 10 days.
“This is something that I don’t think is going to end anytime soon,” Gary Gilmore, father of one of the girls targeted, told CBS Miami.
Other incidents at Westfield High School in New Jersey and Issaquah High School in suburban Seattle also involved AI-generated nude images of teen girls that were shared with classmates. In Washington state and elsewhere, there are no laws against the practice, and local prosecutors are calling for action.
“At this moment no one has yet been prosecuted for creating a deep fake with the intention of harming … an adult, a teenager, or a child and the law needs to catch up to that,” Washington attorney Debbie Silberman told KIRO.
Independent researcher Genevieve Oh told The Associated Press in December more than 143,000 new deepfake videos were posted online in 2023, more than all previous years combined. Several states including Texas, Minnesota, New York, Virginia, Georgia and Hawaii have laws that make creating nonconsensual deepfake pornography a crime, while California and Illinois allow victims to sue perpetrators for damages in civil court.
Parents of victims in Westfield and elsewhere are now lobbying state and federal lawmakers to do more to provide uniform protections for victims.
“We’re fighting for our children,” mother Dorota Mani told the AP. “They are not Republicans, and they are not Democrats. They don’t care. They just want to be loved, and they want to be safe.”