Beverly Hills school expels students over deepfake nude photos – NBC News


The Beverly Hills Unified School District voted this week to confirm the expulsion of five middle school students who were accused last month of using generative AI to create and share fake nude images of their classmates, according to the Los Angeles Times and the school board’s meeting minutes.

The case became national news days after Beverly Vista Middle School officials began investigating the incident in February and the Beverly Hills Police Department launched its own criminal investigation, which is ongoing. No arrests have been made and no charges have been brought. 

The five students and their victims were in the eighth grade, according to the school district. Sixteen students were targeted, Superintendent Michael Bregy said in an email to the district community, which was obtained by NBC News. 

“This incident has spurred crucial discussions on the ethical use of technology, including AI, underscoring the importance of vigilant and informed engagement within digital environments,” Bregy wrote. “Furthermore, we recognize that kids are still learning and growing, and mistakes are part of this process. However, accountability is essential, and appropriate measures have been taken.”

The expulsions, which the school district reportedly approved on Wednesday, mark a turning point in how schools have publicly handled deepfake cases so far. The expelled students and their parents did not contest the district’s decision and will not be identified, according to the Los Angeles Times.

The Beverly Hills case followed a string of incidents around the world over the past year involving AI-generated fake nude images of school-age children. The number of cases has exploded as AI technology has reached mainstream audiences, and apps and programs that are specifically designed and advertised to “undress” photos and “swap” victims’ faces into sexually explicit content have proliferated. False and misleading AI-generated images, videos and audio clips are often referred to as “deepfakes.”

Today, it is faster, cheaper, and easier than ever to create sophisticated fake material. 

The same week that the Beverly Hills case became public, NBC News identified ads that ran on Facebook and Instagram throughout February for a deepfake app that “undressed” a photo of an underage teen celebrity. It is already illegal to produce, distribute, receive or possess computer-generated sexually explicit content that features the faces of identifiable children, but that hasn’t stopped such material from being posted online for decades. 

Fake nude images and fake pornographic videos overwhelmingly victimize women and girls, and such material is easily searchable on major social media platforms and search engines.
