This teen was targeted by deepfake nudes. She hopes new training course will help future victims

By Clare Duffy, CNN

New York (CNN) — When Elliston Berry, then 14 years old, discovered a classmate had made and shared a deepfake nude image of her, she didn’t know where to turn for information on what had happened or how to get the photos removed from social media. Now, she’s pushing to ensure no other young person has to feel the same way.

Berry helped to create an online training course to teach students, parents and school staff about nonconsensual, explicit deepfake image abuse, in partnership with cybersecurity firm Adaptive Security and Pathos Consulting Group.

It’s an increasingly common form of harassment, amid the proliferation of artificial intelligence tools that make creating sexualized deepfakes simple and widely available. Just this week, Elon Musk’s xAI came under fire after its AI chatbot Grok was repeatedly used to create nude or sexualized AI images of women and minors. (xAI has since limited its image generation feature.)

One in eight US teens report personally knowing someone who has been targeted by nude deepfakes, according to research published last year by the non-profit Thorn. That’s despite the Take It Down Act — which President Donald Trump signed into law last year, and for which Berry advocated — making it a crime to share nonconsensual, explicit images, real or computer-generated.

“One of the situations that we ran into was a lack of awareness and a lack of education,” Berry, now 16, told CNN of the leadership at the Texas high school where she was harassed. “They were more confused than we were, so they weren’t able to offer any comfort, any protection to us. That’s why this curriculum is so important … it focuses on the educators so they’re able to help and protect if a victim were to come to them for a situation like this.”

The online course takes about 17 minutes to complete and is designed for middle- to high school-aged students, as well as teachers and parents. It includes lessons on understanding and recognizing AI-generated deepfakes, deepfake sexual abuse and sextortion.

Sextortion, a scheme in which victims are deceived into sending explicit images to online perpetrators and then blackmailed for money or additional graphic content, has affected thousands of teens in recent years and led to multiple suicide deaths.

The course also includes links to support resources from RAINN, as well as information about legal consequences under the Take It Down Act and on how to get images removed. Berry said it took nine months to get the images of her removed from social media. The Take It Down Act now requires platforms to remove such images within 48 hours of being notified of them.

“It’s not just for the potential victims, but it’s also for the potential perpetrators of these types of crimes,” said Adaptive Security CEO Brian Long. “They need to understand that this isn’t a prank, right? … It’s against the law and it’s really, really harmful and dangerous to people.”

Adaptive Security is making the course available for free to schools and parents of young people.

“I know a handful of girls that this has happened to in just the past month,” Berry said. “It is so scary, especially if no one knows what we’re handling. So, I think it’s super important to take initiative, learn more, educate more and have conversations.”

The-CNN-Wire
™ & © 2026 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
