Their daughter was bullied by fake nude images at school. They're warning others.
Alyssa Goldberg, USA TODAY Tue, April 7, 2026 at 5:59 PM UTC
As artificial intelligence platforms evolve, the prevalence of "nudify" applications, meant to remove someone's clothing, is growing in tandem. School policies, legal recourse and awareness lag far behind, experts say.
She was sitting in the lunchroom when a friend approached her.
“Your nudes got leaked,” the friend said.
But the 16-year-old student had never sent a nude. Instead, there was a deepfake pornographic image of her circulating at her high school.
In her next class, she asked another friend, “Have you heard anything about nude images of me?”
The friend said yes − at a party. Someone showed a photo of a naked body and identified it as the 16-year-old girl.
Her parents told USA TODAY that the deepfake scandal began months prior in December 2024, when their daughter had a falling out with a friend. A rumor spread through the school − located in the Pennsylvania New Hope-Solebury School District − that their daughter was "sending nudes." But until the exchange in the cafeteria in February 2025, her parents say she had no idea explicit photos were being generated and distributed among students.
The impact on the student and her family has been devastating. Once social and bubbly, the now 18-year-old has become reclusive, her parents say, afraid to go to the grocery store or pharmacy out of fear that other patrons have seen the deepfake images. The parents and their legal representative, Matthew Faranda-Diedrich, a partner at the Royer Cooper Cohen Braunfeld law firm, claim the school failed to adequately investigate the deepfake scandal and provide support in the aftermath. They are working with Faranda-Diedrich to initiate legal action against the school district. The family requested to remain anonymous to protect their daughter's privacy.
Lawyers for the school district and representatives for the high school did not return USA TODAY's requests for comment.
The school district's solicitor sent a letter to Faranda-Diedrich in March 2026 that was reviewed by USA TODAY. The letter states that a Title IX investigation conducted by the school did not find proof that perpetrators accused of sharing the photos "actually circulated deepfake nudes and/or rumors of the same."
But the parents pulled their daughter from the public school after what they describe as persistent bullying. Her parents are proud of the progress she's made at her new private school − she's planning to attend senior prom with her new friends after skipping junior prom at her old school − but they say the deepfake scandal still weighs on the entire family.
They aren't alone in facing the impact of deepfake abuse. A growing number of schools across the country are grappling with the rise of deepfakes and "nudify" applications, and parents are often left in the dark, according to AI experts, lawyers and impacted families. For victims of deepfake abuse, the emotional toll can be serious and long-lasting. Often when they reach out for support, they feel alone or misunderstood.
When the 18-year-old's parents share their experience with friends, most respond with shock.
"There's still a lack of awareness around that this can happen to anybody," her father says. "We live in a very resourced school district, a very resourced area of the world. If it can happen here, it can happen anywhere."
Here's what they want other parents to know about deepfake abuse, and how experts are helping schools navigate this rising crisis.
'It's a fake picture, but it's a real life'
The first time parents hear about pornographic deepfakes shouldn't be when their children encounter this type of media or are victimized.
"The school is supposed to be the one that is up to date on these types of things that are happening," the mother says. "There should be outreach to the community, because they do outreach for other things like bullying."
In 2024, the Center for Democracy & Technology surveyed 3,170 K-12 students, teachers and parents to assess the prevalence of AI deepfakes in schools and how prepared schools were to handle cases.
The survey found that 6 in 10 teachers were not aware of school policies and procedures for addressing authentic or deepfake sexual images, and only 16% said their school’s teacher training covered how to protect the privacy of a student depicted in a deepfake. Only 13% of students reported that their school has explained that sharing this type of AI-generated media is harmful to the person depicted.
Evan Harris, a national expert on emerging AI risks in schools at Pathos Consulting Group, says schools already have so much on their plate and are struggling to keep up with this technology. Harris has conducted thousands of webinars to help schools navigate this rising crisis, worked on the ground to workshop policies with various schools and contributed to a free educational program co-led by Elliston Berry, a victim of deepfake abuse, and Adaptive Security, a security awareness training platform.
"A big part of my job is making all of this feel manageable, because if people are in that fight-or-flight mode, they can't really get into anything concrete or actionable," Harris explains. He starts by establishing three buckets: policies, crisis readiness plans, and preventative education for faculty, parents and students.
It's a "shared responsibility" between the school, parents and the students themselves, the Pennsylvania student's father says. "It takes a village."
Her mother also wants students to complete educational trainings that emphasize the "human repercussions of doing this to an individual."
"(Our daughter) loved school, she loved her town," her father says. "And in the end, she ended up having to leave the school and she's afraid to go into town."
"It's a fake picture, but it's a real life," her mother says.
Don't wait to talk to your kids about deepfakes
The traditional parental advice and health class warnings ("don't send nude photos") are no longer sufficient, experts say. An explicit photo can be made and shared of anyone, even if they never took one themselves.
The Pennsylvania student's parents say that keeping an open dialogue with your children is imperative.
"Kids that are going through this need to be heard. They need to be believed," her father says.
"Being able to know what was going on, we were able to help her," he continued. This required tough conversations, but they say this helped prevent their daughter from sinking further into a "dark place."
Harris says that schools and parents need to establish an "anti-judgment culture" by modeling positive responses while educating stakeholders about deepfake abuse − before a crisis arises.
"(Students) have to hear that message that they go to a school where if they come forward, they will be supported, they will be believed and they won't be judged," he says. "Shame is the number one enemy here. That's what we're trying to counter."
Parents and experts call for stronger mandatory reporting laws
The Pennsylvania student's parents say the school took too long to act on multiple complaints about what was happening to their daughter and did not file a ChildLine report, meant to identify potential child abuse, with the state until May 2025.
They want schools to implement stronger protections and reporting practices to prevent deepfake abuse from happening in the first place − and to better support students when it does.
"It's about a lack of understanding," Faranda-Diedrich says. "If someone had walked in and said there's child pornography being spread in the school, there's no way the school would have handled it the way they did."
State Sen. Tracy Pennycuick cosponsored the bill updating Pennsylvania's AI child pornography laws, and is hoping to see her state's bipartisan action extend nationally. In Pennsylvania, she is sponsoring another bill that would tighten reporting requirements for mandatory reporters, such as teachers and school administrators.
Pennycuick wants to leave "no ambiguity" in what mandatory reporters are required to act on: "If you suspect that there is any kind of child sexual abuse material, you report."
The Pennsylvania student's parents want to protect other families from living their nightmare, too.
"Schools need to trust the students that are coming to them and realize it's a new world we're living in," her mother says. "If you hear something, say something. I'm around my children; I hear their friends talking. Start the conversation with them about it. It's uncomfortable, but it's something that's real nowadays."
This story was supported by a grant from the Tarbell Center for AI Journalism. Funders do not provide editorial input.
This article originally appeared on USA TODAY: Their daughter was bullied by deepfake nudes. They're warning others.