A 17-year-old New Jersey girl is suing the developer of an AI-powered "clothes removal" app after a classmate allegedly used it to generate fake-nude images of her from a social media photo when she was 14.
Yale Law Group Joins Case Targeting Deepfake Exploitation
The lawsuit, filed by a Yale Law School professor, his students and a trial attorney, accuses AI/Robotics Venture Strategy 3 Ltd.—the company behind the web tool ClothOff—of enabling the creation and distribution of nonconsensual, sexually explicit deepfakes, the Wall Street Journal reported.
The case also names Telegram as a nominal defendant, as the app hosted bots that provided access to ClothOff.
According to the complaint, the teen's Instagram photo, which showed her in a bathing suit, was altered into a realistic nude image shared among male classmates.
The lawsuit demands the deletion of all nonconsensual AI-generated nude images of minors and adults and seeks a court order to remove the software from the internet.
ClothOff did not immediately respond to Benzinga's request for comment.
Developer Denies Wrongdoing, But Concerns Mount
ClothOff's developer, based in the British Virgin Islands and believed to be operated from Belarus, states on its website that its system cannot process images of minors and automatically deletes all data.
However, the plaintiff's lawyers allege the software has been used to create child sexual abuse material, violating federal and state laws.
The teenage boy accused of creating the fake nudes is not named in the current lawsuit, although the plaintiff has filed a separate suit against him.
In their response to that complaint, his attorneys stated that the "defendant is without knowledge or information sufficient to form a belief as to the truth of the allegations."
Rising Pressure To Regulate AI Deepfakes
The case adds to a growing push for regulation amid a surge in AI-generated sexual imagery.
In May, Congress passed the Take It Down Act, which makes it a federal crime to publish nonconsensual intimate imagery—real or AI-generated—and requires platforms to remove such content within 48 hours of a valid complaint.
The plaintiff's filing says she now "lives in constant fear" that her fake image will resurface online.
Disclaimer: This content was partially produced with the help of AI tools and was reviewed and published by Benzinga editors.
© 2025 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.