San Francisco sues AI websites that make nude deepfakes of women and girls

Nearly a year after AI-generated nude images of high school girls upended a community in southern Spain, a juvenile court this summer sentenced 15 of the girls' classmates to a year of probation.

But the artificial intelligence tool used to create the harmful deepfakes is still easily accessible on the internet, promising to “undress any photo” uploaded to the website within seconds.

Now a new effort to shut down the app and others like it is being pursued in California, where San Francisco this week filed a first-of-its-kind lawsuit that experts say could set a precedent but will also face many hurdles.

“The proliferation of these images has exploited a shocking number of women and girls across the globe,” said David Chiu, the elected city attorney of San Francisco who brought the case against a group of widely visited websites tied to entities in California, New Mexico, Estonia, Serbia, the United Kingdom and elsewhere.

“These images are used to bully, humiliate and threaten women and girls,” he said in an interview with The Associated Press. “And the impact on the victims has been devastating on their reputation, mental health, loss of autonomy, and in some instances, causing some to become suicidal.”

The lawsuit brought on behalf of the people of California alleges that the services broke numerous state laws against fraudulent business practices, nonconsensual pornography and the sexual abuse of children. But it can be hard to determine who runs the apps, which are unavailable in phone app stores but still easily found on the internet.

Contacted late last year by the AP, one service claimed by email that its “CEO is based and moves throughout the USA” but declined to provide any evidence or answer other questions. The AP is not naming the specific apps being sued so as not to promote them.

“There are a number of sites where we don’t know at this moment exactly who these operators are and where they’re operating from, but we have investigative tools and subpoena authority to dig into that,” Chiu said. “And we will certainly utilize our powers in the course of this litigation.”

Many of the tools are being used to create realistic fakes that “nudify” photos of clothed adult women, including celebrities, without their consent. But they have also popped up in schools around the world, from Australia to Beverly Hills in California, typically with boys creating the images of female classmates that then circulate through social media.

In Almendralejo, Spain, the site of one of the first widely publicized cases last September, a physician whose daughter was among the victims helped bring the abuse to public attention. She said she is satisfied with the severity of the sentence the classmates received in the court decision earlier this summer.

But it is “not only the responsibility of society, of education, of parents and schools, but also the responsibility of the digital giants that profit from all this garbage,” Dr. Miriam Al Adib Mendiri said in an interview Friday.

She applauded San Francisco’s action but said more efforts are needed, including from bigger companies like California-based Meta Platforms and its subsidiary WhatsApp, which was used to circulate the images in Spain.

While schools and law enforcement agencies have sought to punish those who make and share the deepfakes, authorities have struggled with what to do about the tools themselves.

In January, the executive branch of the European Union explained in a letter to a Spanish member of the European Parliament that the app used in Almendralejo “does not appear” to fall under the bloc’s sweeping new rules for bolstering online safety because it is not a big enough platform.

Organizations that have been tracking the growth of AI-generated child sexual abuse material will be closely following the San Francisco case.

The lawsuit “has the potential to set legal precedent in this area,” said Emily Slifer, the director of policy at Thorn, an organization that works to combat the sexual exploitation of children.

A researcher at Stanford University said that because so many of the defendants are based outside the U.S., it will be harder to bring them to justice.

Chiu “has an uphill battle with this case, but may be able to get some of the sites taken offline if the defendants running them ignore the lawsuit,” said Stanford’s Riana Pfefferkorn.

She said that could happen if the city wins by default in their absence and obtains orders affecting domain-name registrars, web hosts and payment processors “that would effectively shutter those sites even if their owners never appear in the litigation.”

—Matt O’Brien and Haleluya Hadero, Associated Press

https://www.fastcompany.com/91175100/san-francisco-ai-websites-nude-deepfakes-women-girls-lawsuit?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Created Aug 19, 2024, 6:10:05 PM