
San Francisco city attorney takes on companies that produce nonconsensual AI-generated pornography


Concerns about artificial intelligence have moved to the forefront of American discourse at a time when disinformation is already plentiful. AI is everywhere, ranging from predictive text that anticipates a user's next word to deliberately misleading images, videos and memes.

But there's an even more nefarious use of AI: so-called deepfake pornography, which creates sexually explicit images of real people without their consent.

On Thursday, the office of San Francisco City Attorney David Chiu filed what it says is a first-of-its-kind lawsuit against websites that create and distribute nonconsensual pornography using AI. Chiu alleges that these sites not only "undress" adults, but also manipulate and create pornographic images of children.

Deepfake porn images have affected celebrities as well, including Gal Gadot, Natalie Portman, Emma Watson, and Scarlett Johansson.

"The proliferation of nonconsensual deepfake pornographic images has exploited real women and girls across the globe," said a statement released by Chiu. "This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation."

The lawsuit, filed in San Francisco Superior Court on behalf of the People of the State of California, alleges violations of state and federal laws prohibiting deepfake pornography, revenge pornography, and child pornography, as well as violations of California's Unfair Competition Law.

Some of the 16 companies named in the suit are based in the United Kingdom or Estonia and reach millions of people. One is based in Florida and owned by a Ukrainian businessman; others are in Los Angeles and New Mexico. Several more are referred to only as "Does" because the city attorney's office has not yet been able to identify them.

None of the companies in the suit could be reached for comment on Thursday.

Most of the sites allow users to upload images they would like to "undress," regardless of whether the person doing so has permission to use the image or images. According to Chiu, this means that images of children are also being submitted. The sites rely on open-source AI models that have been adapted and trained to create new versions that are highly effective at generating pornographic content, he said.

"These highly popular fine-tuned versions generate not only pornographic content involving fictitious AI-generated individuals, but also manipulate images of real people to produce fictional pornographic content that depicts those individuals," reads the complaint in the lawsuit. "The models are able to recognize clothing and body features in an image of a person, and can be further conditioned to manipulate the image to generate a fake, photorealistic image that maintains the person's face, but replaces their clothed body with a nude body--thus appearing to ]undress' the person and display their intimate body parts. These models 'undress' or 'nudify' not only adults, but also children."

Such images aren't only created for the user, but can also be used to "bully, threaten or humiliate" women and girls, reads the complaint.

Chiu cites AI-generated nude images that circulated at a middle school in Beverly Hills in February that targeted 16 eighth-grade students.

"Generative AI has enormous promise, but as with all new technologies, there are unintended consequences and criminals seeking to exploit the new technology. We have to be very clear that this is not innovation -- this is sexual abuse," said Chiu.

Chiu's suit calls for the websites to be taken down and for the defendants to be permanently barred from further engaging in the alleged unlawful conduct. The suit also seeks civil penalties and the costs of bringing the lawsuit.

As chief attorney for a large city, Chiu is able to sue entities on behalf of the state, according to his office.

"That is the same thing we did when we sued the opioid industry, fossil fuel industry, gun manufacturers, etc.," said spokesperson Jen Kwart in an email. "We can sue entities outside of California if they are violating the law in California, which these companies are. There have been media reports of victims of this conduct in California."

Copyright Bay City News