San Francisco is taking a big stand against a troubling trend involving artificial intelligence (AI). The city’s attorney, David Chiu, has filed a lawsuit against 16 websites that let people use AI to create fake nude images of women and young girls. Here’s what’s happening:

The Lawsuit Explained
On August 15, San Francisco City Attorney David Chiu announced that he’s suing the owners of 16 websites. These sites have been allowing users to upload pictures of people and then use AI to create fake, nude versions of them. This means someone can take an ordinary photo and turn it into a very realistic pornographic image without the person’s consent.

Who’s Behind the Lawsuit?
The lawsuit claims that the website owners operate from several places, including Los Angeles, New Mexico, the United Kingdom, and Estonia. The complaint says these sites have violated California and federal laws on deepfake pornography, revenge pornography, and child sexual abuse material. According to the lawsuit, these websites received an astonishing 200 million visits in just the first half of this year.

What the Websites Do
Some of these sites boast that they let users “see anyone naked” or suggest it’s better to use their service instead of dating someone. This kind of language shows how these websites exploit people, turning their images into something they never agreed to.

How the AI Is Misused
The AI used on these websites is trained on inappropriate content, including porn and child sexual abuse material. This means that the AI can create images that look almost real, even though they are entirely fake. Some websites limit their images to adults, but others also create fake nude images of children, which is especially disturbing.

Real-Life Impact
The lawsuit highlights several disturbing cases. In February, fake nude images of 16 eighth-grade students, typically 13 or 14 years old, were shared at a California middle school. In June, a teenager in Australia was arrested for spreading fake nude images of high school students.
City Attorney Chiu has expressed his horror at the situation. He says that while AI has many positive uses, it is being misused to exploit and harm real people, including children. “We must make it clear that this is not innovation—this is sexual abuse,” Chiu said.

What’s Next?
San Francisco’s lawsuit is a major step in fighting back against the abuse of AI technology. By targeting these websites, the city hopes to stop the spread of fake, harmful images and protect the victims who have been affected. This case highlights the need to address the darker side of technological advances and ensure they are used responsibly.