Recently, Yvonne Meré, the chief deputy city attorney of San Francisco, filed a lawsuit against 16 websites that use artificial intelligence to create deepfake pornographic content. These sites process photos of women and girls into nude images without their consent. This unprecedented legal action aims to curb a growing and harmful trend among teenagers, particularly young boys who use "nudify" apps to manipulate images of their classmates.

AI Face Swap and Facial Recognition

Image source note: the image was generated by AI and licensed from service provider Midjourney.

According to The New York Times, the 16 websites named in the suit were visited 200 million times in the first six months of this year. The companies behind them are located in California, New Mexico, the United Kingdom, and Estonia. Journalists attempted to contact representatives of the sites but either received no response or were refused an interview. Some sites promoted their services with lines like "Want her to strip?" while others directly encouraged users to obtain nude photos of women through their services.

It is worth noting that these websites typically offer the first images free, with subsequent processing charged via cryptocurrency or credit card. The deepfake technology relies on AI models trained on real pornographic images and images of child abuse to generate seemingly authentic nude photos. San Francisco City Attorney David Chiu emphasized that those responsible almost never face punishment, and noted that once the images spread, it becomes very difficult to trace the originating website, complicating victims' pursuit of legal recourse.

Sara Eisenberg, who heads a legal unit focused on major social issues, pointed out that relying solely on teaching teenagers to use technology safely is far from enough: any photo can be processed without authorization, and traditional protective measures no longer work. "Even if children master the skills of using the internet and social media safely," Eisenberg said, "they still cannot stop someone from using these websites to do something truly vile."

The lawsuit seeks not only to shut these websites down but also to permanently bar them from creating deepfake pornographic content, while demanding civil penalties and attorney fees. It alleges that the sites violate state and federal revenge-pornography laws, child-pornography laws, and California's unfair competition law, which prohibits unlawful and unfair business practices.

Meré said that after reading in The New York Times about the harm caused by deepfake images, she immediately contacted Eisenberg and sought Chiu's support in preparing the lawsuit. Chiu noted that deepfake nude photos are now commonplace, with victims ranging from Taylor Swift to ordinary middle-school students, and that almost no one is punished. Experts warn that deepfake pornographic content severely damages victims' mental health, reputations, and college and job prospects.

Although Chiu acknowledges that this approach may turn into a game of whack-a-mole, with new websites springing up as others are shut down, his office hopes to add new sites to the lawsuit as the case evolves. As a center of the AI industry, San Francisco is the right place for this legal battle. Chiu noted that while the AI industry's contributions are positive, deepfake pornographic content is a "dark side" that must be addressed.

Key Points:

🌐 San Francisco sues 16 deepfake pornography websites to protect women's privacy and rights.

💰 The lawsuit demands the closure of these websites along with civil penalties and attorney fees.

📉 Deepfake pornographic content severely harms victims' mental health and social lives, and legal accountability remains difficult.