In response to the spread of deepfake pornography created through the misuse of generative AI, Microsoft has taken significant action. The company announced a partnership with StopNCII (Stop Non-Consensual Intimate Images), an organization that fights the sharing of non-consensual intimate imagery, to give victims tools to remove these synthetic nude images from Bing search results.
Specifically, Microsoft will help victims create digital fingerprints, or "hashes," of these images on their own devices, so the images themselves never have to be uploaded. StopNCII's partners then use these fingerprints to find and remove matching images across participating platforms, including Facebook, Instagram, Bing, and others.
This is a significant step. Microsoft previously offered a direct reporting channel for such images, but it proved inadequate on its own. As the company noted in its blog post, relying solely on user reports is not enough to address the risk of these images being widely surfaced through search engines.
Google's search engine, by contrast, appears to lag on this front. According to media reports, Google has not yet partnered with StopNCII, even though its users have long complained about the severity of the problem.
The problem of deepfake pornography enabled by generative AI remains a serious concern. The United States currently has no federal law specifically targeting such content, leaving enforcement to a patchwork of state and local regulations. However, efforts such as the San Francisco City Attorney's lawsuit against deepfake nude websites may lay the groundwork for more robust legislation.