The free tool Glaze is designed to protect artists' styles from being copied by AI image generators, and demand for it has surged rapidly. The tool adds a nearly invisible layer of noise to images to prevent AI systems from mimicking an artist's style.
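As a rough illustration of the perturbation idea only (not Glaze's actual algorithm), the sketch below adds a small, bounded noise pattern to an image's pixel values. The file names and the `EPSILON` budget are assumptions for the example; a real style-cloaking tool optimizes the perturbation against a style feature extractor rather than using random noise.

```python
import numpy as np
from PIL import Image

# Conceptual sketch: show how a nearly invisible, bounded change can be
# applied to an image. Glaze itself computes an *optimized* perturbation
# against a style feature extractor; random noise is used here purely to
# illustrate the pixel-level budget.
EPSILON = 4  # assumed max per-pixel change on a 0-255 scale

def cloak_image(path_in: str, path_out: str) -> None:
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    # Bounded random perturbation; a real cloak would come from an
    # optimization that maximizes distortion in style-feature space.
    noise = np.random.randint(-EPSILON, EPSILON + 1, size=img.shape, dtype=np.int16)
    cloaked = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(cloaked).save(path_out)

if __name__ == "__main__":
    # Hypothetical file names for demonstration.
    cloak_image("artwork.png", "artwork_cloaked.png")
```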


Product Entry: https://top.aibase.com/tool/glaze

According to Glaze developer Ben Zhao, WebGlaze, the web-based version of the tool, has seen an enormous surge in access requests since Meta announced its plan to use user data for AI training.

Artists sometimes have to wait for weeks or even months to gain access, as the Glaze project manually reviews each application to ensure that applicants are real individuals and the tool is not being misused.

At the same time, security researchers have discovered methods to bypass Glaze's protection. Although Zhao and his team have made changes to make such attacks harder, doubts remain about Glaze's effectiveness, especially since the researchers who developed the attack have criticized the changes as insufficient.

Highlights:

🖼️ The Glaze tool is popular among artists for preventing AI from copying their artistic styles

🔒 The demand for the Glaze tool has surged due to Meta's plan to use user data for AI training

⚙️ Security researchers have found methods to bypass Glaze's protection, raising doubts about its effectiveness