A new tool called Nightshade lets artists inject invisible changes into the pixels of their images, corrupting what AI models learn from them. The tool could shift the balance of power between AI companies and artists, though it also carries a risk of misuse. Nightshade subtly alters an image's pixels so that models trained on it learn incorrect labels for objects and scenery — for example, learning to identify a dog as a cat. The researchers have submitted their work on Nightshade for peer review at the Usenix computer security conference.