According to media reports, photographers have complained that Meta is erroneously adding the "Made with AI" label to their genuine photographs. Over the past few months, several have shared examples, most recently a basketball game photo taken by former White House photographer Pete Souza that Meta tagged as AI-generated.

In another recent case, Meta incorrectly applied the label to an Instagram photo of the Kolkata Knight Riders winning a cricket match in the Indian Premier League. Interestingly, as with Souza's photo, the label only appears when the image is viewed on a mobile device, not on the web.

Souza said he attempted to remove the label but was unsuccessful. He speculated that using Adobe's cropping tool and then compressing the image into a JPEG might have triggered Meta's algorithm.

Meta also applies the "Made with AI" label to real photos in which photographers have used generative AI tools (such as Adobe's Generative Fill) only to remove minor objects, according to PetaPixel. The publication tested this by using Photoshop's Generative Fill tool to remove a spot from an image; after it was uploaded to Instagram, Meta tagged it as AI-generated. Curiously, when PetaPixel brought the file back into Photoshop, copied it, pasted it into a blank document, and saved that copy, Meta did not add the "Made with AI" label. Multiple photographers have complained that such minor edits are unfairly marked as AI-generated.

Photographer Noah Kalina wrote on Threads: "If 'retouched' photos are all labeled 'Made with AI', then the term essentially loses its meaning. If they were serious about protecting people, they might as well automatically tag every photo as 'not a true representation'."

Meta spokesperson Kate McLaughlin said in a statement that the company is aware of the issue and is evaluating its approach "so that our labels reflect the amount of AI used in an image." "We rely on industry-standard indicators that other companies include in their tools, so we are actively working with these companies to improve the process and align our labeling approach with our intent," McLaughlin added.

In February of this year, Meta announced that it would begin adding the "Made with AI" label to photos uploaded to Facebook, Instagram, and Threads ahead of election season. Specifically, the company said it would label AI-generated photos created with tools from Google, OpenAI, Microsoft, Adobe, Midjourney, and Shutterstock.

Meta has not disclosed exactly what triggers the "Made with AI" label, but all of these companies have either added, or are in the process of adding, metadata to image files to indicate when AI tools were used, and that metadata is one of the ways Meta identifies AI-generated photos. For example, Adobe introduced its Content Credentials system last year, which embeds information about a piece of content's origin in its metadata.
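Meta's exact detection logic is not public, but the general idea of checking a file for embedded provenance metadata can be sketched. The snippet below is a rough illustration only, not Meta's method; the marker strings it searches for are assumptions based on common C2PA and Adobe XMP conventions, and the file name is hypothetical.

```python
# Rough illustration: scan an image file's raw bytes for provenance markers.
# This is NOT Meta's actual detection logic; the marker strings are
# assumptions based on common C2PA / Adobe XMP conventions and may
# under- or over-match real Content Credentials data.

from pathlib import Path

PROVENANCE_MARKERS = [
    b"c2pa",           # labels used in C2PA (Content Credentials) manifests
    b"xmp.did",        # Adobe XMP document IDs, a hint of an editing history
    b"adobe:ns:meta",  # start of an embedded XMP packet
]


def find_provenance_markers(path: str) -> list[str]:
    """Return which known markers appear in the file's raw bytes."""
    data = Path(path).read_bytes().lower()
    return [marker.decode() for marker in PROVENANCE_MARKERS if marker in data]


if __name__ == "__main__":
    hits = find_provenance_markers("photo.jpg")  # hypothetical file name
    if hits:
        print("Embedded provenance metadata found:", ", ".join(hits))
    else:
        print("No obvious provenance markers in this file.")
```

This would be consistent with PetaPixel's test: copying the pixels into a new document and saving produces a file without the embedded metadata, which may explain why the re-saved image escaped the label.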

Key points:

- Multiple photographers have complained about Meta erroneously marking real photos as "Made with AI".

- Photos edited with standard tools, such as Adobe's cropping tool, appear to be affected.

- Meta also applies the label to real photos in which generative AI tools, such as Generative Fill, were used only for minor edits.