We all know that images used in advertisements and on magazine covers are routinely processed with Photoshop. Barring botched jobs that make the manipulation all too obvious, most people can’t tell how (or how much) an image has been retouched. But it seems that AI can help us with that.
AI has been the source of much controversy lately, but not only can it create its own manipulations, it can also be trained to uncover them. A while back, researchers trained an AI to detect Photoshop use, and now someone has tested it on magazine covers featuring some of the most recognizable celebrity beauty icons. (Using Adobe’s image editing software yourself? Check out our picks of the best Photoshop tutorials – or see how to download Photoshop if you still need the software.)
Adobe’s Photoshop is so ubiquitous that it has become a generic verb used to refer to all types of photo editing. Almost all images are edited in some way, whether it’s adjusting colors or removing a background, but the most controversial use of Photoshop is for retouching, especially with human figures.
Many will take it for granted that an image of a celebrity in an ad or on a magazine cover has been manipulated in some way to smooth skin, reduce dark circles, or even shrink certain facial features, but an AI-powered tool developed by Adobe Research together with UC Berkeley can show how much by overlaying a heatmap on the image. The redder the output, the heavier the manipulation.
The tool has been specially trained to detect the use of one of Photoshop’s most powerful features for manipulating faces: Face-Aware Liquify (FAL). The FAL detector is a binary classifier built on a dilated residual network and trained on images processed with the tool (see the research paper). It can even attempt to undo the FAL manipulation and reconstruct the unedited face.
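The article only describes the detector at a high level. As a rough illustration of the two ideas it names – dilated convolutions, which grow the receptive field without shrinking the image, and a per-pixel score that can be rendered as a heatmap – here is a minimal NumPy sketch. The kernel values, layer count, and sigmoid readout are illustrative assumptions, not the actual Adobe/Berkeley model.

```python
import numpy as np

def dilated_conv2d(img, kernel, dilation=1):
    """'Same'-size 2-D convolution with a dilated 3x3 kernel (zero padding).

    A dilation of d samples the kernel taps d pixels apart, so the
    receptive field grows with d while the output stays image-sized.
    """
    k = kernel.shape[0]
    pad = dilation * (k // 2)
    padded = np.pad(img, pad)
    out = np.zeros_like(img, dtype=float)
    for i in range(k):
        for j in range(k):
            di, dj = i * dilation, j * dilation
            out += kernel[i, j] * padded[di:di + img.shape[0],
                                         dj:dj + img.shape[1]]
    return out

def manipulation_heatmap(img, kernels, dilations=(1, 2, 4)):
    """Stack dilated convolutions with ReLUs, then squash each pixel
    to a score in [0, 1] with a sigmoid - the 'redness' of the heatmap."""
    x = img.astype(float)
    for kern, d in zip(kernels, dilations):
        x = np.maximum(dilated_conv2d(x, kern, dilation=d), 0.0)  # ReLU
    return 1.0 / (1.0 + np.exp(-x))  # sigmoid per-pixel score
```

A real detector would learn the kernels from pairs of original and FAL-warped faces (and use residual connections, many channels, and far deeper stacks); this sketch only shows why the heatmap ends up the same size as the input while each score still reflects a wide neighborhood.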
To test the tool, Within Health, an online treatment service for eating disorders, used it on magazine covers, including 20 featuring closeups of Jennifer Aniston. Clear use of Photoshop’s FAL was detected in 50% of the samples. The most edited parts of Aniston’s face were her jawline and chin (both manipulated in six out of ten covers), while her bottom lip was altered in three.
“Aniston has long been admired for her beauty. For a lot of women, her facial features are iconic,” says Within. “Unfortunately, decision makers in magazines, advertising agencies and even the press routinely photoshop her look and make editorial decisions about which parts of Jennifer’s face don’t match their idea of perfect beauty.”
Within found similar results when testing magazine covers featuring Angelina Jolie: again, 10 out of 20 covers showed clear use of FAL, with five of the 10 manipulating her jaw. Far fewer cases were identified when the tool was tested on images of Beyoncé, but this may be partly because magazines tended to use full-body images of Queen Bey, and because the AI model powering the FAL detector was trained on a dataset of Flickr images that is biased toward white faces. But some examples were found nonetheless.
Within notes that new tools for photo and video manipulation are constantly emerging, and calls the FAL detector a “much-needed weapon in the fight against the increasingly unrealistic representation of beauty.” It believes that seeing how images are manipulated, understanding the motives and goals behind the manipulation, and improving media and content literacy can all help counter the problem.
While this test used an AI model trained to detect the use of one specific Photoshop tool, similar detectors could be trained for other editing techniques. The development is also of interest given the tremendous advances in text-to-image AI generation and deepfakes. Many have suggested that as AI-generated images become increasingly difficult to detect, we desperately need tools that can tell whether an image is real, edited, or an AI-generated fake.
FAL Detector is available on GitHub.