Researchers have attempted to put artificial intelligence (AI) to good use by building a face anonymization program.
The aim of Face Anonymization Made Simple is to mask a person’s identity without substantially changing the basic image.
At a quick glance, the photos look normal, but it is impossible to identify who the person in them actually is.
Hackaday reports that the paper’s authors fed an image into a diffusion model, which automatically picked out identity-related features and altered them just slightly so the image remains natural-looking.
The identity-related features on a human face are things like the eyes, ears, nose, and mouth. While those features are altered, the other elements of the photo, such as the background, pose, and clothing, remain unchanged.
It is not a novel concept, and AI face anonymizers have been explored before, but the researchers say their method is both simpler than previous approaches and outperforms them.
The researchers used Stable Diffusion, an open-source AI image generator that can be run locally on private hardware, which makes this kind of experimentation possible.
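For readers curious about the building block itself, the sketch below shows one common way to run Stable Diffusion's image-to-image mode locally with Hugging Face's diffusers library. This is not the authors' anonymization pipeline (their method conditions and trains the model differently); the checkpoint name, prompt, and strength value are illustrative assumptions.

```python
# Minimal local Stable Diffusion img2img sketch using the diffusers library.
# Illustrative only -- NOT the paper's anonymization pipeline; the checkpoint,
# prompt, and strength below are placeholder choices.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

# Load a Stable Diffusion checkpoint onto the local GPU.
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Start from an existing photo and let the model re-render it.
init_image = Image.open("face.jpg").convert("RGB").resize((512, 512))

# A lower strength keeps the output closer to the input image.
result = pipe(
    prompt="a photo of a person",
    image=init_image,
    strength=0.35,
    guidance_scale=7.5,
).images[0]

result.save("output.png")
```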
“We have introduced our approach leveraging diffusion models for face anonymization. Our framework eliminates the need for facial key points and masks and relies solely on a reconstruction loss, while still generating images with detailed fine-grained features,” write the researchers.
“Our results show that this method effectively anonymizes faces, preserves attributes, and produces high-quality images. Additionally, our model can use an extra facial image input to perform face swapping tasks, demonstrating its versatility and potential for various facial image processing applications.”
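To make the “reconstruction loss” idea a little more concrete, here is a generic sketch of such a loss in PyTorch. It only illustrates the broad concept of penalizing how far a generated image drifts from a reference image, with no facial key points or masks involved; the paper's actual objective and training code live in its repository and may differ.

```python
import torch
import torch.nn.functional as F

def reconstruction_loss(generated: torch.Tensor, reference: torch.Tensor) -> torch.Tensor:
    """Generic pixel-space reconstruction loss (mean squared error).

    Illustrative only: the idea is to penalize the difference between the
    model's output and a reference image, rather than relying on facial
    key points or masks. The paper's exact formulation may differ.
    """
    return F.mse_loss(generated, reference)

# Toy example with random tensors standing in for image batches (B, C, H, W).
generated = torch.rand(2, 3, 512, 512)
reference = torch.rand(2, 3, 512, 512)
print(reconstruction_loss(generated, reference).item())
```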
The project is available via GitHub.
Privacy in the AI Age
Plenty of people will be interested in this technology. Facial scanning has become widespread, with companies like Clearview AI found to have scraped Google and Facebook to bolster the database powering its facial recognition technology.
If there is a photo of you online, then in all likelihood it has been scraped into some kind of dataset. But even people who aren’t online are not safe. Recently, Harvard students demonstrated the terrifying power of AI smart glasses by making a pair that reveals anyone’s personal information just by looking at them.