Artificial Intelligence

These new tools could help protect our pictures from AI

While nonconsensual deepfake porn has been used to torment women for years, the latest generation of AI makes it an even bigger problem. These systems are much easier to use than previous deepfake tech, and they can generate images that look completely convincing.

Image-to-image AI systems, which allow people to edit existing images using generative AI, “can be very high quality … because it’s basically based off of an existing single high-res image,” Ben Zhao, a computer science professor at the University of Chicago, tells me. “The result that comes out of it is the same quality, has the same resolution, has the same level of details, because oftentimes [the AI system] is just moving things around.” 

You can imagine my relief when I learned about a new tool that could help people protect their images from AI manipulation. PhotoGuard was created by researchers at MIT and works like a protective shield for photos. It alters them in ways that are imperceptible to us but stop AI systems from tinkering with them. If someone tries to edit an image that has been “immunized” by PhotoGuard using an app based on a generative AI model such as Stable Diffusion, the result will look unrealistic or warped. Read my story about it.
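The core idea behind this kind of "immunization" is an adversarial perturbation: nudge each pixel by a tiny, bounded amount chosen to disrupt the model's internal representation of the image. As a rough illustration only, here is a toy sketch of the technique. PhotoGuard actually attacks the image encoder of a diffusion model such as Stable Diffusion; in this sketch the "encoder" is just a stand-in random linear map, so the numbers are hypothetical and nothing here reproduces the real system.

```python
import numpy as np

def immunize(image, steps=50, eps=4 / 255, alpha=1 / 255, seed=0):
    """Toy PGD-style 'immunization': push pixels, within an
    imperceptible L-infinity budget (eps), in a direction that
    scrambles a stand-in encoder's output. Illustration only."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((8, image.size))  # stand-in "encoder"
    x = image.flatten().astype(np.float64)
    delta = np.zeros_like(x)
    for _ in range(steps):
        # Gradient of ||W(x + delta)||^2 with respect to delta:
        # ascending it drives the "embedding" away from the original.
        grad = 2 * W.T @ (W @ (x + delta))
        delta += alpha * np.sign(grad)         # take an ascent step
        delta = np.clip(delta, -eps, eps)      # stay imperceptible
        delta = np.clip(x + delta, 0, 1) - x   # keep pixels valid
    return (x + delta).reshape(image.shape)

img = np.random.default_rng(1).random((8, 8))
protected = immunize(img)
# The change is bounded by eps, so it is invisible to a human viewer
print(np.max(np.abs(protected - img)))
```

The key property is the clipping step: no pixel moves by more than `eps`, so to us the protected photo looks identical to the original, while the model sees something that no longer behaves as expected.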

Another tool that works in a similar way is called Glaze. But rather than protecting people’s photos, it helps artists prevent their copyrighted works and artistic styles from being scraped into training data sets for AI models. Some artists have been up in arms ever since image-generating AI models like Stable Diffusion and DALL-E 2 entered the scene, arguing that tech companies scrape their intellectual property and use it to train such models without compensation or credit.

Glaze, which was developed by Zhao and a team of researchers at the University of Chicago, helps them address that problem. Glaze “cloaks” images, applying subtle changes that are barely noticeable to humans but prevent AI models from learning the features that define a particular artist’s style. 

Zhao says Glaze corrupts AI models’ image generation processes, preventing them from spitting out an infinite number of images that look like work by particular artists. 

PhotoGuard has a demo online that works with Stable Diffusion, and artists will soon have access to Glaze. Zhao and his team are currently beta testing the system and will allow a limited number of artists to sign up to use it later this week. 

But these tools are neither perfect nor enough on their own. You could still take a screenshot of an image protected with PhotoGuard and use an AI system to edit it, for example. And while they show that there are neat technical fixes to the problem of AI image editing, they’re worthless unless tech companies start adopting tools like them more widely. Right now, our images online are fair game to anyone who wants to abuse or manipulate them using AI.