While nonconsensual deepfake porn has been used to torment women for years, the latest generation of AI makes it an even bigger problem. These systems are much easier to use than earlier deepfake tech, and they can generate images that look completely convincing.
Image-to-image AI systems, which let people edit existing images using generative AI, “can be very high quality … because it’s basically based off of an existing single high-res image,” Ben Zhao, a computer science professor at the University of Chicago, tells me. “The result that comes out of it is the same quality, has the same resolution, has the same level of details, because oftentimes [the AI system] is just moving things around.”
You can imagine my relief when I learned about a new tool that could help people protect their images from AI manipulation. PhotoGuard was created by researchers at MIT and works like a protective shield for photos. It alters them in ways that are imperceptible to us but stop AI systems from tinkering with them. If someone tries to edit an image that has been “immunized” by PhotoGuard using an app based on a generative AI model such as Stable Diffusion, the result will look unrealistic or warped. Read my story about it.
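The idea behind this kind of "immunization" is an adversarial perturbation: nudge each pixel by a tiny, bounded amount so a human sees no difference, but the representation the AI model works from is pushed somewhere unhelpful. The sketch below is a toy illustration of that technique, not the MIT implementation: the "encoder" is a hypothetical random linear map standing in for a diffusion model's real image encoder, and a simple projected-gradient loop steers the encoded latent toward a meaningless target while keeping every pixel change within a small budget.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a diffusion model's image encoder:
# a fixed random linear map (real systems use a deep network).
W = rng.normal(size=(16, 64))

def encode(x):
    return W @ x

x = rng.uniform(0, 1, size=64)   # the "image", as a flat pixel vector in [0, 1]
z_target = np.zeros(16)          # target latent: a deliberately useless embedding

eps = 0.03      # per-pixel perturbation budget (imperceptibility constraint)
alpha = 0.005   # gradient step size
delta = np.zeros_like(x)

for _ in range(200):
    # Gradient of ||encode(x + delta) - z_target||^2 with respect to delta
    grad = 2 * W.T @ (encode(x + delta) - z_target)
    # Step the latent toward the useless target, then project the
    # perturbation back into the [-eps, eps] budget and valid pixel range.
    delta = np.clip(delta - alpha * np.sign(grad), -eps, eps)
    delta = np.clip(x + delta, 0, 1) - x

immunized = x + delta
# Each pixel moved by at most eps, but the latent the "model" sees
# is now measurably closer to the meaningless target.
```

The design point this illustrates: the defender never touches the editing app itself, only the image, exploiting the fact that the model's encoder is sensitive to patterns humans cannot see.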
Another tool that works in a similar way is called Glaze. But rather than protecting people’s photos, it helps artists prevent their copyrighted works and artistic styles from being scraped into training data sets for AI models. Some artists have been up in arms ever since image-generating AI models like Stable Diffusion and DALL-E 2 entered the scene, arguing that tech companies scrape their intellectual property and use it to train such models without compensation or credit.
Glaze, which was developed by Zhao and a team of researchers at the University of Chicago, helps them address that problem. Glaze “cloaks” images, applying subtle changes that are barely noticeable to humans but prevent AI models from learning the features that define a particular artist’s style.
Zhao says Glaze corrupts AI models’ image generation processes, preventing them from spitting out an infinite number of images that look like work by particular artists.
PhotoGuard has a demo online that works with Stable Diffusion, and artists will soon have access to Glaze. Zhao and his team are currently beta testing the system and will allow a limited number of artists to sign up to use it later this week.
But these tools are neither perfect nor sufficient on their own. You could still take a screenshot of an image protected with PhotoGuard and use an AI system to edit it, for example. And while they prove that there are neat technical fixes to the problem of AI image editing, they’re worthless on their own unless tech companies start adopting tools like them more widely. Right now, our images online are fair game to anyone who wants to abuse or manipulate them using AI.