Are you a graphic designer or photographer? This magical tool will protect your work from image-generative AI…


Alexandre Schmid

October 24, 2023 at 5:08 p.m.



Image of Emmanuel Macron generated by AI © Midjourney for Clubic.com

Midjourney and other image-generative AIs could become much less effective thanks to Nightshade, a technology that poisons model training data.

AI-powered image generation platforms produce impressive results. The problem is that they were trained on databases of works used without their authors' consent. A new tool designed to deceive these models could call this mode of operation into question.

“A cat? But I asked for a dog”

A team of researchers led by Ben Zhao, a professor at the University of Chicago, developed Nightshade. The tool lets creators protect their art from AI by making invisible changes to an image's pixels before it is published online.

Works processed with Nightshade thus poison the training data of AI models, whose performance can be severely degraded. With Nightshade, a dog becomes a cat, a car becomes a cow, and a hat becomes a cake.
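As a rough illustration only (this is not Nightshade's actual method, which carefully optimizes the perturbation so that models learn the wrong concepts), the underlying idea of a pixel-level change small enough to be invisible to humans can be sketched as:

```python
import numpy as np

def perturb_image(image: np.ndarray, epsilon: float = 4.0, seed: int = 0) -> np.ndarray:
    """Add a small, bounded random perturbation to an image's pixels.

    Hypothetical sketch: Nightshade computes a targeted perturbation,
    not random noise; this only shows that per-pixel changes can stay
    below a visibility threshold (here, +/- epsilon on a 0-255 scale).
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    poisoned = np.clip(image.astype(np.float64) + noise, 0, 255)
    return poisoned.astype(np.uint8)

# A dummy 64x64 RGB image standing in for an artwork.
original = np.full((64, 64, 3), 128, dtype=np.uint8)
poisoned = perturb_image(original)

# The per-pixel change never exceeds epsilon, so it is imperceptible.
max_diff = int(np.abs(poisoned.astype(int) - original.astype(int)).max())
print(max_diff)
```

To a human viewer the two images look identical; the adversarial version of such a perturbation is what misleads a model during training.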

The study indicates that this method is very effective at making image-generative AIs malfunction completely. For example, 300 poisoned samples are enough to disrupt Stable Diffusion. The researchers exploit the models' ability to establish links between concepts, infecting an entire semantic field from a single word. By targeting only the term "dog", Nightshade also infects nearby words such as "puppy", "husky" or "wolf".


A dog becomes a cat with Nightshade © University of Chicago

Glaze integrates Nightshade

Nightshade has been made open source. Its creators encourage other developers to build derivative versions to make the tool even more powerful. Image banks could therefore adopt it to protect the works they host.

Glaze, another protection tool from the University of Chicago, will soon integrate Nightshade. Artists will be able to choose whether or not to protect their creations against generative AI models.

If Nightshade or a similar initiative becomes popular, AI companies will have to adapt or risk seeing their platforms lose reliability and credibility. They could develop tools capable of detecting this layer of protection, then either remove it or exclude the work in question from their model's training. Or they could finally respect artists' rights and sit down at the negotiating table to find a solution that satisfies all parties.

Results with clean model versus poisoned model © University of Chicago


Source: MIT Technology Review


