Nightshade, the tool that allows you to “poison” your images in the eyes of Midjourney or DALL-E, is now available


Corentin Béchade

January 22, 2024 at 9:01 a.m.



Nightshade will “fry” the brains of AIs that try to draw inspiration from your creations © Hadrian / Shutterstock

Faced with the growing popularity of image-generating AI, some artists want to shield their work from these machines’ insatiable appetite. Nightshade does exactly that.

In early January 2024, OpenAI stirred controversy by declaring that it was “impossible to train today’s leading AI models without using copyrighted content”. Many graphic designers, comic book authors and video artists saw this as a convenient justification for plundering their work without consent and building machines capable of spitting out an image imitating their style in a matter of seconds.

Windows and macOS compatible

Faced with this industrial-scale use of copyrighted images, researchers have devised a system for “poisoning” machine learning models without visibly changing anything about an artist’s creations. The tool, nicknamed Nightshade, was long in development and is now available for everyone to download.

Freely accessible on the University of Chicago website, the small program runs on both Mac and Windows. Once the executable is downloaded, you will need to free up at least 4 GB of space on your hard drive so the program can fetch the software libraries it needs to run.

If you already use Glaze, the other tool designed to protect you from AI plundering, the two applications can share part of their resources. The Windows version can also run on your graphics card’s memory, provided the card is one of the Nvidia models listed here. Be aware, though, that given the software’s popularity, downloads may be somewhat slow while the servers catch up.

A cow with “beautiful brown leather handles”

Nightshade is conceived as “an offensive tool to distort a generative AI’s interpretation of an image”. Unlike Glaze, which protects an artist’s style by making the machine believe the image adopts a completely different style from the one actually used, Nightshade makes the AI “see” radically altered content. “Using Nightshade, human eyes will see an image of a cow in a green field, but an AI will see a large leather handbag lying in the grass. […] Trained on enough poisoned images, the AI will eventually be convinced that cows have nice brown leather handles and side pockets, as well as a zipper and maybe a nice logo.”
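Nightshade’s exact algorithm is not published alongside the download, but the general idea, optimizing a barely visible perturbation so that a model’s internal representation of an image drifts toward a different concept, can be sketched in a few lines of PyTorch. The sketch below is purely illustrative and rests on several assumptions: a pretrained ResNet-18 stands in for a real text-to-image model’s image encoder, the file names cow.jpg and handbag.jpg are placeholders, and the perturbation budget and step count are arbitrary.

# Illustrative sketch only: NOT Nightshade's actual algorithm.
# Assumptions: ResNet-18 stands in for a text-to-image model's image
# encoder; "cow.jpg"/"handbag.jpg" are placeholder files; eps, step
# and iters are arbitrary values.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Stand-in encoder: pretrained ResNet-18 with its classifier head removed.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder = torch.nn.Sequential(*list(backbone.children())[:-1]).to(device).eval()
for p in encoder.parameters():
    p.requires_grad_(False)

preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor()])

def embed(x):
    """Flattened feature vector for a batch of images with pixels in [0, 1]."""
    return encoder(x).flatten(1)

cow = preprocess(Image.open("cow.jpg").convert("RGB")).unsqueeze(0).to(device)
bag = preprocess(Image.open("handbag.jpg").convert("RGB")).unsqueeze(0).to(device)

with torch.no_grad():
    target = embed(bag)  # the features the poisoned image should mimic

eps, step, iters = 8 / 255, 1 / 255, 200  # budget, step size, iterations
delta = torch.zeros_like(cow, requires_grad=True)

for _ in range(iters):
    # Pull the perturbed cow's features toward the handbag's features.
    loss = torch.nn.functional.mse_loss(embed(cow + delta), target)
    loss.backward()
    with torch.no_grad():
        delta -= step * delta.grad.sign()
        delta.clamp_(-eps, eps)                 # stay within the budget...
        delta.add_(cow).clamp_(0, 1).sub_(cow)  # ...and keep pixels valid
    delta.grad.zero_()

poisoned = (cow + delta).detach()  # a cow to human eyes, a handbag to the model

Real poisoning is considerably more involved, and Nightshade is built specifically around how text-to-image models pair captions with images during training, but the principle is the one quoted above: a human eye still sees a cow while the model sees a handbag.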

Source: University of Chicago, via VentureBeat


