"Poisoned" pixels. New Nightshade tool protects artists' works from AI.
Researchers have developed a new tool that acts as "poison" for generative AI. Nightshade lets artists alter their images so that they become useless as training data for algorithms, while a human will not notice the difference.
Creators of digital art may have found an answer to the unauthorized use of their work by generative AI models. A team from the University of Chicago has presented Nightshade - a tool that "poisons" images with pixel-level changes invisible to the human eye, making them useless for AI models. The team's lead researcher, Professor Ben Zhao, hopes the program will put control back in artists' hands.
Currently, generative AI tools such as DALL-E, Midjourney or Stable Diffusion are trained on data scraped from the internet without respecting copyright, without paying fees for the works used, and without the consent of the artists themselves, many of whom feel threatened.
How does Nightshade work?
Nightshade is designed to address this problem. The program makes changes to an image's pixels that are invisible to the human eye but corrupt what a model learns from it. A model trained on enough of these samples begins to hallucinate: it may start interpreting an image of a cat as a dog, a hat as a cake, or a handbag as a toaster. Since the data collection process is usually automated, more and more misleading samples can reach the model over time, and removing them is a huge problem, because the model's developers have to find and delete the poisoned images manually.
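To make the idea concrete, below is a minimal, illustrative sketch in Python (PyTorch and torchvision) of the general technique described above: optimizing a tightly bounded pixel perturbation so that an image of one concept ends up with the feature representation of another. This is not the actual Nightshade algorithm; the model, file names, and parameters are assumptions chosen purely for illustration.

```python
# Illustrative sketch of a feature-space poisoning perturbation.
# NOT the real Nightshade algorithm - it only demonstrates the core idea:
# an imperceptibly small pixel change that pushes an image's learned
# representation toward a different concept ("cat" -> "dog").
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Frozen feature extractor standing in for the image encoder an AI trainer might use.
encoder = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()   # expose the 2048-d feature vector instead of class logits
encoder.eval().to(device)
for p in encoder.parameters():
    p.requires_grad_(False)

preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])
mean = torch.tensor([0.485, 0.456, 0.406], device=device).view(1, 3, 1, 1)
std = torch.tensor([0.229, 0.224, 0.225], device=device).view(1, 3, 1, 1)

def embed(x):
    # ImageNet normalization applied here so the perturbation itself
    # stays in plain [0, 1] pixel space.
    return encoder((x - mean) / std)

# "cat.jpg" is the image the artist wants to protect; "dog.jpg" is an example
# of the concept the poisoned features should drift toward (hypothetical files).
cat = preprocess(Image.open("cat.jpg").convert("RGB")).unsqueeze(0).to(device)
dog = preprocess(Image.open("dog.jpg").convert("RGB")).unsqueeze(0).to(device)
target_feat = embed(dog).detach()

eps = 8 / 255                      # L-infinity budget: keeps the change invisible to humans
delta = torch.zeros_like(cat, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-2)

for step in range(300):
    poisoned = (cat + delta).clamp(0, 1)
    # Pull the poisoned image's features toward the target concept.
    loss = torch.nn.functional.mse_loss(embed(poisoned), target_feat)
    opt.zero_grad()
    loss.backward()
    opt.step()
    delta.data.clamp_(-eps, eps)   # project back into the imperceptibility budget

poisoned = (cat + delta).clamp(0, 1).detach()
# To a human, `poisoned` still looks like the original cat; to the encoder its
# features now resemble the dog's, so a model trained on it learns a misleading link.
```

The small L-infinity budget (eps) is what keeps the change below the threshold of human perception; a real system would tune this trade-off between invisibility and poisoning strength, and would target the feature space of the text-to-image models it aims to disrupt.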
Nightshade is an open-source project, so other programmers and researchers can build their own versions and improvements of the method. The researchers are not worried that it will cripple current AI models: these are trained on billions of images, so isolated "poisonings" should not disrupt their general operation. It will, however, make it harder to imitate artists who protect their work with the system.
Nightshade gives artists a way to actively defend against the unauthorized use of their works in AI models. Although a significant number of "poisoned" images is needed to meaningfully affect a model, it is an important step in the fight for artists' rights in the digital age. Representatives of Stability AI have noted that AI model development is moving toward greater diversity and the elimination of biases, but Nightshade shows that artists are not defenseless and can act to protect their work as well.