This new data poisoning tool lets artists fight back against generative AI


Nightshade is a new tool that lets artists fight back against AI companies that use their work to train generative models without permission. It “poisons” the training data by subtly altering an image’s pixels, changes meant to go unnoticed by a human viewer but to confuse a model trained on the image and prevent it from copying the artist’s work. The aim is to shift power back to artists, helping them protect their intellectual property and stopping large AI companies like Google and Meta from taking advantage of them.
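
Nightshade’s own algorithm isn’t detailed here, but the general shape of pixel-level data poisoning can be sketched. The snippet below is a loose illustration, not Nightshade’s method: it adds a small, bounded perturbation to an image so it still looks unchanged to a viewer. The file names and the `epsilon` bound are hypothetical, and the random noise stands in for the model-targeted optimization a real poisoning attack would compute.

```python
# Illustrative sketch only, not Nightshade's algorithm: a generic
# bounded pixel perturbation. Random noise stands in for the
# model-targeted optimization a real poisoning attack would use.
import numpy as np
from PIL import Image

def perturb_image(src: str, dst: str, epsilon: float = 4.0) -> None:
    """Write a copy of `src` with each RGB value shifted by at most
    `epsilon`, small enough to be hard to see with the naked eye."""
    pixels = np.asarray(Image.open(src).convert("RGB"), dtype=np.float32)
    noise = np.random.uniform(-epsilon, epsilon, size=pixels.shape)
    poisoned = np.clip(pixels + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(dst)

# Hypothetical file names, for illustration only.
perturb_image("artwork.png", "artwork_poisoned.png")
```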
