This new data poisoning tool lets artists fight back against generative AI
The tool, called Nightshade, messes up training data in ways that could cause serious damage to image-generating AI models.