University of Chicago researchers seek to “poison” AI art generators with Nightshade

image via arstechnica.com

The open source "poison pill" tool (as the University of Chicago's press department calls it) alters images in ways invisible to the human eye, and those alterations can corrupt an AI model's training process. Many image synthesis models, with the notable exceptions of those from Adobe and Getty Images, are largely trained on data sets of images scraped from the web without artist permission, data that include copyrighted material. (OpenAI licenses some of its DALL-E training images from Shutterstock.)
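To make the "invisible alteration" idea concrete, here is a minimal conceptual sketch of a bounded, per-pixel image perturbation. It is not Nightshade's actual method, which optimizes its perturbations to mislead a model's concept associations during training; the file names, the random noise, and the epsilon bound below are illustrative placeholders only.

```python
# Conceptual sketch: apply a small, visually imperceptible change to an image.
# This is NOT the Nightshade algorithm; it only illustrates how a perturbation
# can stay within a tight per-pixel bound so humans don't notice it.
import numpy as np
from PIL import Image

EPSILON = 4  # max per-channel change out of 255 (placeholder value)

img = np.asarray(Image.open("input.png").convert("RGB"), dtype=np.int16)

# A real poisoning attack optimizes this perturbation against a model;
# random noise is used here purely to show the bounded-change idea.
rng = np.random.default_rng(0)
noise = rng.integers(-EPSILON, EPSILON + 1, size=img.shape, dtype=np.int16)

poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
Image.fromarray(poisoned).save("poisoned.png")
```

Because every pixel moves by at most a few intensity levels, the edited file looks unchanged to a person while still differing numerically from the original, which is the general property a poisoning tool relies on.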

https://arstechnica.com/information-technology/2023/10/university-of-chicago-researchers-seek-to-poison-ai-art-generators-with-nightshade/