Nightshade, a project from the University of Chicago, gives artists some recourse against unauthorized scraping by “poisoning” image data: it subtly alters an image’s pixels so that models trained on it learn corrupted associations, rendering the data useless or disruptive to AI model training. Ben Zhao, the computer science professor who led the project, compared Nightshade to “putting hot sauce in your lunch so it doesn’t get stolen from the workplace fridge.”
https://techcrunch.com/2024/01/26/nightshade-the-tool-that-poisons-data-gives-artists-a-fighting-chance-against-ai/