Disruptors

Nightshade, via the University of Chicago’s Glaze Project …

Subverting the AI learning + training process at the pixel level?

However, it’s not anti-AI, though it may sound that way at first … it’s designed to protect artistic vision. Their goal is to “create an ecosystem in which users of image-generating programs would need the approval of rightsholders to get unaltered access to training images.”

“Glaze is a system designed to protect human artists by disrupting style mimicry. At a high level, Glaze works by understanding the AI models that are training on human art, and using machine learning algorithms, computing a set of minimal changes to artworks, such that it appears unchanged to human eyes, but appears to AI models like a dramatically different art style.” – see more here
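That quoted description maps onto a well-known general technique: a small, bounded perturbation optimized in a model’s feature space. Below is a minimal sketch of that idea, assuming a frozen VGG16 stands in for “the AI models that are training on human art”; the function name `cloak` and every hyperparameter (`eps`, `steps`, `lr`) are illustrative stand-ins, not Glaze’s published algorithm or values.

```python
# A toy "style cloak" in the spirit of the quoted description; NOT the
# Glaze Project's actual method, just the general shape of a bounded
# feature-space perturbation.
import torch
import torchvision.models as models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Frozen surrogate feature extractor (assumption: "style" is visible in
# these convolutional features).
vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

def cloak(artwork, style_target, eps=4 / 255, steps=200, lr=1e-2):
    """Find a small per-pixel change (|delta| <= eps) that pulls `artwork`'s
    features toward `style_target`'s, so the image looks unchanged to a human
    but reads as a different style to the surrogate model. Both inputs are
    float tensors of shape [1, 3, H, W] in [0, 1] and must be the same size."""
    delta = torch.zeros_like(artwork, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    target = vgg(style_target).detach()
    for _ in range(steps):
        opt.zero_grad()
        out = vgg((artwork + delta).clamp(0, 1))
        loss = torch.nn.functional.mse_loss(out, target)
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # the "appears unchanged to human eyes" budget
    return (artwork + delta).clamp(0, 1).detach()
```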

Nightshade is a little different: Glaze disrupts how training models interpret an artwork’s style, while Nightshade disrupts how they interpret its content. A toy version of that content attack is sketched below.
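Under that framing, the same optimization can illustrate the Nightshade idea: instead of anchoring the perturbation to a different style, anchor it to a different subject, so an image still captioned “dog” carries the features of a cat. This reuses the hypothetical `cloak` sketch above, and the file names are placeholders; it is not the project’s actual algorithm.

```python
# Toy content poisoning in the spirit of Nightshade, reusing `cloak` and
# `device` from the sketch above; illustrative only.
from torchvision.io import read_image, write_png, ImageReadMode
import torchvision.transforms.functional as TF

def load(path, size=512):
    # uint8 [C, H, W] -> float [1, 3, size, size] in [0, 1]; resizing both
    # images to a common size keeps the VGG feature maps comparable.
    img = TF.resize(read_image(path, ImageReadMode.RGB), [size, size])
    return (img.float() / 255).unsqueeze(0).to(device)

dog = load("dog.png")  # the image to be shaded (placeholder file)
cat = load("cat.png")  # anchor image from a different concept (placeholder)

# Pull the dog image's features toward the cat's, within the pixel budget.
poisoned = cloak(dog, cat)

# Downstream, this file would still be captioned "dog"; that image/caption
# mismatch is what misleads a model trained on it.
write_png((poisoned.squeeze(0) * 255).byte().cpu(), "dog_shaded.png")
```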
