A look at Nightshade and Glaze, UChicago researchers' tools that help artists "mask" or even "poison" their work to break AI models later trained on the data (Melissa Heikkilä/MIT Technology Review)

Melissa Heikkilä / MIT Technology Review:

The tool, called Nightshade, messes up training data in ways that could cause serious damage to image-generating AI models.
