zlacker

1. immibi+(OP) 2025-01-03 06:39:55
For images, there's Nightshade, which imperceptibly alters your images but makes them poison for AI training (does anyone understand why?)

I don't know if there's something similar for text. You could try writing nonsense in a color that doesn't contrast with the background.

The evidence that Nightshade works is that AI companies want to make it illegal.

replies(2): >>kelsey+zf >>rcxdud+Rl
2. kelsey+zf 2025-01-03 09:36:54
>>immibi+(OP)
Link to Nightshade: https://nightshade.cs.uchicago.edu/whatis.html

This is fascinating. It would be great to have a web interface artists could use without having to install the software locally.

3. rcxdud+Rl 2025-01-03 10:54:20
>>immibi+(OP)
Nightshade and Glaze are basically adversarial attacks on commonly used subcomponents of image generators, most notably CLIP, which is used both to caption and filter training data and as part of the generation pipeline.
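
For intuition, here's a minimal sketch of what "adversarial attack on CLIP" means here. This is NOT Nightshade's actual algorithm; it just nudges an image, within a small pixel budget, so that CLIP's image embedding drifts toward an unrelated text prompt. The checkpoint name and the eps/steps values are assumptions for illustration:

  import torch
  from transformers import CLIPModel, CLIPProcessor

  # Illustrative PGD-style sketch only -- not Nightshade's actual method.
  # Push the image's CLIP embedding toward an unrelated text prompt while
  # keeping the pixel change inside a small L-infinity budget.
  MODEL = "openai/clip-vit-base-patch32"  # assumed checkpoint
  model = CLIPModel.from_pretrained(MODEL).eval()
  processor = CLIPProcessor.from_pretrained(MODEL)
  for p in model.parameters():
      p.requires_grad_(False)  # only the perturbation gets gradients

  # CLIP's input normalization constants
  MEAN = torch.tensor([0.48145466, 0.4578275, 0.40821073]).view(1, 3, 1, 1)
  STD = torch.tensor([0.26862954, 0.26130258, 0.27577711]).view(1, 3, 1, 1)

  def poison(pixels, target_text, eps=4 / 255, steps=40):
      """pixels: (1, 3, 224, 224) tensor in [0, 1]."""
      toks = processor(text=[target_text], return_tensors="pt")
      with torch.no_grad():
          target = model.get_text_features(**toks)
          target = target / target.norm(dim=-1, keepdim=True)
      delta = torch.zeros_like(pixels, requires_grad=True)
      for _ in range(steps):
          adv = ((pixels + delta).clamp(0, 1) - MEAN) / STD
          emb = model.get_image_features(pixel_values=adv)
          emb = emb / emb.norm(dim=-1, keepdim=True)
          loss = -(emb * target).sum()  # maximize cosine similarity
          loss.backward()
          with torch.no_grad():
              delta -= (eps / 4) * delta.grad.sign()  # signed gradient step
              delta.clamp_(-eps, eps)                 # imperceptibility budget
              delta.grad.zero_()
      return (pixels + delta).clamp(0, 1).detach()

An image shifted this way still looks like the original to a human, but embeds like the target concept, so if it gets scraped it contributes mislabeled training signal.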

Like most adversarial attacks, they get more perceptible the more robust they try to be to transformations of the data (in practice, both of them, applied at a strength that isn't trivially removable, ironically tend to make images look like slightly janky AI art). And they're specific to the network(s) they target, so it's more of a temporary defense against the current generation than long-term protection.
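
The robustness/perceptibility tradeoff falls straight out of the optimization: to survive resizing, cropping, recompression, etc., the perturbation has to be optimized in expectation over those transforms (EOT), which forces a larger, more visible budget. A sketch of that averaging step, reusing the cosine objective above (the transform set and sample count are illustrative assumptions):

  import torch
  import torchvision.transforms as T

  # Expectation over transformations (EOT): average the attack objective
  # over random views so the perturbation survives resizing/cropping/noise.
  # The price is a larger (more visible) perturbation budget.
  augment = T.Compose([
      T.RandomResizedCrop(224, scale=(0.8, 1.0), antialias=True),
      T.Lambda(lambda x: (x + 0.01 * torch.randn_like(x)).clamp(0, 1)),
  ])

  def eot_loss(pixels, delta, embed_fn, target, samples=8):
      """Same cosine objective as above, averaged over random transforms."""
      total = 0.0
      for _ in range(samples):
          view = augment((pixels + delta).clamp(0, 1))
          emb = embed_fn(view)  # e.g. CLIP image features
          emb = emb / emb.norm(dim=-1, keepdim=True)
          total = total - (emb * target).sum()
      return total / samples

And because the gradients come from one particular encoder, a retrained or architecturally different model sees a different loss surface, which is why the protection is generation-specific.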
