zlacker

[return to "We’ve filed a lawsuit challenging Stable Diffusion"]
1. dr_dsh+12 2023-01-14 07:17:25
>>zacwes+(OP)
“Stable Diffusion contains unauthorized copies of millions—and possibly billions—of copyrighted images.”

That’s going to be hard to argue. Where are the copies?

“Having copied the five billion images—without the consent of the original artists—Stable Diffusion relies on a mathematical process called diffusion to store compressed copies of these training images, which in turn are recombined to derive other images. It is, in short, a 21st-century collage tool.”

“Diffusion is a way for an AI program to figure out how to reconstruct a copy of the training data through denoising. Because this is so, in copyright terms it’s no different from an MP3 or JPEG—a way of storing a compressed copy of certain digital data.”

The examples of training diffusion (e.g., reconstructing a picture out of noise) will be core to their argument in court. During training, the objective certainly is to reconstruct the original images from noise. But do those images exist in the released SD model as copies? I'm not sure.
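
For reference, here's a minimal sketch of a DDPM-style training step in PyTorch (the `model` call signature and the schedule handling are my assumptions; this is toy code, not SD's actual implementation). The point is that the loss compares the network's output to freshly sampled noise, and what training updates is one shared set of weights, not a per-image store:

    import torch

    def ddpm_loss(model, x0, alphas_cumprod):
        """One training step: corrupt a batch of images x0 with noise,
        then ask the model to predict that noise. The regression target
        is the noise itself, not a stored copy of the image."""
        # Pick a random timestep per image and sample Gaussian noise.
        t = torch.randint(0, len(alphas_cumprod), (x0.shape[0],))
        noise = torch.randn_like(x0)
        # Forward diffusion: blend image and noise per the schedule.
        a = alphas_cumprod[t].view(-1, 1, 1, 1)
        x_t = a.sqrt() * x0 + (1 - a).sqrt() * noise
        # Loss is on the predicted noise; gradients flow into shared weights.
        return torch.nn.functional.mse_loss(model(x_t, t), noise)

Rough arithmetic points the same way: an SD v1 checkpoint is on the order of 4 GB against billions of training images, so well under a byte per image on average, which is hard to square with "compressed copies" in the MP3/JPEG sense. (Numbers are approximate.)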

2. anothe+06 2023-01-14 08:07:50
>>dr_dsh+12
It's going to be very hard for them to argue against Stable Diffusion without reaching the conclusion that people looking at art are doing exactly what training the AI did.

You looked at my art, now I can use copyright against the copies in your brain.

3. visarg+z6 2023-01-14 08:12:09
>>anothe+06
By forcing the AI community to develop technology to avoid replication of training examples, they might end up exposing every bit of human copying as well. Whatever can detect copyright violations in AI output can be applied to human works too.
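
To make that concrete, here's a toy sketch of the kind of near-duplicate check such a pipeline might run (the embedding source, the threshold, and all names are illustrative assumptions on my part, not any real system's API):

    import torch

    def is_near_duplicate(gen_emb, train_embs, threshold=0.95):
        """Flag an image whose embedding sits too close to any training
        embedding. gen_emb: (D,), train_embs: (N, D)."""
        # Cosine similarity via normalized dot products.
        gen = gen_emb / gen_emb.norm()
        train = train_embs / train_embs.norm(dim=1, keepdim=True)
        return bool((train @ gen).max() >= threshold)

Nothing in a check like this cares whether the query image came out of a model or off a human's easel; point it at human works and it flags the same kinds of copying.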