zlacker

[return to "We’ve filed a law­suit chal­leng­ing Sta­ble Dif­fu­sion"]
1. dr_dsh+12[view] [source] 2023-01-14 07:17:25
>>zacwes+(OP)
“Stable Diffusion contains unauthorized copies of millions—and possibly billions—of copyrighted images.”

That’s going to be hard to argue. Where are the copies?

“Having copied the five billion images—without the consent of the original artists—Stable Diffusion relies on a mathematical process called diffusion to store compressed copies of these training images, which in turn are recombined to derive other images. It is, in short, a 21st-century collage tool.”

“Diffusion is a way for an AI program to figure out how to reconstruct a copy of the training data through denoising. Because this is so, in copyright terms it’s no different from an MP3 or JPEG—a way of storing a compressed copy of certain digital data.”

The examples of how diffusion is trained (e.g., reconstructing a picture out of noise) will be core to their argument in court. Certainly the goal during training is to reconstruct the original images from noise. But do they exist in SD as copies? I don't know.
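For intuition, here's a toy sketch of the forward-noising step that training learns to invert. Everything here is made up for illustration (the eight-pixel "image", the single mixing parameter) and is a big simplification of the noise schedules real models use:

```python
import random

random.seed(0)

# Toy "image": eight pixel intensities in [0, 1]. Purely illustrative.
image = [0.1, 0.9, 0.4, 0.7, 0.2, 0.8, 0.5, 0.3]

def add_noise(pixels, alpha):
    """One forward-diffusion step: blend the signal with Gaussian noise.

    alpha controls how much of the original survives (a variance-preserving
    mix; real schedules apply many such steps with varying alphas).
    """
    return [alpha * p + (1 - alpha ** 2) ** 0.5 * random.gauss(0, 1)
            for p in pixels]

noisy = add_noise(image, alpha=0.5)

# During training, the network sees `noisy` and is scored on how well it
# predicts the noise that was added. The clean image is a transient
# training target; whether anything like it survives in the weights is
# exactly the question the lawsuit turns on.
```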

2. synu+H4[view] [source] 2023-01-14 07:51:37
>>dr_dsh+12
You could make the same argument that as long as you are using lossy compression you are unable to infringe on copyright.
3. visarg+h6[view] [source] 2023-01-14 08:09:50
>>synu+H4
That's a huge understatement. Five billion images compressed into a roughly 5 GB model works out to about one byte per image. Let's see whether one byte per image would constitute a copyright violation in any field other than neural networks.
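Back-of-envelope version, using the figures quoted upthread (assumed, not measured):

```python
# Rough capacity-per-image arithmetic. Both numbers are the ones quoted
# in this thread: ~5 billion training images, ~5 GB checkpoint.
num_images = 5_000_000_000
model_size_bytes = 5 * 10 ** 9

bytes_per_image = model_size_bytes / num_images
print(bytes_per_image)  # 1.0 -> about one byte of model per training image
```

Of course that's only the average; as the replies point out, the bytes aren't spread evenly.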
4. forgot+29[view] [source] 2023-01-14 08:36:49
>>visarg+h6
The distribution of those bytes matters here. In theory, the model could be overtrained on one copyrighted work such that it is almost perfectly preserved within the weights.
5. synu+8a[view] [source] 2023-01-14 08:49:47
>>forgot+29
You can see this with the Mona Lisa. You could get pretty close reproductions back just by asking for it (or at least you could in one earlier iteration of the model). Likely it overfit because the image is so ubiquitous in the training data.
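A deliberately silly toy "model" (the pixel-wise mean of its training set) shows the duplication effect: the more copies of one work appear in the data, the closer the model drifts toward reproducing it. All numbers here are invented; real diffusion models are vastly more capable, which only makes duplication worse.

```python
def train_mean_model(dataset):
    """'Train' by averaging each pixel position across the dataset."""
    n = len(dataset)
    return [sum(img[i] for img in dataset) / n
            for i in range(len(dataset[0]))]

mona_lisa = [0.9, 0.1, 0.8]                  # stand-in for a duplicated work
others = [[0.2, 0.5, 0.3], [0.6, 0.4, 0.7]]  # the rest of the "dataset"

balanced = train_mean_model(others + [mona_lisa])
skewed = train_mean_model(others + [mona_lisa] * 100)

# With the work appearing once, the model's output is a generic blur of
# everything; with 100 duplicates, the output is nearly that one work.
print([round(p, 2) for p in balanced])
print([round(p, 2) for p in skewed])
```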