zlacker

[return to "We’ve filed a law­suit chal­leng­ing Sta­ble Dif­fu­sion"]
1. Traube+42[view] [source] 2023-01-14 07:18:07
>>zacwes+(OP)
You are literally modern-day Luddites.

If you succeed, you will undo decades of technological progress.

◧◩
2. klabb3+Z3[view] [source] 2023-01-14 07:43:53
>>Traube+42
How? If you want to distribute a commercial non-research model, simply train it on data sets where people have given consent. I doubt that research would be affected.

At most, I’d expect copyright legislation around training to slightly delay commercial mass-deployment. Given the huge socio-technical transition that is ahead of us, it’s probably a good thing to let people have a chance to form an opinion before opening the floodgates. Judging by our transition into ad-tech social media, I’m not exactly confident that we’ll end up in a good place, even if the tech itself has a lot of potential.

◧◩◪
3. dymk+J4[view] [source] 2023-01-14 07:51:59
>>klabb3+Z3
Every professional artist trained their own brain on some number of copyrighted images, without the consent of the original creators.
◧◩◪◨
4. klabb3+gc[view] [source] 2023-01-14 09:11:38
>>dymk+J4
I’m aware of that argument, but it's not a silver bullet. Scale matters. There’s, for instance, a difference between looking at your neighbor's house vs. recording hi-res video of it. Human attention is an incredibly scarce resource, and arguably even sacred.
◧◩◪◨⬒
5. dymk+8K[view] [source] 2023-01-14 14:59:39
>>klabb3+gc
A neural network is anything but a high-res recording of the content it’s trained on. It’s got a particularly low-resolution view (512x512, iirc) of its inputs.
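A rough back-of-envelope supports dymk's point about the model not being a recording. Assuming Stable Diffusion v1's roughly 860M-parameter UNet and a training set on the order of 2.3 billion LAION images (both figures are outside this thread and only approximate), the weights have well under a byte of capacity per training image:

```python
# Back-of-envelope: bytes of model capacity per training image.
# Assumed figures (not from the thread): ~860M UNet parameters,
# ~2.3B training images, fp16 weights (2 bytes per parameter).
params = 860_000_000
bytes_per_param = 2                 # fp16
train_images = 2_300_000_000

model_bytes = params * bytes_per_param
bytes_per_image = model_bytes / train_images
print(f"~{bytes_per_image:.2f} bytes of weights per training image")

# Compare with one uncompressed 512x512 RGB image:
image_bytes = 512 * 512 * 3
print(f"vs {image_bytes:,} bytes for a single uncompressed 512x512 image")
```

Under these assumptions the model retains less than one byte per image seen, roughly a million times smaller than a single uncompressed training image, which is why verbatim storage is implausible (though this doesn't by itself rule out memorization of frequently duplicated images).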
[go to top]