zlacker

[return to "AI companies have all kinds of arguments against paying for copyrighted content"]
1. andy99+gf[view] [source] 2023-11-05 18:17:33
>>rntn+(OP)
Copyright holders make all kinds of arguments for why they should get money for incidental exposure to their work. This is all about greed and jealousy. If someone uses AI to make infringing content, existing laws already cover that. The fact that an ML model could be used to generate infringing content, and has exposure to or "knowledge" of some copyrighted material, is immaterial. People just see someone else making money and want to try and get a piece of it.
◧◩
2. rvz+rh[view] [source] 2023-11-05 18:28:56
>>andy99+gf
All I see is AI companies poorly justifying their grift: they know they don't want to pay for the content they're commercializing without permission, so they pull out the fair use excuses.

It is no wonder that OpenAI had to pay Shutterstock for training on their data, that Getty is suing Stability AI for training on their watermarked images and using them commercially without permission, and that actors and actresses are filing lawsuits against commercial voice cloners, lawsuits which cost them close to nothing since those companies either take down the cloned voice offering or shut down.

These weak arguments from the AI folks sound like excuses justifying a newfound grift.

◧◩◪
3. Tadpol+sj[view] [source] 2023-11-05 18:39:01
>>rvz+rh
When you view everyone with a different opinion from yours as a grifter, corporate rat, or some other malicious entity, you've killed any ability or desire people have to engage with you. You won't be convinced, and you're already being uncivil and fallacious.

AI outputs should be regulated, of course. Obviously impersonation and copyright laws already apply to AI systems. But a discussion about training inputs is entirely novel to man and our laws, and it's a very nuanced and important topic. It only gets harder as AI advances, because the distinction between "organic" learning and "artificial" learning keeps shrinking, and because stopping AI from learning from, say, research papers could mean we miss out on life-saving medication. Where do property rights conflict with human rights?

They're important conversations to have, but you've destroyed the opportunity to have them from the starting gun.

◧◩◪◨
4. Turing+pn[view] [source] 2023-11-05 18:59:04
>>Tadpol+sj
> AI outputs should be regulated, of course.

Why "of course"?

[go to top]