zlacker

[parent] [thread] 15 comments
1. iandan+(OP)[view] [source] 2023-12-27 15:06:58
Arguing whether it can is not a useful discussion. You can absolutely train a net to memorize and recite text. As these models get more powerful they will memorize more text. The critical thing is how hard it is to make them recite copyrighted works; the question is whether the developers put reasonable guardrails in place to prevent it.
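
To illustrate the memorization point with a toy sketch of my own (nothing to do with how production models are actually trained): a few dozen lines of PyTorch will overfit a tiny character-level model on a single sentence until greedy decoding recites it verbatim.

    # Minimal sketch: overfit a tiny char-level model until it recites its training text.
    import torch
    import torch.nn as nn

    text = "The quick brown fox jumps over the lazy dog."
    chars = sorted(set(text))
    stoi = {c: i for i, c in enumerate(chars)}
    itos = {i: c for c, i in stoi.items()}

    # Inputs are all characters but the last; targets are shifted by one.
    x = torch.tensor([stoi[c] for c in text[:-1]])
    y = torch.tensor([stoi[c] for c in text[1:]])

    class TinyLM(nn.Module):
        def __init__(self, vocab, dim=32):
            super().__init__()
            self.emb = nn.Embedding(vocab, dim)
            self.rnn = nn.GRU(dim, dim, batch_first=True)
            self.out = nn.Linear(dim, vocab)
        def forward(self, idx, h=None):
            o, h = self.rnn(self.emb(idx), h)
            return self.out(o), h

    model = TinyLM(len(chars))
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)

    # Train long enough that the model memorizes the sequence exactly.
    for step in range(500):
        logits, _ = model(x.unsqueeze(0))
        loss = nn.functional.cross_entropy(logits.squeeze(0), y)
        opt.zero_grad(); loss.backward(); opt.step()

    # Greedy decoding from the first character reproduces the training text.
    idx = torch.tensor([[stoi[text[0]]]])
    out, h = text[0], None
    for _ in range(len(text) - 1):
        logits, h = model(idx, h)
        idx = logits[:, -1].argmax(dim=-1, keepdim=True)
        out += itos[idx.item()]
    print(out)  # with enough steps this matches `text` exactly

Scale the parameter count and data up by many orders of magnitude and the same dynamic is why guardrails, not capability, become the interesting question.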

If a person with a very good memory reads an article, they only violate copyright if they write it out and share it, or perform the work publicly. If they have a reasonable understanding of the law they won't do so. However, a malicious person could absolutely trick or force them to produce the copyrighted work. The blame in that case is not on the person who read and recited the article but on the person who tricked them.

That distinction is one we're going to have to codify all over again for AI.

replies(3): >>cudgy+s6 >>JW_000+Rb >>noober+Uo2
2. cudgy+s6[view] [source] 2023-12-27 15:41:51
>>iandan+(OP)
> Critically the question is, did the developers put reasonable guardrails in place to prevent it?

Why? If I steal a bunch of unique works of art and store them in my house for only me to see, am I still committing a crime?

replies(3): >>danthe+07 >>Dalewy+27 >>solard+f9
◧◩
3. danthe+07[view] [source] [discussion] 2023-12-27 15:45:03
>>cudgy+s6
violating copyright is not stealing - it's infringement of a government-granted monopoly...
replies(2): >>Shrezz+T8 >>jrajav+V9
◧◩
4. Dalewy+27[view] [source] [discussion] 2023-12-27 15:45:09
>>cudgy+s6
Yes, but policing affairs inside the home has always been impractical at the best of times.

Of course, OpenAI and most other "AI" aren't affairs "inside the home"; they are affairs publicly demonstrated far and wide.

replies(1): >>lances+bi
◧◩◪
5. Shrezz+T8[view] [source] [discussion] 2023-12-27 15:57:51
>>danthe+07
Taking an original "one of one" piece from a museum without permission and hanging it up in your living room isn't exactly copyright infringement though, is it?
◧◩
6. solard+f9[view] [source] [discussion] 2023-12-27 15:59:26
>>cudgy+s6
Yes... because you're stealing?

But if you simply copied the unique works and stored them, nobody would care. If you then tried to turn around and sell the copies, well, the artist is probably dead anyway and the art is probably public domain, but if not, then yeah it'd be copyright infringement.

If you only copied tiny parts of the art though, then fair use examinations in a court might come into play. It just depends on whether they decide to sue you, like NYT did in this case, while millions of others did not (or just didn't have the resources to).

replies(1): >>asylte+uE
◧◩◪
7. jrajav+V9[view] [source] [discussion] 2023-12-27 16:04:01
>>danthe+07
This is a little ridiculous. There are flaws with copyright law but making money from creative work would be even less viable than it is now if there were no disincentives at all to blatant plagiarism and repackaging right after initial creation.
8. JW_000+Rb[view] [source] 2023-12-27 16:15:28
>>iandan+(OP)
> If a person with a very good memory reads an article, they only violate copyright if they write it out and share it, or perform the work publicly. If they have a reasonable understanding of the law they won't do so. However a malicious person could absolutely trick or force them to produce the copyrighted work. The blame in that case however is not on the person who read and recited the article but on the person who tricked them.

Is that really true? Also, what if the second person is not malicious? In the example of ChatGPT, the user may accidentally write a prompt that causes the model to recite copyrighted text. I don't think a judge will look at this through the same lens as you do.

◧◩◪
9. lances+bi[view] [source] [discussion] 2023-12-27 16:47:30
>>Dalewy+27
Not only is it not inside the home, they're also charging money for it.
◧◩◪
10. asylte+uE[view] [source] [discussion] 2023-12-27 18:51:05
>>solard+f9
Yes and OpenAI sells its copies as a subscription, so that’s at least copyright infringement if not theft.
replies(1): >>anigbr+qK
◧◩◪◨
11. anigbr+qK[view] [source] [discussion] 2023-12-27 19:25:01
>>asylte+uE
They're not copies, no matter how much you want them to be.
replies(1): >>asylte+Pv1
◧◩◪◨⬒
12. asylte+Pv1[view] [source] [discussion] 2023-12-28 00:13:48
>>anigbr+qK
They are copied. If I can say something like "make a picture of xyz in the style of Greg Rutkowski" and it does so, then it's a copy. It's not analogous to a human because a human cannot reproduce things like a machine can. And if someone did copy someone's artwork and try to sell it, then yes, that would be theft. The logic doesn't change just because it's a machine doing it.
replies(1): >>anigbr+XM1
◧◩◪◨⬒⬓
13. anigbr+XM1[view] [source] [discussion] 2023-12-28 03:05:45
>>asylte+Pv1
Repeating what you want to be true doesn't make it so, in either technology or law.
14. noober+Uo2[view] [source] 2023-12-28 10:04:57
>>iandan+(OP)
I hate to do this, but this then becomes an "only bad people with a gun kill people" argument. Even all but the most ardent gun rights advocates think those rights shouldn't be extended to very powerful weapons like bombs or nuclear weapons. In this situation the logic would be, "sure, this item allows a person to kill thousands or millions of people, but really the only person at fault is the one who presses the button." This ignores the harm done and focuses only on who gets the fault, as if all discourse on law were about determining who is the bad guy or the good guy in a movie script.

The general prescription (which I agree not everyone accepts) society has come up with is that we delegate control of some of these weapons to governments and outright ban others (chemical weapons, biological weapons, and such) through treaties. If LLMs can cause so much damage and can be abused so widely, you have to stop focusing on whether a given user is culpable and start considering whether their wide use should be controlled at all.

replies(1): >>iandan+Bh4
◧◩
15. iandan+Bh4[view] [source] [discussion] 2023-12-28 21:48:23
>>noober+Uo2
This is a lawsuit, not a call for regulatory action. They are claiming there are guilty parties under existing law. Culpability is the point.
replies(1): >>noober+lD4
◧◩◪
16. noober+lD4[view] [source] [discussion] 2023-12-29 00:34:04
>>iandan+Bh4
No, you're right. The reply I made concerns the logic itself, especially if this justification is used to ward off regulation in the future. For the suit in question, culpability is in fact central.