zlacker

[return to "A federal judge sides with Anthropic in lawsuit over training AI on books"]
1. 3PS+V1[view] [source] 2025-06-24 16:32:07
>>moose4+(OP)
Broadly summarizing.

This is OK and fair use: Training LLMs on copyrighted work, since it's transformative.

This is not OK and not fair use: pirating data, or creating a big repository of pirated data that isn't necessarily for AI training.

Overall seems like a pretty reasonable ruling?

◧◩
2. almata+K9[view] [source] 2025-06-24 17:14:35
>>3PS+V1
If a publisher adds a "no AI training" clause to their contracts, does this ruling render it invalid?
◧◩◪
3. heavys+cb[view] [source] 2025-06-24 17:23:22
>>almata+K9
Fair use overrides licensing
◧◩◪◨
4. AlanYx+Vc[view] [source] 2025-06-24 17:34:26
>>heavys+cb
Fair use "overrides" licensing in the sense that one doesn't need a copyright license if fair use applies. But fair use itself isn't a shield against breach of contract. If you sign a license contract saying you won't train on the thing you've licensed, the licensor still has remedies for breach of contract, just not remedies for copyright infringement (assuming the act is fair use).
◧◩◪◨⬒
5. protoc+k94[view] [source] 2025-06-26 02:47:32
>>AlanYx+Vc
I am not going to sign a contract at the bookstore. Anyone who tries to get me to sign a contract at the bookstore is just going to lose book sales. IIRC the case involved Anthropic literally feeding physical books into scanners. Your proposed solution sounds like it's just going to make books worse, not AI better.
[go to top]