zlacker

[return to "A federal judge sides with Anthropic in lawsuit over training AI on books"]
1. 3PS+V1[view] [source] 2025-06-24 16:32:07
>>moose4+(OP)
Broadly summarizing.

This is OK and fair use: Training LLMs on copyrighted work, since it's transformative.

This is not OK and not fair use: pirating data, or creating a big repository of pirated data that isn't necessarily for AI training.

Overall seems like a pretty reasonable ruling?

2. derbOa+H6[view] [source] 2025-06-24 16:56:10
>>3PS+V1
But those training the LLMs are still using the works, and not just to discuss them, which I think is the point of fair use doctrine. I guess I fail to see how it's any different from me using it in some other way? If I wanted to write a play very loosely inspired by Blood Meridian, it might be transformative, but that doesn't justify me pirating the book.

I tend to think copyright should be extremely limited compared to what it is now, but to me the logic of this ruling amounts to "it's OK for a corporation to use lots of works without permission, but not for an individual to use a single work without permission." Maybe if they suddenly loosened copyright enforcement for everyone I'd feel differently.

"Kill one man, and you are a murderer. Kill millions of men, and you are a conqueror." (An admittedly hyperbolic comparison, but similar idea.)

3. protoc+594[view] [source] 2025-06-26 02:43:18
>>derbOa+H6
If I buy a book and use it to prop up the table on which I build a door, I don't owe the author any additional money over what I paid for it.

If I buy a book and it teaches me to build something, then as long as that product isn't a competing book, the original author should have no avenue for complaint.

People are really getting hung up on the computer reading the data and computing other data with it. It shouldn't even need to get to fair use. It's so obviously none of the author's business well before fair use.
