zlacker
3 comments
1. 1970-0+(OP)
2025-07-07 15:54:04
The buried lede here is that Anthropic will need to attempt to explain to a judge that it is impossible to de-train 7M books from their models.
replies(3):
>>nickps+Wg
>>protoc+1Y
>>rangun+xJ1
2. nickps+Wg
2025-07-07 17:32:44
>>1970-0+(OP)
I'm hoping they fail, which would incentivize using legal, open, and/or licensed data. Then they might have to attempt to train a Claude-class model on legal data. Then I'll have a great, legal model to use. :)
3. protoc+1Y
2025-07-07 22:47:54
>>1970-0+(OP)
Or they could be forced to settle on a price for access to the books.
4. rangun+xJ1
2025-07-08 08:59:26
>>1970-0+(OP)
How come? They just need to delete the model and train a new one without those books.