> Alsup ruled that Anthropic's use of copyrighted books to train its AI models was "exceedingly transformative" and qualified as fair use
> "All Anthropic did was replace the print copies it had purchased for its central library with more convenient space-saving and searchable digital copies for its central library — without adding new copies, creating new works, or redistributing existing copies"
It was always somewhat obvious that pirating a library would be copyright infringement. The interesting findings here are that scanning and digitizing a library for internal use is OK, and using it to train models is fair use.
> But Alsup drew a firm line when it came to piracy.
> "Anthropic had no entitlement to use pirated copies for its central library," Alsup wrote. "Creating a permanent, general-purpose library was not itself a fair use excusing Anthropic's piracy."
That is, he ruled that
- buying print books, physically cutting them up, scanning them, and using them for training is fair use
- pirating books to build a digital library is not fair use.
So Suno would only really need to buy the physical albums and rip them to be able to generate music at an industrial scale?
If the output from said model uses the voice of another person, for example, we already have a legal framework in place for determining if it is infringing on their rights, independent of AI.
Courts have heard cases of individual artists copying melodies, because melodies themselves are copyrightable: https://www.hypebot.com/hypebot/2020/02/every-possible-melod...
Copyright law is a lot more nuanced than anyone seems to have the attention span for.
But Suno is definitely not training models in their basement for fun.
They are a private company selling music, using music made by humans to train models meant to replace human musicians and artists.
We'll see what the courts say, but that doesn't sound like fair use.
The law doesn't distinguish between basement and cloud – it's a service. You can sell access to the service without selling songs to consumers.
The problem is, copyright law wasn't written for machines. It was written for humans who create things.
In the case of songs (or books, paintings, etc.), only humans and companies can legally own copyright; a machine can't. If an AI-powered tool generates a song, there's no author in the legal sense, unless the person using the tool claims authorship by saying they operated the tool.
So we're stuck in a grey zone: the input is human, the output is AI-generated, and the law doesn't know what to do with that.
For me, the real debate is: Do we need new rules for non-human creation?