zlacker

[parent] [thread] 12 comments
1. martin+(OP)[view] [source] 2025-06-24 20:28:41
Copyright was codified in an age where plagiarism was time consuming. Even replacing words with synonyms on a mass scale was technically infeasible.

The goal of copyright is to make sure people can get fair compensation for the amount of work they put in. LLMs automate plagiarism on a previously unfathomable scale.

If humans spend a trillion hours writing books, articles, blog posts and code, then somebody (a small group of people) comes and spends a million hours building a machine that ingests all the previous work and produces output based on it, who should get the reward for the work put in?

The original authors together spent a million times more effort (normalized for skill) and should therefore get a million times bigger reward than those who build the machine.

In other words, if the small group sells access to the product of the combined effort, they only deserve a millionth of the income.
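A back-of-the-envelope version of that split, assuming income is divided strictly in proportion to hours worked (a sketch of the argument's arithmetic, not a claim about actual licensing math):

```python
# Effort-proportional split for the hypothetical numbers above:
# a trillion hours of authorship vs. a million hours building the machine.
authors_hours = 10**12   # combined hours spent writing books, articles, code
builders_hours = 10**6   # hours spent building the machine

total_hours = authors_hours + builders_hours
builders_share = builders_hours / total_hours

# The builders' share of the income comes out to roughly one millionth.
print(f"builders' share: {builders_share:.2e}")
```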

---

If "AI" is as transformative as they claim, they will have no trouble making so much money that they can fairly compensate the original authors while still earning a decent profit. But if it's not, then it's just an overpriced plagiarism automator, and their reluctance to acknowledge they are making money on top of everyone else's work is telling.

replies(2): >>bonobo+d5 >>msabal+B9
2. bonobo+d5[view] [source] 2025-06-24 21:00:10
>>martin+(OP)
> get fair compensation for the amount of work

This is a bit distorted. A better summary: the primary purpose of copyright is to induce and reward authors to create new works and to make those works available to the public to enjoy.

The ultimate purpose is to foster the creation of new works so that the public can read them and written culture can thrive. The means to achieve this is by ensuring that the authors of said works get financial incentives for writing.

The two are not in opposition but it's good to be clear about it. The main beneficiary is intended to be the public, not the writers' guild.

Therefore, when some new factor such as LLMs enters the picture, we have to step back and see how the intent to benefit the reading public can be pursued in the new situation. That certainly has to take into account who will produce new written works, and how, but that is not the main target; it can be an instrumental subgoal.

replies(1): >>martin+J9
3. msabal+B9[view] [source] 2025-06-24 21:28:49
>>martin+(OP)
Copyright's goal, at least under the Constitution under which this court is ruling, is to "promote the progress of science and the useful arts", not to ensure that authors get paid for anything that strikes their whim.

LLMs are models of languages, which are models of reality. If anyone deserves compensation, it's humanity as a whole, for example by nationalizing LLMs (or whatever the global equivalent of nationalization is).

Approximately none of the value of LLMs, for any user, is in recreating the text written by an author. Authors have only ever been entitled to (limited) ownership of their expression; copyright has never given them ownership of facts.

◧◩
4. martin+J9[view] [source] [discussion] 2025-06-24 21:29:54
>>bonobo+d5
As you point out, people make rules ("laws") which benefit them. I care about fairness and justice though, even if I am in the minority.

Fundamentally, fair compensation is based on the amount of work put in (obviously taking skill/competence into account, but the differences between people in most disciplines probably don't span a single order of magnitude, let alone several).

The ultimate goal should be to prevent people who don't produce value from taking advantage of those who do. And among those who do, that they get compensated according to the amount of work and skill they put in.

Imagine you spend a year building a house. I have a machine that can take your house and materialize a copy anywhere on earth for free. I charge people (something between 0 and the cost of building your house the normal way) to make them a copy of your house. I can make orders of magnitude more money this way than you. Are you happy about this situation? Does it make a difference how much I charge them?

What if my machine only works if I scan every house on the planet? What if I literally take pictures of it from all sides, then wait for you to not be home and X-ray it to see what it looks like inside?

You might say that you don't care because now you can also afford many more houses. But it does not make you richer. In fact, it makes you poorer.

Money is not a store of value. If everyone has more money, but most people only have 2x more while a small group has 1000x more, then the relative bargaining power has shifted: the small group is better off and the large group is worse off. This is what undetectable cheap mass plagiarism leads to for all intellectual work.

---

I wrote a lot of open source code, some of it under permissive licenses, some GPL, some AGPL. The conditions of those licenses are that you credit me. Some of them also require that if you build on top of my work, you release your work under the same license.

LLMs launder my code to make profit off of it without giving me anything (while other people make profit, thus making me poorer) and without crediting me.

LLMs also take away the rights of the users of my code: the (A)GPL forces anyone who builds on top of my work to release the code when asked, but with LLM-laundered code this right effectively no longer exists, because who do you even ask?

replies(3): >>jay_ky+gd >>amenho+5e >>bonobo+Ke
◧◩◪
5. jay_ky+gd[view] [source] [discussion] 2025-06-24 21:57:32
>>martin+J9
>Fundamentally, fair compensation is based on the amount of work put in.

I think there is a problem with your initial position. Nobody is entitled to compensation for simply working on something. You have to work on things that people need or want. There is no such thing as "fair compensation".

It is "unfair" to take the work of somebody else and sell it as your own. (I don't think the LLMs are doing this.)

replies(1): >>martin+kF2
◧◩◪
6. amenho+5e[view] [source] [discussion] 2025-06-24 22:04:11
>>martin+J9
The "you wouldn't download a car" argument made with a straight face. Remarkable.
◧◩◪
7. bonobo+Ke[view] [source] [discussion] 2025-06-24 22:10:07
>>martin+J9
I understand your sense of justice in cheering on David against Goliath. But the equation is not so clear. The common person is sometimes on this side, sometimes on that side. Copyright can also be weaponized by megacorps against normal people (copying Disney movie DVDs) and LLMs can also be in the hands of the decentralized public (llama ecosystem).

The house analogy is a bit off-topic because, for copyright purposes, only a house's artistic, architectural expression matters. If you want to protect the ingenuity of the technical ways it's constructed, that's a patent-law thing. It also muddies the water by bringing in the privacy of one's home, making us imagine paparazzi-style photo shoots and sneaky X-rays.

The thing is, houses can't be copied like bits and bytes. I would copy a car if I could. If you could copy a loaf of bread for free, it would be a moral imperative to do so, whatever the baker might think about it.

> fair compensation is based on the amount of work put in

This is the labor theory of value, and it has many known problems, for example that the amount of work put in can be disconnected from the amount of value it provides to someone. Pricing via supply-and-demand market forces has produced much better outcomes across the globe than any other type of allocation, moderated by taxes and so on, of course.

But overall the question is whether LLMs create value for the public. Do they foster the prosperity of society? If yes, laws should be such that LLMs can digest more books rather than fewer. If LLMs are good, they should not be restricted to training on copyright-expired writings.

replies(1): >>ethbr1+6q
◧◩◪◨
8. ethbr1+6q[view] [source] [discussion] 2025-06-24 23:50:46
>>bonobo+Ke
The "fairness" argument is weaker than the "sustainable creation" one.

If LLMs could create quality literature, or social media create in-depth reporting, then I'd have no problem with the tide of technological progress flowing.

Unfortunately, recent history has shown that it's trivial for the market to cannibalize the financial model of creators without replacing it.

And as a result, society gets {no more of that thing} + {watered down, shitty version}.

Which isn't great.

So I'd love to hear an argument from the 'fuck copyright, let's go AI' crowd (not the position you seem to be espousing) on what year +10 of rampant AI ingestion of copyrighted works looks like...

replies(1): >>bonobo+Nu
◧◩◪◨⬒
9. bonobo+Nu[view] [source] [discussion] 2025-06-25 00:38:37
>>ethbr1+6q
I guess the optimistic take would be that we will get novel, insightful syntheses of disparate fields of knowledge that no human was ever able to hold in mind at once to contemplate their interrelations, and that this will elevate the human spirit, etc. It's the equivalent of the take that the Internet would bring peoples together and foster better understanding and love between people who so far were not in dialogue, and that this would bring peace and an understanding of how similar we all are, etc. Not exactly how it played out in the end, though. Or the take that social media and web 2.0 would bring enhanced democracy, transparency, and clarity, and that the common person would have a voice, and so on.

So I'm not exactly naive, but we should then discuss this instead of the red herring of copyright.

replies(1): >>ethbr1+1t1
◧◩◪◨⬒⬓
10. ethbr1+1t1[view] [source] [discussion] 2025-06-25 12:05:40
>>bonobo+Nu
I suppose another steelman would be that LLMs substantially decrease the cost of human creation (i.e. the HITL assistant use case) while producing output of equivalent quality.

As a result of this, everything gets cheaper and more plentiful.

The counterargument I'd make to that is that it still requires the human to have creative skills, which might atrophy in the absence of business models that support a career in creating.

replies(1): >>bonobo+h02
◧◩◪◨⬒⬓⬔
11. bonobo+h02[view] [source] [discussion] 2025-06-25 15:24:20
>>ethbr1+1t1
Generally, having cheap mass-produced things can be great compared to only expensive artisanal stuff that only the rich can afford. Think about furniture, clothes, or all the other stuff you have in the house, compared to 100-150 years ago. Today we can buy pretty good mass-produced furniture, for example; a few generations ago people either built it themselves in a wonky way or paid a lot of money for a handmade carpentry option.

It's the same with LLMs. They probably do a better job at general writing than a random person off the street. The output is not as good as the top performers', but it's much cheaper. It's a tradeoff.
replies(1): >>ethbr1+fi2
◧◩◪◨⬒⬓⬔⧯
12. ethbr1+fi2[view] [source] [discussion] 2025-06-25 16:59:20
>>bonobo+h02
The difficulty is that the biggest gains there are for singular goods which can't be copied at low cost.

Exquisitely designed piece of furniture = expensive copy

Well-written book = cheap copy, post-printing press

So we're not necessarily going to get "more access to better" (because we already had that), but just "cheaper".

Whether that hollows out entire markets or only cannibalizes the bottom of the market (low quality/cheap) remains to be seen.

I wouldn't want to be writing pulp/romance novels these days...

◧◩◪◨
13. martin+kF2[view] [source] [discussion] 2025-06-25 19:16:55
>>jay_ky+gd
Yes, I meant when working on the same thing (which has a specific value as a whole).

If the LLM and its output are based on 10^12 hours of work, out of which 10^6 is working on the code of the LLM itself and 10^12-10^6 (so roughly still 10^12) is working on the training data, does it make sense for only those working on the 10^6 to be compensated for the work?

[go to top]