zlacker

[parent] [thread] 5 comments
1. pm215+(OP)[view] [source] 2025-05-21 12:17:45
That's funny, but also interesting that it didn't "sign" it. I would naively have expected that being handed a clear instruction like "reply with the following information" would strongly bias the LLM to reply as requested. I wonder if they've special-cased that kind of thing in the prompt, or perhaps my intuition is just wrong here?
replies(2): >>Quarre+P2 >>Bedon2+P4
2. Quarre+P2[view] [source] 2025-05-21 12:39:34
>>pm215+(OP)
An AI can't, as I understand it, hold copyright over anything it produces.

Nor can it act as a legal entity capable of signing anything.

I assume the "not copyrightable" issue doesn't in any way interfere with the rights the CLA is trying to protect, but IANAL...

I assume they've explicitly told it not to sign things (perhaps because they don't want a sniff of their bot agreeing to things on behalf of MSFT).

replies(1): >>candid+x3
3. candid+x3[view] [source] [discussion] 2025-05-21 12:45:20
>>Quarre+P2
Are LLM contributions effectively in the public domain?
replies(2): >>ben-sc+n9 >>Quarre+go
4. Bedon2+P4[view] [source] 2025-05-21 12:56:42
>>pm215+(OP)
A comment on one of the threads, where a random person tried to get Copilot to change something, said that Copilot will not respond to anyone without write access to the repo. I would assume that bot doesn't have write access, so Copilot just ignores it.
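
A minimal sketch of how that gating could work, assuming the bot checks the commenter's permission through GitHub's collaborator-permission endpoint before replying; the function name and flow here are my guesses, not Copilot's actual implementation:

    # Hypothetical sketch: ignore comments from users without write access.
    # Assumes a token with repo read scope; not Copilot's real logic.
    import requests

    def has_write_access(owner: str, repo: str, username: str, token: str) -> bool:
        """Check the commenter's repo permission via the GitHub REST API."""
        url = (f"https://api.github.com/repos/{owner}/{repo}"
               f"/collaborators/{username}/permission")
        resp = requests.get(url, headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        })
        if resp.status_code == 404:
            return False  # not a collaborator at all
        resp.raise_for_status()
        # "permission" is one of: admin, write, read, none
        return resp.json().get("permission") in ("admin", "write")

    # A bot's comment handler would then simply bail out early:
    # if not has_write_access(owner, repo, comment_author, token):
    #     return  # ignore the comment, as observed in the thread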
5. ben-sc+n9[view] [source] [discussion] 2025-05-21 13:30:22
>>candid+x3
IANAL. It's my understanding that this hasn't been determined yet. It could be in the public domain, under the rights of everyone whose creations were used to train the AI, or anywhere in between.

We do know that LLMs will happily reproduce something from their training set, and that is a clear copyright violation. So it can't be that everything they produce is public domain.

6. Quarre+go[view] [source] [discussion] 2025-05-21 14:58:26
>>candid+x3
That's my understanding too, at least under US law.

I can't remember the specific case now, but it has been ruled in the past that copyright requires human authorship, and there was a recent case involving LLMs that confirmed this.
