zlacker

[parent] [thread] 19 comments
1. squigz+(OP)[view] [source] 2024-05-23 06:12:55
> for example, if a human can read one book to learn something, then it’s ok to scan millions of books into a computer system because it’s just another kind of learning.

Since this comes up all the time, I ask: What exactly is the number of books a human can ingest before it becomes illegal?

replies(4): >>Seattl+J >>dorkwo+c7 >>numpad+89 >>yoyohe+G41
2. Seattl+J[view] [source] 2024-05-23 06:19:03
>>squigz+(OP)
For a human? Whatever they can consume within their natural life.
replies(1): >>gitgud+E8
3. dorkwo+c7[view] [source] 2024-05-23 07:13:50
>>squigz+(OP)
This is a bit like someone saying they don't want cars traveling down the sidewalk because they're too big and heavy, and then having someone ask how big and heavy a person needs to get before it becomes illegal for them to travel down the sidewalk.

It misses the point, which is that cars aren't people. Arguments like "well a car uses friction to travel along the ground and fuel to create kinetic energy, just like humans do", aren't convincing to me. An algorithm is not a human, and we should stop pretending the same rules apply to each.

replies(4): >>pests+D8 >>Dylan1+la >>TeMPOr+Fa >>mike_h+Ja
4. pests+D8[view] [source] [discussion] 2024-05-23 07:26:08
>>dorkwo+c7
Thank you, love this response.
5. gitgud+E8[view] [source] [discussion] 2024-05-23 07:26:08
>>Seattl+J
Does natural life still count if a person is using an artificial heart?

What if they have augmentations that allow them to read and interpret books really fast?

It’s not an easy question to answer…

replies(2): >>komboo+Ua >>Dylan1+zb
6. numpad+89[view] [source] 2024-05-23 07:30:32
>>squigz+(OP)
Depends on the similarity between the existing data and the generative outputs, so the minimum is zero. Humans are caught plagiarizing all the time.
replies(1): >>TeMPOr+ia
7. TeMPOr+ia[view] [source] [discussion] 2024-05-23 07:40:04
>>numpad+89
Plagiarism is not illegal, it's merely frowned upon, and only in specific fields. Everywhere else, it's called learning from masters and/or practicing your art.
replies(1): >>numpad+3b
8. Dylan1+la[view] [source] [discussion] 2024-05-23 07:40:12
>>dorkwo+c7
It's easy to explain the difference between a person and a car in a way that's both specific and relevant to the rules.

If we're at an analogy to "cars aren't people", then it sounds like it doesn't matter how many books the AI reads; even one book would cause problems.

But if that's the case, why make the argument about how many books it reads?

Are you sure you're arguing the same thing as the ancestor post? Or do you merely agree with their conclusion but you're making an entirely different argument?

9. TeMPOr+Fa[view] [source] [discussion] 2024-05-23 07:42:34
>>dorkwo+c7
Then again, bicycles are neither people nor cars, and yet they lay claim to both the sidewalk and the road, and they're a danger and a nuisance on both.
10. mike_h+Ja[view] [source] [discussion] 2024-05-23 07:42:47
>>dorkwo+c7
Is that a good example? People have been arguing in court about that exact thing for years, first due to the Segway and then due to e-scooters and bikes. There are plenty of people who make arguments of the form "it's not a car or a bike, so I'm allowed on the sidewalk", or arguments about limited top speeds, etc.
replies(2): >>dorkwo+re >>CRConr+5D3
11. komboo+Ua[view] [source] [discussion] 2024-05-23 07:43:59
>>gitgud+E8
"But what if a person was so thoroughly replaced with robot parts to be just like a computer" is just "if my grandma had wheels, she would be a truck, therefore it's not so easy to say that cars aren't allowed to drive inside the old folks home".

People and software are different things, and it makes total sense that there should be different rules for what they can and cannot do.

12. numpad+3b[view] [source] [discussion] 2024-05-23 07:45:09
>>TeMPOr+ia
wtf.
replies(2): >>Captai+Tm >>throwa+RA
13. Dylan1+zb[view] [source] [discussion] 2024-05-23 07:48:05
>>gitgud+E8
Your first question doesn't change the answer, and your second question depends on a premise that isn't real.

Natural life is plenty simple in this context.

14. dorkwo+re[view] [source] [discussion] 2024-05-23 08:12:52
>>mike_h+Ja
> first due to Segway and then due to e-scooters and bikes

Those aren't cars.

But you've identified that the closer something comes to a human in terms of speed and scale, the blurrier the lines become. In these terms I would argue that GPT-4 is far, far removed from a human.

replies(1): >>numpad+ep
15. Captai+Tm[view] [source] [discussion] 2024-05-23 09:27:46
>>numpad+3b
It's true.
16. numpad+ep[view] [source] [discussion] 2024-05-23 09:46:47
>>dorkwo+re
Legally they're sometimes vehicles, and sometimes technically not supposed to be driven on sidewalks. Perhaps that's the Segway equivalent of fair-use scientific research on crawled web data.
replies(1): >>antice+ay7
17. throwa+RA[view] [source] [discussion] 2024-05-23 11:27:58
>>numpad+3b
Learning is just a conditioned response to inputs.

You were conditioned to give that response.

If I ask an AI about the book Walden Two, for example, it can reproduce and/or remix that. Knowing is copying.

[Why Walden Two? BF Skinner. And an excellent book about how the book was lived: https://www.amazon.com/Living-Walden-Two-Behaviorist-Experim... ]

18. yoyohe+G41[view] [source] 2024-05-23 14:29:48
>>squigz+(OP)
100 per second.
19. CRConr+5D3[view] [source] [discussion] 2024-05-24 11:24:46
>>mike_h+Ja
> Is that a good example?

Yes. It is pertinent not only to this particular instance (or instances, plural: AI copyright violations and scooters on sidewalks), but it also illustrates why treating corporations as "people" in freedom-of-speech law is misguided (and stupid, corrupt, and just fucking wrong). So it is a very good example.

20. antice+ay7[view] [source] [discussion] 2024-05-26 06:04:48
>>numpad+ep
The research exception is an explicit statutory exception to copyright, not a fair-use case.