I read a lot of C&D letters from celebrities here and on Reddit, and many of them take the form of "I am important, so I am requesting that you not take advantage of your legal rights." I am not a fan. (If you don't want someone to track how often you fly your private jet, buy a new one for each trip. That is the legal option available to you. But I digress...)
Is there a name for this AI fallacy? The one where programmers make an inductive leap such as: if a human can read one book to learn something, then it's fine to scan millions of books into a computer system, because it's just another kind of learning.
Since this comes up all the time, I ask: What exactly is the number of books a human can ingest before it becomes illegal?
It misses the point, which is that cars aren't people. Arguments like "well, a car uses friction to travel along the ground and fuel to create kinetic energy, just like humans do" aren't convincing to me. An algorithm is not a human, and we should stop pretending the same rules apply to both.
If we're going with the "cars aren't people" analogy, then it sounds like it doesn't matter how many books the AI reads; even one book would cause problems.
But if that's the case, why make the argument about how many books it reads?
Are you sure you're arguing the same thing as the ancestor post? Or do you merely agree with its conclusion while making an entirely different argument?