I read a lot of C&D letters from celebrities here and on Reddit, and a lot of them are in the form of "I am important so I am requesting that you do not take advantage of your legal rights." I am not a fan. (If you don't want someone to track how often you fly your private jet, buy a new one for each trip. That is the legal option that is available to you. But I digress...)
Is there a name for this AI fallacy? The one where programmers make an inductive leap like: if a human can read one book to learn something, then it's fine to scan millions of books into a computer system, because that's just another kind of learning.
Since this comes up all the time, I ask: What exactly is the number of books a human can ingest before it becomes illegal?
It misses the point, which is that cars aren't people. Arguments like "well, a car uses friction to travel along the ground and fuel to create kinetic energy, just like humans do" aren't convincing to me. An algorithm is not a human, and we should stop pretending the same rules apply to both.
Those aren't cars.
But you've identified the real issue: the closer something comes to a human in terms of speed and scale, the blurrier the lines become. By that measure, I would argue that GPT-4 is far, far removed from a human.