I'm glad I'm not a lawyer or politician trying to sort this out. If AI gets commercially crippled, I really don't want to live in a world of black market training data.
It never has before. Why now?
Someone (more commonly, some group) invents Impressionism, or Art Deco, or heavy metal, or gangsta rap, or acid-washed jeans, or buzz cuts, and pretty soon there are dozens or hundreds of other people creating works in that style, none of whom are paying the originators a cent.
It's hard to find a foothold. Human output doesn't have this restriction. Further, it feels like regulating solar power so coal miners can keep their jobs.
Just banning it or regulating output may seem like a solution to some, but all that means is that we'll cripple ourselves so that other, more technologically progressive economies can sprint past us in the affected markets. That neither saves the jobs in the end, and it ultimately hurts more people than it protects.
But we do desperately need to sort out how this is going to devastate entire labor markets before it causes major economic upheaval with no safety nets in place.
Human artists already do this, extensively. We handle it by making their output the part of the process that holds the relevant copyright protections. I can sell Picasso-inspired pieces all day long as long as I don’t sell them as “Picasso.”
If I faithfully reproduced “The Old Guitarist”[1] and attempted to sell it as the original, or even as a version from which to copy and sell prints, I’d be open to legal claims and action. Rightfully so.
I personally haven’t heard a convincing argument as to why ML training should be handled as if it’s the output of the process rather than the input that it is. I’m open to being swayed and adjusting my worldview, so I keep looking for counterpoints.