zlacker

[return to "OpenAI didn’t copy Scarlett Johansson’s voice for ChatGPT, records show"]
1. jrockw+wL[view] [source] 2024-05-23 06:00:48
>>richar+(OP)
I was perusing some Simpsons clips this afternoon and came across a story to the effect of "So and so didn't want to play himself, so Dan Castellaneta did the voice." It's a good impression and people didn't seem very upset about that. I am not sure how this is different. (Apparently this particular "impression" predates the Her character, so it's even easier to not be mad about. It's just a coincidence. They weren't even trying to sound like her!)

I read a lot of C&D letters from celebrities here and on Reddit, and a lot of them are in the form of "I am important so I am requesting that you do not take advantage of your legal rights." I am not a fan. (If you don't want someone to track how often you fly your private jet, buy a new one for each trip. That is the legal option that is available to you. But I digress...)

◧◩
2. pavlov+fM[view] [source] 2024-05-23 06:10:13
>>jrockw+wL
Surely there’s some kind of difference between “voice impression for a two-line cameo in one episode of an animated sitcom” and “reproducing your voice as the primary interface for a machine that could be used by billions of people and is worth hundreds of billions of dollars.”

Is there a name for this AI fallacy? The one where programmers make an inductive leap: for example, if a human can read one book to learn something, then it’s ok to scan millions of books into a computer system because it’s just another kind of learning.

◧◩◪
3. squigz+wM[view] [source] 2024-05-23 06:12:55
>>pavlov+fM
> for example, if a human can read one book to learn something, then it’s ok to scan millions of books into a computer system because it’s just another kind of learning.

Since this comes up all the time, I ask: What exactly is the number of books a human can ingest before it becomes illegal?

◧◩◪◨
4. dorkwo+IT[view] [source] 2024-05-23 07:13:50
>>squigz+wM
This is a bit like someone saying they don't want cars traveling down the sidewalk because they're too big and heavy, and then having someone ask how big and heavy a person needs to get before it becomes illegal for them to travel down the sidewalk.

It misses the point, which is that cars aren't people. Arguments like "well, a car uses friction to travel along the ground and fuel to create kinetic energy, just like humans do" aren't convincing to me. An algorithm is not a human, and we should stop pretending the same rules apply to both.

◧◩◪◨⬒
5. mike_h+fX[view] [source] 2024-05-23 07:42:47
>>dorkwo+IT
Is that a good example? People have been arguing in court about exactly that for years, first over the Segway and then over e-scooters and bikes. There are plenty of people who make arguments of the form "it's not a car or a bike, so I'm allowed on the sidewalk", or arguments about limited top speeds, etc.

◧◩◪◨⬒⬓
6. CRConr+Bp4[view] [source] 2024-05-24 11:24:46
>>mike_h+fX
> Is that a good example?

Yes. It is pertinent not only to this particular instance (or instances, plural: AI copyright violations and scooters on sidewalks), but also illustrates, for example, why treating corporations as "people" in freedom-of-speech law is misguided (and stupid, corrupt, and just fucking wrong). So it is a very good example.
