zlacker

[return to "OpenAI didn’t copy Scarlett Johansson’s voice for ChatGPT, records show"]
1. jrockw+wL[view] [source] 2024-05-23 06:00:48
>>richar+(OP)
I was perusing some Simpsons clips this afternoon and came across a story to the effect of "So and so didn't want to play himself, so Dan Castellaneta did the voice." It's a good impression and people didn't seem very upset about that. I am not sure how this is different. (Apparently this particular "impression" predates the Her character, so it's even easier to not be mad about. It's just a coincidence. They weren't even trying to sound like her!)

I read a lot of C&D letters from celebrities here and on Reddit, and a lot of them are in the form of "I am important so I am requesting that you do not take advantage of your legal rights." I am not a fan. (If you don't want someone to track how often you fly your private jet, buy a new one for each trip. That is the legal option that is available to you. But I digress...)

2. pavlov+fM[view] [source] 2024-05-23 06:10:13
>>jrockw+wL
Surely there’s some kind of difference between “voice impression for a two-line cameo in one episode of an animated sitcom” and “reproducing your voice as the primary interface for a machine that could be used by billions of people and is worth hundreds of billions of dollars.”

Is there a name for this AI fallacy? The one where programmers make an inductive leap like, for example, if a human can read one book to learn something, then it’s ok to scan millions of books into a computer system because it’s just another kind of learning.

3. squigz+wM[view] [source] 2024-05-23 06:12:55
>>pavlov+fM
> for example, if a human can read one book to learn something, then it’s ok to scan millions of books into a computer system because it’s just another kind of learning.

Since this comes up all the time, I ask: What exactly is the number of books a human can ingest before it becomes illegal?

4. dorkwo+IT[view] [source] 2024-05-23 07:13:50
>>squigz+wM
This is a bit like someone saying they don't want cars traveling down the sidewalk because they're too big and heavy, and then having someone ask how big and heavy a person needs to get before it becomes illegal for them to travel down the sidewalk.

It misses the point, which is that cars aren't people. Arguments like "well, a car uses friction to travel along the ground and fuel to create kinetic energy, just like humans do" aren't convincing to me. An algorithm is not a human, and we should stop pretending the same rules apply to each.

5. Dylan1+RW[view] [source] 2024-05-23 07:40:12
>>dorkwo+IT
It's easy to explain the difference between a person and a car in a way that's both specific and relevant to the rules.

If we're at an analogy to "cars aren't people", then it sounds like it doesn't matter how many books the AI reads; even one book would cause problems.

But if that's the case, why make the argument about how many books it reads?

Are you sure you're arguing the same thing as the ancestor post? Or do you merely agree with their conclusion but you're making an entirely different argument?
