zlacker

[return to "Statement from Scarlett Johansson on the OpenAI "Sky" voice"]
1. anon37+t5[view] [source] 2024-05-20 22:58:41
>>mjcl+(OP)
Well, that statement lays out a damning timeline:

- OpenAI approached Scarlett last fall, and she refused.

- Two days before the GPT-4o launch, they contacted her agent and asked that she reconsider. (Two days! This means they already had everything they needed to ship the product with Scarlett’s cloned voice.)

- Having received no response, OpenAI demoed the product anyway, with Sam tweeting “her” in reference to Scarlett’s film.

- When Scarlett’s counsel asked for an explanation of how the “Sky” voice was created, OpenAI yanked the voice from their product line.

Perhaps Sam’s next tweet should read “red-handed”.

2. Increa+8g[view] [source] 2024-05-20 23:59:08
>>anon37+t5
I wouldn't necessarily call that damning. "Soundalikes" are very common in the ad industry.

For example, a car company approached the band Sigur Rós about including some of their music in a car commercial. Sigur Rós declined. A few months later the commercial aired with a song that sounded like an unreleased Sigur Rós track, but they had just paid a composer to make something that sounds like Sigur Rós without actually being them. So maybe OpenAI just had a random lady with a voice similar to Scarlett's do the recording.

Taking down the voice could just reflect concern about bad press, or a desire to avoid lawsuits regardless of whether they thought they were in the right. Per this* CNN article:

> Johansson said she hired legal counsel, and said OpenAI “reluctantly agreed” to take down the “Sky” voice after her counsel sent Altman two letters.

So, Johansson's lawyers probably said something like "we'll sue your pants off if you don't take it down", and then they took it down. You can't use that as evidence that they are guilty. It could just as easily be the case that they didn't want to go to court over this even if they thought they were legally above board.

* https://www.cnn.com/2024/05/20/tech/openai-pausing-flirty-ch...

3. romwel+4i[view] [source] 2024-05-21 00:12:46
>>Increa+8g
Sorry, that's an apples-to-pizzas comparison. You're conflating work and identity.

There's an ocean of difference between mimicking the style of someone's art in an original work, and literally cloning someone's likeness for marketing/business reasons.

You can hire someone to make art in the style of Taylor Swift, that's OK.

You can't start selling Taylor Swift figurines by the same principle.

What Sam Altman did, figuratively, was give out free T-shirts featuring a face that anyone who knows Taylor Swift would recognize as hers.

4. Increa+rn[view] [source] 2024-05-21 00:45:37
>>romwel+4i
But they aren't doing anything with her voice (allegedly?). They're doing something with a voice that some claim sounds like hers.

But if it isn't her voice, then it's more like selling a figurine called Sally that happens to look a lot like Taylor Swift. Sally has a right to exist even if she happens to look like Taylor Swift.

Has there ever been an up-and-coming artist who wasn't allowed to sell their own songs because they happened to sound a lot like an already famous artist? I doubt it.
