zlacker

[return to "Statement from Scarlett Johansson on the OpenAI "Sky" voice"]
1. anon37+t5[view] [source] 2024-05-20 22:58:41
>>mjcl+(OP)
Well, that statement lays out a damning timeline:

- OpenAI approached Scarlett last fall, and she refused.

- Two days before the GPT-4o launch, they contacted her agent and asked that she reconsider. (Two days! This means they already had everything they needed to ship the product with Scarlett’s cloned voice.)

- Receiving no response, OpenAI demoed the product anyway, with Sam tweeting “her” in reference to Scarlett’s film.

- When Scarlett’s counsel asked for an explanation of how the “Sky” voice was created, OpenAI yanked the voice from their product line.

Perhaps Sam’s next tweet should read “red-handed”.

◧◩
2. Increa+8g[view] [source] 2024-05-20 23:59:08
>>anon37+t5
I wouldn't necessarily call that damning. "Soundalikes" are very common in the ad industry.

For example, a car company once approached the band Sigur Rós to include some of their music in a car commercial. Sigur Rós declined. A few months later the commercial aired with a song that sounded like an unreleased Sigur Rós track; in reality they had just paid a composer to write something that sounds like Sigur Rós but isn't. So maybe OpenAI just had a random lady with a voice similar to Scarlett's do the recording.

Taking down the voice could just be concern for bad press, or trying to avoid lawsuits regardless of whether you think you are in the right or not. Per this* CNN article:

> Johansson said she hired legal counsel, and said OpenAI “reluctantly agreed” to take down the “Sky” voice after her counsel sent Altman two letters.

So, Johansson's lawyers probably said something like "we'll sue your pants off if you don't take it down," and then they took it down. You can't use that as evidence that they are guilty. It could just as easily be the case that they didn't want to go to court over this even if they thought they were legally above board.

* https://www.cnn.com/2024/05/20/tech/openai-pausing-flirty-ch...

◧◩◪
3. dragon+Oh[view] [source] 2024-05-21 00:11:24
>>Increa+8g
> I wouldn't necessarily call that damning. "Soundalikes" are very common in the ad industry.

As are disclaimers that a celebrity voice is impersonated, used when additional context makes it likely the voice would be taken as something more than a mere soundalike — for instance, a direct reference, in the same publicity campaign, to a work the impersonated celebrity was involved in.

And liability for commercial voice appropriation, even by impersonation, is established law in some jurisdictions, including California.

◧◩◪◨
4. Increa+ao[view] [source] 2024-05-21 00:50:44
>>dragon+Oh
The most famous voice-appropriation case is Midler v. Ford, in which Ford paid a Midler impersonator to perform a well-known Midler song, creating the impression that it was actually Bette singing.

Where are the signs or symbols tying Scarlett to the OpenAI voice? I don't think a single-word, contextless message on a separate platform that 99% of OpenAI users will never see is significant enough to form that connection in users' heads.

◧◩◪◨⬒
5. pseuda+BC[view] [source] 2024-05-21 03:05:34
>>Increa+ao
The Midler v. Ford decision turned on her voice being distinctive. Not the song.

The replies to Altman's message show that readers did connect it to the film. And people noticed the voice sounded like Scarlett Johansson, and connected it to the film, when OpenAI introduced it back in September.[1]

How do you believe Altman intended people to interpret his message?

[1] https://www.reddit.com/r/ChatGPT/comments/177v8wz/i_have_a_r...

[go to top]