- OpenAI approached Scarlett last fall, and she refused.
- Two days before the GPT-4o launch, they contacted her agent and asked that she reconsider. (Two days! This means they already had everything they needed to ship the product with Scarlett’s cloned voice.)
- Having received no response, OpenAI demoed the product anyway, with Sam tweeting “her” in reference to Scarlett’s film.
- When Scarlett’s counsel asked for an explanation of how the “Sky” voice was created, OpenAI yanked the voice from their product line.
Perhaps Sam’s next tweet should read “red-handed”.
That is what matters: OWNERSHIP over her contributions to the world.
If someone licenses an impersonator's voice and the result gets very close to the real thing, that feels like an impossible situation for a court to settle, and it should probably just be legal (if repugnant).
They literally hired an impersonator, and it cost them $2.5 million (roughly $6 million today).
https://www.latimes.com/archives/la-xpm-1990-05-09-me-238-st...
Frito-Lay copied a song by Tom Waits (with different lyrics) and had an impersonator sing it. Witnesses testified that they thought Waits had sung it.
If OpenAI were to anonymously copy someone's voice by training AI on an imitation, you wouldn't have:
- a recognizable singing voice
- music identified with a singer
- market confusion about whose voice it is (since it's novel audio coming from a machine)
I don't think any of this is ethical, and I think voice cloning should be outright illegal, but I also don't think we have good precedents for most AI issues.