zlacker

[return to "OpenAI didn’t copy Scarlett Johansson’s voice for ChatGPT, records show"]
1. jmull+P12[view] [source] 2024-05-23 15:22:46
>>richar+(OP)
Well, here are some things that aren't really being disputed:

* OpenAI wanted an AI voice that sounds like SJ

* SJ declined

* OpenAI got an AI voice that sounds like SJ anyway

I guess they want us to believe this happened without shenanigans, but it's a bit hard to.

The headline of the article is a little funny, because records can't really show they weren't looking for an SJ sound-alike; they can only show that those records didn't mention it. The key decision-makers could simply have agreed to keep that fact close to the vest -- they may well have understood that knocking off a high-profile actress was legally perilous.

Also, I think we can readily assume OpenAI understood that one of their potential voices sounded a lot like SJ. Since they were pursuing her, they must have had a pretty good idea of what they were going after, especially considering the likely price tag. So even if an SJ voice wasn't the original goal, it clearly became an important one. They surely listened to demos from many voice actors, auditioned a number of them, and may even have recorded many of them, but somehow the one they selected for release seemed to sound a lot like SJ.

2. spiral+5b2[view] [source] 2024-05-23 16:03:27
>>jmull+P12
Sky does not really sound like SJ, though, if you listen side by side. According to OAI's timeline, they intended to have Sky in addition to SJ. OAI's voice models, including Sky, predate the GPT-4o voice assistant. Also:

"In a statement from the Sky actress provided by her agent, she wrote that at times the backlash “feels personal being that it’s just my natural voice and I’ve never been compared to her by the people who do know me closely.”"

It did not seem like an issue before, and the Sky voice was public many months before GPT-4o. I don't believe SJ can claim to own all young, attractive women's voices, whether they are used as voice assistants or not. It seems like the issue is being blown out of proportion, though it does make a good story. The public perception of AI right now is generally negative, and people are looking for reasons to disparage AI companies. Maybe there are good reasons sometimes, but this isn't one of them.

3. buu700+vi2[view] [source] 2024-05-23 16:41:29
>>spiral+5b2
I'm also curious: legally speaking, is it an issue even if Sky's actress does sound like Scarlett? What if OpenAI admits they intentionally chose someone who sounded like Scarlett? Does it matter whether she was using her natural speaking voice or intentionally mimicking Scarlett's voice and mannerisms?

This seems similar to the latest season of Rick and Morty. Whether or not it was justified in that particular case, it rubs me the wrong way in principle that a production can fire someone only to hire someone else to do a near-perfect copy of their likeness. If (as in the OpenAI case) they'd gone further and trained an AI on impressions of Justin's voice, would that have been considered an AI impersonation of Justin with extra steps?

All of which is to say, this seems like a pretty interesting legal question to me, and potentially broader than just AI.

4. sowbug+mp2[view] [source] 2024-05-23 17:15:50
>>buu700+vi2
The fired actor would have already signed away any claim to the character's likeness. The likeness the company cares about is that of the character, not of the actor portraying the character. The actor never owned the character, so the actor shouldn't be miffed that someone else gets the part for future performances.
5. buu700+8r2[view] [source] 2024-05-23 17:24:27
>>sowbug+mp2
That's probably the case. Having said that, there are also a lot of one-off side characters that use Justin's distinctive voice style, though I can't remember specifically whether that was the case in the latest season, and I'm not aware that detailed information about their internal agreements is public knowledge either way. I was speaking more about the general principle than that particular situation. Maybe I'm wrong, but it seems like there are some interesting overlapping legal and moral dilemmas here, regardless of what the specific facts of the OpenAI and R&M cases may be.
6. sowbug+sE2[view] [source] 2024-05-23 18:39:23
>>buu700+8r2
Yes, I can see a plausible argument that a character is so intertwined with a well-known real-life persona that a company can't replicate a character without borrowing some of the persona's value. One might also make the case that the actor developed so much depth in an initially thin character that they deserve more credit than just acting the part.

I don't personally subscribe to the notion that the recent legal invention of intellectual property is a moral right. Capitalism has been doing just fine as a productivity motivator; we don't need to turn the expression of ideas into capital, let alone pure ideas. I accept the tradeoff of the temporary monopolies of copyright and patent, and I appreciate that trademark and trade secrets disincentivize bad behavior. But I have no desire to find new boxes in which to store new kinds of intellectual property, like Scarlett Johansson's right to monopolize performances of a character in an app that remind people of her performance of a character in a movie. Such a property right is not necessary.
