zlacker

[return to "OpenAI didn’t copy Scarlett Johansson’s voice for ChatGPT, records show"]
1. jmull+P12[view] [source] 2024-05-23 15:22:46
>>richar+(OP)
Well, here are some things that aren't really being disputed:

* OpenAI wanted an AI voice that sounds like SJ

* SJ declined

* OpenAI got an AI voice that sounds like SJ anyway

I guess they want us to believe this happened without shenanigans, but it's a bit hard to.

The headline of the article is a little funny, because records can't really show they weren't looking for an SJ sound-alike. They can just show that those records didn't mention it. The key decision-makers could simply have agreed to keep that fact close to the vest -- they may well have understood that knocking off a high-profile actress was legally perilous.

Also, I think we can readily assume OpenAI understood that one of their potential voices sounded a lot like SJ. Since they were pursuing her they must have had a pretty good idea of what they were going after, especially considering the likely price tag. So even if an SJ voice wasn't the original goal, it clearly became an important goal to them. They surely listened to demos for many voice actors, auditioned a number of them, and may even have recorded many of them, but somehow they selected one for release who seemed to sound a lot like SJ.

◧◩
2. HarHar+r82[view] [source] 2024-05-23 15:52:05
>>jmull+P12
Clearly an SJ voice was the goal, given that Altman asked her to do it, asked her a second time just two days before the ChatGPT-4o release, and then tweeted "her" on the release day. The next day Karpathy, recently ex-OpenAI, then tweets "The killer app of LLMs is Scarlett Johansson".

Altman appears to be a habitual liar. Note his recent claim not to be aware of the non-disparagement and claw-back terms he had departing employees agree to. Are we supposed to believe that the company lawyer or head of HR did this without consulting (or more likely being instructed by) the co-founder and CEO?!

◧◩◪
3. tptace+B82[view] [source] 2024-05-23 15:52:43
>>HarHar+r82
They hired the actor who did the voice months before they contacted SJ. The reaction on this site to the news that this story was false is kind of mindbending.
◧◩◪◨
4. summer+6m2[view] [source] 2024-05-23 16:58:04
>>tptace+B82
My guess: Sam wanted to imitate the voice from Her and became aware of cases like Midler v. Ford, so he reached out to SJ. He probably didn't expect her to decline. Anyway, that precedent shows that you cannot mimic another person's voice without their permission, and the overall timeline indicates OpenAI's "intention" to imitate. It doesn't matter whether they used SJ's voice in the training set or not. Their intention matters.
◧◩◪◨⬒
5. tptace+kX2[view] [source] 2024-05-23 20:19:34
>>summer+6m2
A random person's normal speaking voice is nobody's intellectual property. The burden would have been on SJ to prove that the voice actor they hired was "impersonating" SJ. She was not: the Washington Post got her to record a voice sample to illustrate that she wasn't doing an impersonation.

Unless and until some other shoe drops, what we know now strongly -- overwhelmingly, really -- suggests that there was simply no story here. But we are all biased towards there being an interesting story behind everything, especially when it ratifies our casting of good guys and bad guys.

◧◩◪◨⬒⬓
6. tangen+eh3[view] [source] 2024-05-23 22:29:47
>>tptace+kX2
You’re right that a random person’s voice is not IP, but SJ is not a random person. She’s much more like Mr. Waits or Ms. Midler than you or I.

I don’t believe the burden would be to prove that the voice actor was impersonating, but that she was misappropriating. Walking down the street sounding like Bette Midler isn’t a problem but covering her song with an approximation of her voice is.

You are dead right that the order of operations recently uncovered precludes misappropriation. But it’s an interesting situation otherwise, hypothetically, to wonder if using SJ’s voice to “cover” her performance as the AI in the movie would be misappropriation.

◧◩◪◨⬒⬓⬔
7. qarl+Ez3[view] [source] 2024-05-24 00:59:55
>>tangen+eh3
> You are dead right that the order of operations recently uncovered precludes misappropriation.

I don't think that follows. It's entirely possible that OpenAI wanted to get ScarJo but believed that simply wasn't possible, so they went with a second choice. Later they decided they might as well try anyway.

This scenario does not seem implausible in the least.

Remember, Sam Altman has stated that "Her" is his favorite movie. It's inconceivable that he never considered marketing his very similar product using the film's IP.

◧◩◪◨⬒⬓⬔⧯
8. tangen+4T5[view] [source] 2024-05-24 21:36:44
>>qarl+Ez3
I interpret it completely differently given that the voice actor does not sound like SJ.

1. OpenAI wants to make a voice assistant.

2. They hire the voice actor.

3. Someone at OpenAI wonders why they would make a voice assistant that doesn’t sound like the boss’s favorite movie.

4. They reach out to SJ, who tells them to pound sand.

Accordingly, there is no misappropriation because there is no use.

◧◩◪◨⬒⬓⬔⧯▣
9. qarl+226[view] [source] 2024-05-24 22:59:44
>>tangen+4T5
I understand that the voice actor does not sound like ScarJo to you.

But you need to understand that it does sound like ScarJo to a lot of people. Maybe 50% of the people who hear it.

Those kinds of coincidences are the things that make you lose in court.

◧◩◪◨⬒⬓⬔⧯▣▦
10. throwt+qP6[view] [source] 2024-05-25 11:42:45
>>qarl+226
The voice is different enough that anyone who listens to samples longer than 5 seconds side by side and says they can’t tell them apart is obviously lying.

All the reporting around this I’ve seen uses incredibly short clips. There are hours of recorded audio of SJ speaking and there are lots of examples of the Sky voice out there since it’s been around since September.

◧◩◪◨⬒⬓⬔⧯▣▦▧
11. qarl+I67[view] [source] 2024-05-25 14:48:02
>>throwt+qP6
To elaborate on the other comment -

It doesn't even need to sound like the person. It's about the intent. Did OpenAI intend to imitate ScarJo?

Half the people in the world thinking it's ScarJo is strong evidence that it's not an accident.

Given that "Her" is Sam's favorite movie, and that he cryptically tweeted "her" the day it launched, and that he reached out to ScarJo to do the voice, and that the company reached out to her again to reconsider two days before the launch -

I personally think the situation is very obvious. I understand that some people strongly disagree - but then there are some people who think the Earth is flat. So.

◧◩◪◨⬒⬓⬔⧯▣▦▧▨
12. foobar+1Z7[view] [source] 2024-05-25 23:55:50
>>qarl+I67
I don't think intent matters (though it's moot in this case, because I think there is clear intent): if someone unknowingly used a celebrity's likeness, I think the celebrity would still be able to prohibit its use, since the idea is that they have a "right" to its use in general, not that they have a defence against being wronged by a particular person.

For example, if someone anonymously used a celebrity's likeness to promote something, you wouldn't need to identify the person (which would be necessary to prove intent) in order to have the offending material removed or prevented from being distributed.

◧◩◪◨⬒⬓⬔⧯▣▦▧▨◲
13. qarl+jU8[view] [source] 2024-05-26 14:05:45
>>foobar+1Z7
> I don't think the intent matters

The closing statement of Midler v. Ford is:

"We hold only that when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California."

Deliberate is a synonym for intentional.

◧◩◪◨⬒⬓⬔⧯▣▦▧▨◲◳
14. foobar+BC9[view] [source] 2024-05-26 20:16:30
>>qarl+jU8
The passage you cited reads "we hold only that when" (compare with "we hold that only when"), which I understand to mean that they are defining the judgment narrowly and punting on other questions (like whether the judgment would be different if there were no intent), as courts often do. In fact the preceding sentence is "We need not and do not go so far as to hold..."

It might make sense for intent to be required in order to receive damages, but it would surprise me if you couldn't stop an inadvertent use of someone's likeness. In fact the Midler opinion cites the Motschenbacher case: 'The defendants were held to have invaded a "proprietary interest" of Motschenbacher in his own identity.' I think you can invade someone's "proprietary interest" inadvertently, just as you can take someone's property inadvertently; and courts can impose a corrective in both cases, in the first by ordering that the invasion of proprietary interest be stopped and in the second by returning the taken property.

◧◩◪◨⬒⬓⬔⧯▣▦▧▨◲◳⚿
15. qarl+4V9[view] [source] 2024-05-26 23:01:33
>>foobar+BC9
Fair enough. But then Midler v Ford doesn't support your argument. Do you have a case that does?
◧◩◪◨⬒⬓⬔⧯▣▦▧▨◲◳⚿⛋
16. foobar+qmc[view] [source] 2024-05-27 22:55:20
>>qarl+4V9
No. (I did cite the Motschenbacher statement about "proprietary interest", which I think supports my argument.)

I'm not familiar with all the case law, but I assume no case has been brought that directly speaks to the issue. People can and do discuss cases that don't yet have specific precedent.

◧◩◪◨⬒⬓⬔⧯▣▦▧▨◲◳⚿⛋⬕
17. qarl+rqc[view] [source] 2024-05-27 23:32:04
>>foobar+qmc
Well - sure - for exotic areas of the law. Can the president pardon himself, etc.

Just seems like this area isn't that exotic.

◧◩◪◨⬒⬓⬔⧯▣▦▧▨◲◳⚿⛋⬕⬚
18. foobar+Jtc[view] [source] 2024-05-28 00:03:50
>>qarl+rqc
I don't think that's true. I can't cite them off the top of my head, but when I read about Supreme Court cases, often a big point of contention is whether the court decided to issue a narrow or broad ruling. Sometimes they decide to hear a case or not based on whether it would serve as a good basis for the type of ruling (narrow or broad) they want to issue.

And the legal universe is vast, with new precedent being made every year, so I don't think the corpus of undecided law is confined to well-known legal paradoxes.

As for this case, it doesn't seem that odd to me that the question of intent has never been decided: I would expect that typically the intent is obvious (as it is in the OpenAI case), so no one has ever had to decide whether it mattered.
