OpenAI's mistake was caving to SJ. They should have kept Sky and told SJ to get lost. If SJ sued, they could simply prove another voice actor was used and make the legitimate argument that SJ doesn't have a monopoly on voices similar to hers.
I think what’s going on here is that Scarlett is famous, and so media outlets will widely cover this. In other words, this latest incident hasn’t riled up people any more than usual — if you scan the comments, they’re not much different from how people already felt about OpenAI. But now there’s an excuse for everybody to voice their opinions simultaneously.
They’re acting like the company literally stole something.
It also didn’t help that OpenAI removed the Sky voice. Why would they do that unless they have something to hide? The answer of course is that Scarlett is famously rich, and a famously rich person can wage a famously expensive lawsuit against OpenAI, even if there’s no basis. But OpenAI should’ve paid the cost. Now it just looks like their hand was caught in some kind of cookie jar, even though no one can say precisely what kind of cookies were being stolen.
2. Ford explicitly hired an impersonator. OpenAI hired someone that sounded like her, and it’s her natural voice. Should movies be held to the same standard when casting their actors? This is about as absurd as saying that you’re not allowed to hire an actor to play a role.
OpenAI should’ve owned their actions. "Yes, we wanted to get a voice that sounded like the one from Her." There’s nothing wrong with that.
This whole thing is reminiscent of Valve threatening to sue S2 for allegedly making a similar character. Unsurprisingly, the threats went nowhere.
For the “her” reference(s?), was there anything beyond the single tweet?
The voice sounds remarkably like Scarlett Johansson's.
From a moral perspective, I can’t believe that people are trying to argue that someone’s voice should be protected under law. But that’s a personal opinion.
Waits v. Frito-Lay, Inc. was '92, and it cited Midler. They used a Tom Waits-sounding voice on an original song, and Waits successfully sued:
> Discussing the right of publicity, the Ninth Circuit affirmed the jury’s verdict that the defendants had committed the “Midler tort” by misappropriating Tom Waits’ voice for commercial purposes. The Midler tort is a species of violation of the right of publicity that protects against the unauthorized imitation of a celebrity’s voice which is distinctive and widely known, for commercial purposes.
https://tiplj.org/wp-content/uploads/Volumes/v1/v1p109.pdf
Of course, who knows what a court will find at the end of this. There is precedent, however.
How do you know?
That may not be how it should work, but it is very much how the law currently works.
Yes, they should not have reached out again, but now they are screwed. In no way will they want a trial and the associated discovery. SJ can write her own ticket here.
That’s annoying, but we live in a country with lots of annoying laws that we nonetheless abide by. In this case I guess OpenAI just didn’t want to risk losing a court battle.
I still think legal = moral is mistaken in general, and from a moral standpoint it’s bogus that OpenAI couldn’t replicate the movie Her. It would’ve been cool. But people can feel however they want to feel about it, and my personal opinion is worth about two milkshakes. Still, it’s strange to me that anyone has a problem with what they did.
That's assuming they did, right now they're asking us to pretty please trust them that their girlfriend from Canada is really real! She's real, you guys! No I can't show her to you.
Voice impersonation has been a settled matter for decades. It doesn't matter that they used another actress. What matters is that they tried to pass the voice off as SJ's voice several times.
Unfortunately a commenter pointed out that there’s legal precedent for protecting people’s voices from commercial usage specifically (thanks to a court case from four decades ago), so I probably wouldn’t have tried this. The cost of battling it out in the legal system outweighs the coolness factor of replicating Her. I personally feel it’s a battle worth winning, since it’s bogus that they have to worry about some annoyed celebrity, and your personal freedoms aren’t being trodden on in this case. But I can see why OpenAI would back down.
Now, if some company were e.g. trying to commercialize everybody’s voices at scale, this would be a different conversation. That should obviously not be allowed. But replicating a culturally significant voice is one of the coolest aspects of AI (have you seen those recreations of historical voices from other languages translated into English? If not, you’re missing out), and commercializing voices at scale isn’t what OpenAI did here.
If so, I have a bridge you might be interested in buying
Two, it’s bogus that conceptually this isn’t allowed. I’m already anti-IP — I think that IP is a tool that corporations wield to prevent us from using "their" ideas, not to protect us from being exploited as workers. And now this is yet another thing we’re Not Allowed To Do. Great, that sounds like a wonderful world, just peachy. Next time maybe we’ll stop people from monetizing the act of having fun at all, and then the circle of restrictions will be complete.
Or, another way of putting it: poor Scarlett, whatever will she do? Her voice is being actively exploited by a corporation. Oh no.
In reality, she’s rich, powerful, and will be absolutely fine. She’d get over it. The sole reason she’s allowed to act like a bully is that the law permits it (just barely, in this case, but there is one legal precedent) and everyone happens to hate or fear OpenAI, so people love rooting for their downfall and calling Sam an evil sociopath.
Someone, please, make me a moral, ethical argument why what they did here was wrong. I’m happy to change my mind on this. Name one good reason that they shouldn’t be allowed to replicate Her. It would’ve been cool as fuck, and sometimes it feels like I’m the only one who thinks so, other than OpenAI.
Or... hear me out... maybe they couldn't prove that, which is why they caved. Caved within a day or so of her lawyers asking "So if it's not SJ's voice, whose is it?"
They say so, yes. Seems like they didn't want to go through discovery in order to prove it.
Correct. While Midler presents a similar fact pattern and is a frequently taught and cited foundational case in this area, the case law has evolved since Midler toward even stronger protection of celebrity publicity rights, protection that is even more explicitly unconcerned with the mechanism by which the identity is appropriated. Waits v. Frito-Lay (1992), another case where a sound-alike voice was a specific issue, has been mentioned in the thread, but White v. Samsung Electronics America (1993) [0], while its fact pattern wasn't centered on sound-alike voice appropriation, may be more important in that it underlines that the mechanism of appropriation is immaterial so long as the appropriation can be shown:
--quote--
In Midler, this court held that, even though the defendants had not used Midler's name or likeness, Midler had stated a claim for violation of her California common law right of publicity because "the defendants … for their own profit in selling their product did appropriate part of her identity" by using a Midler sound-alike. Id. at 463-64.
In Carson v. Here's Johnny Portable Toilets, Inc., 698 F.2d 831 (6th Cir. 1983), the defendant had marketed portable toilets under the brand name "Here's Johnny"--Johnny Carson's signature "Tonight Show" introduction--without Carson's permission. The district court had dismissed Carson's Michigan common law right of publicity claim because the defendants had not used Carson's "name or likeness." Id. at 835. In reversing the district court, the Sixth Circuit found "the district court's conception of the right of publicity … too narrow" and held that the right was implicated because the defendant had appropriated Carson's identity by using, inter alia, the phrase "Here's Johnny." Id. at 835-37.
These cases teach not only that the common law right of publicity reaches means of appropriation other than name or likeness, but that the specific means of appropriation are relevant only for determining whether the defendant has in fact appropriated the plaintiff's identity. The right of publicity does not require that appropriations of identity be accomplished through particular means to be actionable. It is noteworthy that the Midler and Carson defendants not only avoided using the plaintiff's name or likeness, but they also avoided appropriating the celebrity's voice, signature, and photograph. The photograph in Motschenbacher did include the plaintiff, but because the plaintiff was not visible the driver could have been an actor or dummy and the analysis in the case would have been the same.
Although the defendants in these cases avoided the most obvious means of appropriating the plaintiffs' identities, each of their actions directly implicated the commercial interests which the right of publicity is designed to protect.
--end quote--
> Ford explicitly hired an impersonator. OpenAI hired someone that sounded like her, and it’s her natural voice.
Hiring a natural sound-alike voice vs. an impersonator as a mechanism is not the legal issue; the issue is the intent of the defendant in so doing (Ford in the Midler case, OpenAI in a hypothetical Johansson lawsuit) and the commercial effect of their doing so.
[0] https://law.justia.com/cases/federal/appellate-courts/F2/971...
So the overall argument isn't strange; you just disagree without having articulated exactly what biases you toward disagreeing. It is a moral disagreement, ultimately.
Actually, there's a similar court case from 1988 that creates legal precedent for her to sue.
"That's just one case! And it's from 1988! That's 36 years ago: rounded up, that's 4 decades!"
Actually, there's a court case from 1992 that built on that judgement and expanded it to define a specific kind of tort.
"That's bad law! Forget the law! I demand a moral justification."
Anyway, asking a person if you can make money off their identity, them saying no, and you going ahead and doing that anyway seems challenging to justify on moral grounds. I don't think you're willing to change your mind, your claim notwithstanding.
Unrelated, but as someone who came along into this world after Carson's Tonight Show, I had no idea that that moment from The Shining was a play on that. Today's lucky 10,000.
Which is a shame, since you had a decent argument.
Except it isn’t. Again, you’re acting like OpenAI tried to profit off of Scarlett. They tried to profit off of the portrayal she did in the movie Her. These are not the same thing, and treating them as interchangeable is some next level moral rationalization. One is taking advantage of someone. The other is what the movie industry is for.
Now, where’s this case from 1992 that expanded and defined the scope of this?
You can have an opinion on it, but they are going to get sued. Just like I can't take Moana and throw her in an ad where it says "I like [insert cereal here]", they can't take a character and use it without expecting Disney/whoever to come sue them.
Ahhh... so you admit OpenAI has been shady, but you argue they're actually ripping off Spike Jonze, not Scarlett Johansson?
HEH. The people who say Sam is shady aren't really interested in this distinction.
(And you're wrong, both ScarJo and the film own aspects of the character they created together.)
From her statement:
> I received an offer from Sam Altman, who wanted to hire me to voice the current ChatGPT 4.0 system. He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and AI. He said he felt that my voice would be comforting to people.
So, they wanted to profit off of her voice, as her voice is comforting. She said no, and they did it anyway. Nothing about, "come in and do that song and dance from your old movie."
> where’s this case from 1992
With Johansson voicing the AI. And now they're marketing their AI sounding like Johansson, referencing the movie that had Johansson voicing the AI.
Yeah, no similarities at all there.
Seems you must have a reason to want to believe them.
Otherwise you'd have noticed all the reasons not to.
Yeah, so stop doing that then.
So I guess you wouldn't mind if someone killed you, since laws against murder are much older than that? Shit, outmoded old boomer thinking, amirite?
Wow, when you realise how you're coming off here...
This is subjective. I, personally, don't hear it, at all: >>40435695