https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
Bette Midler successfully sued Ford for imitating her voice in a commercial.
Then also:
https://casetext.com/case/waits-v-frito-lay-inc
Tom Waits successfully sued Frito Lay for using an imitator without approval in a radio commercial.
The key seems to be that if someone is famous and their voice is distinctly attributable to them, there is a case. In both of these cases, the artists in question were also solicited first and refused.
> In a novel case of voice theft, a Los Angeles federal court jury Tuesday awarded gravel-throated recording artist Tom Waits $2.475 million in damages from Frito-Lay Inc. and its advertising agency.
> The U.S. District Court jury found that the corn chip giant unlawfully appropriated Waits’ distinctive voice, tarring his reputation by employing an impersonator to record a radio ad for a new brand of spicy Doritos corn chips.
https://www.latimes.com/archives/la-xpm-1990-05-09-me-238-st...
A company can't take a photo from your Facebook and plaster it across an advertisement for their product without you giving them the rights to do that.
And if you're a known public figure, this includes lookalikes and soundalikes as well. You can't hire a ScarJo impersonator that people will think is ScarJo.
This is clearly a ScarJo soundalike. It doesn't matter whether it's an AI voice or clone or if they hired someone to sound just like her. Because she's a known public figure, that's illegal if she hasn't given them the rights.
(However, if you generate a synthetic voice that just happens to sound exactly like a random Joe Schmo, it's allowed because Joe Schmo isn't a public figure, so there's no value in the association.)
If you imitate Darth Vader, I don't think James Earl Jones has as much of a case for likeness as the Star Wars franchise does
And not see how over the top it is... c'mon.
If so, I suspect they’ll be okay in a court of law — having a voice similar to a celebrity isn’t illegal.
It’ll likely cheese off actors and performers though.
[1] https://www.forbes.com/sites/roberthart/2024/05/20/openai-sa...
https://www.hollywoodreporter.com/business/business-news/bac...
If you just want ScarJo's (or James Earl Jones') voice, you need the rights from them. Period.
If you want to reuse the character of her AI bot from the movie (her name, overall personality, tone, rhythm, catchphrases, etc.), or the character of Darth Vader, you also need to license that from the producers.
And also from ScarJo/Jones if you want the same voice to accompany the character. (Unless they've sold all rights for future re-use to the producers, which won't usually be the case, because they want to be paid for sequels.)
If they accidentally hired someone who sounds identical, that's not illegal. But if they intended to, even if it is a pretty poor imitation, it would be illegal because the intent to do it was there.
A court of law would be looking for things like emails about what sort of actress they were looking for, how they described that requirement, how they evaluated the candidate and selected her, and of course, how the CEO announced it alongside the title of a movie Scarlett starred in.
it takes more than money to fuel these types, and they would have far better minders and bumpers if the downside outweighed the upside. they aren’t stupid, just addicted.
musk was addict smart, owned up to his proclivities and bought the cartel.
"when voice is sufficient indicia of a celebrity's identity, the right of publicity protects against its imitation for commercial purposes without the celebrity's consent."
Is there a distinction?
Are they trying to make it sound like Her, or SJ? Or just trying to go for a similar style? i.e. making artistic choices in designing their product
Note: I've never watched the movie.
Just find someone who sounds like her, then hire them for the rights to their voice.
There's not a lot of precedent around voice impersonation, but there is for a very, very similar case against Ford
Yes, that would be a copyright violation on top of everything else.
Great idea though!
I'm going to start selling Keanu Reeves T-Shirts using this little trick.
See, I'm not using Keanu's likeness if I don't label it as Keanu. I'm just going to write Neo in a Tweet, and then say I'm just cloning Neo's likeness.
Neo is not a real person, so Keanu can't sue me! Bwahahaha
There is no doubt that the hired actor was an impersonator; this was explicitly stated by sama himself.
Your argument may be stronger if OpenAI said something like “the movie studio owns the rights to this character’s likeness, so we approached them,” but it’s not clear they attempted that.
She may have something only if it turns out that the training set for that voice is composed of some recordings of her (the person, not the movie), which I highly doubt and is, unfortunately, extremely hard to prove. Even that wouldn't be much, though, as it could be ruled a derivative work, or something akin to any celebrity impersonator. Those guys can even advertise themselves using the actual name of the celebrities involved and it's allowed.
Me personally, I hope she takes them to court anyway, as it will be an interesting trial to follow.
An interesting facet is that copyright law goes to the substance of the copyrighted work; in this case, because of the peculiarities of her character in "Her", she is pretty much only a voice. I wonder if that makes things look different in the eyes of a judge.
If they had just screened a bunch of voice actors and chosen the same one no one would care (legally or otherwise).
As to whether she owns the rights of that performance or somebody else, we'd have to read the contract; most likely she doesn't, though.
There's no way "you" (the people that engage in these tactics) believe anyone is gullible enough not to see what's happening. You either believe yourselves to be exceedingly clever or everyone else to have the intelligence of a toddler.
With the gumption some tech "leaders" display, maybe both.
If you have to say "technically it's not" 5x in a row to justify a position in a social context just short-circuit your brain and go do something else.
OpenAI has gone the "it's easier to ask forgiveness than permission" route, and it seemed like they might get away with that, but if this results in a lot more stories like this they'll risk running afoul of public opinion and future legislation turning sharply against them.
(and given the timeline ScarJo laid out in her Twitter feed, I'd be inclined to vote to convict at the present moment)
The discovery process may help establish intent - especially any internal communications before and after the two(!) failed attempts to get her sign-off, as well as any notes shared with the people responsible for casting.
That’s why she was the voice actor for the AI voice in Her.
In the case of acquiring a likeness, if it's done legally you acquire someone else's likeness that happens to be shared with your target.
The likeness is shared and non-unique.
If your objective is to take someone's life, there is no other pathway to the objective but their life. With likeness, that isn't the case.
That falls under copyright, trademarks, ...
OpenAI didn't just use a voice like Scarlett Johansson's. They used it in an AI system they wanted people to associate with AI from movies and the movie where Johansson played an AI particularly.[1][2]
I think a lot of people are wondering about a situation (which clearly doesn’t apply here) in which someone was falsely accused of impersonation based on an accidental similarity. I have more sympathy for that.
But that’s giving OpenAI far more than just the benefit of the doubt: there is no doubt in this case.
Hiring someone with a voice you want isn't illegal; hiring someone with a voice you want because it is similar to a voice that someone expressly denied you permission to use is illegal.
Actually, it's so foundational to the common law legal system that there's a specialized Latin term to represent the concept: mens rea (literally 'guilty mind').
And here's some caselaw where another major corporation got smacked down for doing the exact same thing: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
But given how unscrupulous Sam Altman appears to be, I wouldn't be surprised if OpenAI hired an impersonator as some kind of half-assed legal cover and went about using Johansson's voice anyway. Tech people do stupid shit sometimes because they assume they're so much cleverer than everyone else.
But I'm not a lawyer of any sort either, so... ::shrug::
Yes, it will be interesting in June 1988 when we will find out "where this lands": https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
Tech Company: At long last, we have created the Torment Nexus from classic sci-fi novel Don't Create The Torment Nexus
https://x.com/AlexBlechman/status/1457842724128833538?lang=e...
Plus, the tone of the voice is likely an unimportant detail to their success. So pushing up against the legal boundaries in this specific domain is at best strange and at worst a huge red flag for their ethics and how they operate.
Explicit brand reference? Bad. Circumstantial insinuation? Let it go.
She was used in Her because she has a dry/monotone/lifeless form of diction that at the time seemed like a decent stand-in for a non-human AI.
IMDB is riddled with complaints about her vocal style/diction/deadpan in every one of her movies. Ghost World, Ghost in the Shell, Lost in Translation, Comic-Book-Movie-1-100 -- take a line from one movie and dub it across the character of another and most people would be fooled; that's impressive given the breadth of quality/style/age across the movies.
When she was first on the scene I thought it was bad acting, but then it continued -- now I tend to think that it's an effort to cultivate a character personality similar to Steven Wright or Tom Waits; the fact that she's now litigating to protect her character and likeness reinforces that impression for me.
It's unique to her though, that's for sure.
...it wouldn't make any difference.
A Barack Obama figurine is a Barack Obama figurine, no matter how much you say that it's actually a figurine of Boback O'Rama, a random person who coincidentally looks identical to the former US President.
https://www.quimbee.com/cases/waits-v-frito-lay-inc
https://en.wikipedia.org/wiki/White_v._Samsung_Electronics_A....
This isn't actually complicated at all. OpenAI took her likeness against her express will.
Do you have a source for this?
People who want to use an actor's likeness can't get around likeness rights by claiming they were actually impersonating a specific performance.
California Civil Code Section 3344(a) states:
Any person who knowingly uses another’s name, voice, signature, photograph, or likeness, in any manner, on or in products, merchandise, or goods, or for purposes of advertising or selling, or soliciting purchases of, products, merchandise, goods or services, without such person’s prior consent, or, in the case of a minor, the prior consent of his parent or legal guardian, shall be liable for any damages sustained by the person or persons injured as a result thereof.
Writing this comment mostly to say - damn, I didn't think about it this way, but I guess "either believe yourselves to be exceedingly clever or everyone else has the intelligence of toddler" is indeed the mindset.
The only other alternative I can think of is "we all know it's BS, but do they have more money than us to spend on lawyers to call it out?" - which isn't much better TBH.
Not make a living posing for pictures without the consent of said celebrity?
Re: "get real" - the law is pretty real.
Things like parody are protected under fair use, explicitly.
Yeah not moving from my position at all. Just a very generic featureless female voice. I suppose I hear some similarities in timbre, but it’s such an unremarkable voice and diction that it’s hard to put your finger on anything past “generic low affect American alto”.
It’s a great computer voice. Taking it down is for sure the right call PR-wise, regardless of what they may have done.