If so, I suspect they’ll be okay in a court of law — having a voice similar to a celebrity isn’t illegal.
It’ll likely cheese off actors and performers though.
[1] https://www.forbes.com/sites/roberthart/2024/05/20/openai-sa...
it takes more than money to fuel these types, and they would have far better minders and bumpers if the downside outweighed the upside. they aren’t stupid, just addicted.
musk was addict smart, owned up to his proclivities and bought the cartel.
"when voice is sufficient indicia of a celebrity's identity, the right of publicity protects against its imitation for commercial purposes without the celebrity's consent."
Is there a distinction?
Are they trying to make it sound like Her, or SJ? Or just trying to go for a similar style? i.e. making artistic choices in designing their product
Note: I've never watched the movie.
There's not a lot of precedent around voice impersonation, but there is precedent from a very, very similar case against Ford
Yes, that would be a copyright violation on top of everything else.
Great idea though!
I'm going to start selling Keanu Reeves T-Shirts using this little trick.
See, I'm not using Keanu's likeness if I don't label it as Keanu. I'm just going to write Neo in a Tweet, and then say I'm just cloning Neo's likeness.
Neo is not a real person, so Keanu can't sue me! Bwahahaha
Your argument may be stronger if OpenAI said something like “the movie studio owns the rights to this character’s likeness, so we approached them,” but it’s not clear they attempted that.
She may have something only if it turns out that the training set for that voice includes recordings of her (the person, not the movie), which I highly doubt and which is, unfortunately, extremely hard to prove. Even that wouldn't be much, though, as it could be ruled a derivative work, or something akin to a celebrity impersonator's act. Those guys can even advertise themselves using the actual names of the celebrities involved, and it's allowed.
Me personally, I hope she takes them to court anyway, as it will be an interesting trial to follow.
An interesting facet is that copyright law goes to the substance of the copyrighted work; in this case, because of the peculiarities of her character in "Her", she is pretty much only a voice. I wonder if that makes things look different in the eyes of a judge.
If they had just screened a bunch of voice actors and chosen the same one no one would care (legally or otherwise).
As to whether she or somebody else owns the rights to that performance, we'd have to read the contract; most likely she doesn't, though.
There's no way "you" (the people who engage in these tactics) believe anyone is gullible enough not to see what's happening. Either you believe yourselves to be exceedingly clever, or you think everyone else has the intelligence of a toddler.
With the gumption some tech "leaders" display, maybe both.
If you have to say "technically it's not" 5x in a row to justify a position in a social context just short-circuit your brain and go do something else.
OpenAI has gone the "it's easier to ask forgiveness than permission" route, and it seemed like they might get away with that, but if this results in a lot more stories like this they'll risk running afoul of public opinion and future legislation turning sharply against them.
(and given the timeline ScarJo laid out in her Twitter feed, I'd be inclined to vote to convict at the present moment)
The discovery process may help establish intent, especially any internal communication before and after the two(!) failed attempts to get her sign-off, as well as any notes shared with the people responsible for casting.
That falls under copyright, trademarks, ...
I think a lot of people are wondering about a situation (which clearly doesn’t apply here) in which someone was falsely accused of impersonation based on an accidental similarity. I have more sympathy for that.
But that’s giving OpenAI far more than just the benefit of the doubt: there is no doubt in this case.
Explicit brand reference? Bad. Circumstantial insinuation? Let it go.
...it wouldn't make any difference.
A Barack Obama figurine is a Barack Obama figurine, no matter how much you say that it's actually a figurine of Boback O'Rama, a random person who coincidentally looks identical to the former US President.
https://www.quimbee.com/cases/waits-v-frito-lay-inc
https://en.wikipedia.org/wiki/White_v._Samsung_Electronics_A....
This isn't actually complicated at all. OpenAI appropriated her likeness against her express will.
People who want to use an actor's likeness can't actually get around likeness rights by claiming they only impersonated a specific performance.
California Civil Code Section 3344(a) states:
Any person who knowingly uses another’s name, voice, signature, photograph, or likeness, in any manner, on or in products, merchandise, or goods, or for purposes of advertising or selling, or soliciting purchases of, products, merchandise, goods or services, without such person’s prior consent, or, in the case of a minor, the prior consent of his parent or legal guardian, shall be liable for any damages sustained by the person or persons injured as a result thereof.
Writing this comment mostly to say - damn, I didn't think about it this way, but I guess "either believe yourselves to be exceedingly clever or everyone else has the intelligence of a toddler" is indeed the mindset.
The only other alternative I can think of is "we all know it's BS, but do they have more money than us to spend on lawyers to call it out?" - which isn't much better TBH.
Not make a living posing for pictures without the consent of said celebrity?
Re: "get real" - the law is pretty real.
Things like parody are protected under fair use, explicitly.