It seems she has every reason to benefit from claiming Sky sounded like her even if it was a coincidence. "Go away" payments are very common, even for celebrities - and OpenAI has deep pockets...
Even so, if they got a voice actor to impersonate or sound similar to Johansson, is that something that's not allowed?
If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?
- It's fine to hire a voice actor.
- It's fine to train a system to sound like that voice actor.
- It's fine to hire a voice actor who sounds like someone else.
- It's probably fine to go out of your way to hire a voice actor who sounds like someone else.
- It's probably not fine to hire a voice actor and tell them to imitate someone else.
- It's very likely not fine to market your AI as "sounds like Jane Doe, who sounds like SJ".
- It's definitely not fine to market your AI as "sounds like SJ".
Say I wanted to make my AI voice sound like Patrick Stewart. Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising. If so, would it have been OK for OpenAI to do all this as long as they didn't mention SJ? Or is SJ so clearly identifiable with her role in "Her" that it's never OK to try to make a product like "Her" that sounds like SJ?
Pretty sure this is fine, otherwise cartoons like The Simpsons or South Park would've gotten in trouble years ago.
This particular area of law, or even just this type of "fairness", is by necessity very muddy. There isn't a set of well-defined rules you can follow that will guarantee an outcome where everyone is happy no matter what; sometimes you have to step back and evaluate how people feel about things at various steps along the way.
I'd speculate that OAI's attempts to reach out to SJ are probably the result of those evaluations - "this seems like it could make her people upset, so maybe we should pay her to not be mad?"
Thanks to Sam, this OpenAI case is clearer than others, since he created a good deal of clear evidence against himself.
I'll defer to a judge and jury about the legalities. As you noted, Sam gave them a lot of help.
Not necessarily, when you're hiring them because they sound like someone else, especially someone else who has said that they don't want to work with you. OpenAI took enough steps to show they wanted someone who sounded like SJ.
> Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising.
See https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co. and also Tom Waits vs. Frito-Lay.
> as long as they didn't mention SJ
Or tried to hire SJ repeatedly, even as late as 2 days before the launch.
You can make a Saturday Night Live sketch making fun of Darth Vader.
You cannot use a Darth Vader imitator to sell light sabers.
You can hear them here:
Midler version https://www.youtube.com/watch?v=WFVhL0jbutU&t=22s
Car ad https://youtu.be/hxShNrpdVRs
As for your second question, yes. Otherwise you'd have a perfect workaround that would mean a person's likeness is a free-for-all to use, but we already decided that is not acceptable.
It protects celebrities who rely on endorsements and "who they are" for income.
It very clearly prohibits copycats with near-likeness as a workaround to getting permission from a celebrity.
OpenAI asked SJ to use her voice. That right there helps her case immensely.
She said no. They went ahead anyway, presumably with someone (or several someones) with a similar voice.
They publicized the product by referencing SJ.
These facts are damning.
They might be just a part of the story. Maybe 100 actresses, all sounding roughly the same, were given the offer over a two year period.
Maybe they all were given the same praise. Maybe one other, who signed an agreement, was praised on social media much more.
But this isn’t a slippery slope or a grey area. SJ was asked and said no.
That prohibits using a similar-sounding copycat and publicizing it as SJ.
Unless and until some third shoe drops, what we know now strongly (overwhelmingly, really) suggests that there was simply no story here. But we are all biased towards there being an interesting story behind everything, especially when it ratifies our casting of good guys and bad guys.
Then I'd say you have a point. But given all the other info, I'd have to say you're in denial.
Correct, that is not allowed in the US.
No.
The fact that it sounds very much like her, and that it is for a virtual assistant that clearly draws a parallel to the virtual assistant voiced by SJ in the movie (and it was not a protected use like parody), makes it not OK and not legal.
> Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising.
Nope.
If you made it sound identical to Patrick Stewart that would also likely not be OK or legal since his voice and mannerisms are very distinctive.
If you made it sound kind of like Patrick Stewart that is where things get really grey and it is probably allowed (but if you're doing other things to draw parallels to Patrick Stewart / Star Trek / Picard then that'd make your case worse).
And the law deals with grey areas all the time. You can drag experts in voice and language into the court to testify as to how similar or dissimilar the two voices and mannerisms are. It doesn't nullify the law that there's a grey area that needs to get litigated over.
The things that make this case a slam dunk are that there's the relevant movie, plus there's the previous contact to SJ, plus there's the tweet with "Her" and supporting tweets clearly conflating the two together. You don't even really need the expert witnesses in this case because the behavior was so blatant.
And remember that you're not asking a computer program to analyze two recordings and determine similarity or dissimilarity in isolation. You're asking a judge to determine if someone was ripping off someone else's likeness for commercial purposes, and that judge will absolutely use everything they've learned about human behavior in their lifetime to weigh what they think was actually going on, including all the surrounding human context to the two voices in question.
Then there's no actual confusion.
Actually it only says they reviewed "brief recordings of her initial voice test", which I assume refers to the voice test she did for OpenAI.
The "impersonating SJ" thing seems a straw man someone made up. The OpenAI talent call was for "warm, engaging, charismatic" voices sounding like 25-45 yr old (I assume SJ would have qualified, given that Altman specifically wanted her). They reviewed 400 applicants meeting this filtering criteria, and it seems threw away 395 of the ones that didn't remind Altman of SJ. It's a bit like natural selection and survival of the fittest. Take 400 giraffes, kill the 395 shortest ones, and the rest will all be tall. Go figure.
She doesn't need "go away" payments, and in any case that is not what we're looking at here. OpenAI offered her money to take the part, and she said no.
According to celebrity net worth website, SJ is worth $165M.
Taking the recent screwup out of the equation... It's tough. A commercial product shouldn't try to associate itself with another brand. But if we're being realistic: "Her" is nearly uncopyrightable.
Since Her isn't a tech brand, it would be hard for some future company to get in trouble based on that association alone. Kind of like how Skynet in theory could have been taken by a legitimate tech company, and how the Terminator IP owner would struggle to seek reparations (to satisfy curiosity, Skynet is a US government program, so that's already taken care of).
As long as you don't leave a trail, you can probably get away with copying Stewart. But if you start making Star Trek references (even if you never contacted Stewart), you're stepping in hot water.
I don’t believe the burden would be to prove that the voice actor was impersonating, but that she was misappropriating. Walking down the street sounding like Bette Midler isn’t a problem but covering her song with an approximation of her voice is.
You are dead right that the order of operations recently uncovered precludes misappropriation. But it’s an interesting situation otherwise, hypothetically, to wonder if using SJ’s voice to “cover” her performance as the AI in the movie would be misappropriation.
I don't think that follows. It's entirely possible that OpenAI wanted to get ScarJo, but believed that simply wasn't possible so went with a second choice. Later they decided they might as well try anyway.
This scenario does not seem implausible in the least.
Remember, Sam Altman has stated that "Her" is his favorite movie. It's inconceivable that he never considered marketing his very similar product using the film's IP.
> She doesn't need "go away" payments
> According to celebrity net worth website, SJ is worth $165M.
I have no idea what Johansson's estimated net worth or her acting career has to do with this. Wealthy people sue all the time for all kinds of ridiculous things.
The voice is, in fact, not Johansson's. Yet, it appears she will be suing them nonetheless...
It's not illegal to sound like someone else - despite what people might be claiming. If it turns out to be true that Sky's voice actor was recorded prior to the attempted engagement with Johansson, then all of this is extra absurd.
Also, Sky doesn't sound like Johansson anyway... but apparently that isn't going to matter in this situation.
That is not decided. There have been high-profile cases where someone's likeness was explicitly used without permission, and they still had no recourse. It was argued the person was notable enough that they could not protect their likeness.
Regardless, it appears to be debated whether Sky even sounded like Johansson, which will make this very difficult for anyone to argue (being subjective and all). If the Sky voice actor was recorded prior to engaging with Johansson (which has been claimed by OpenAI), then it seems even more difficult to argue.
In the end, this will net Johansson a nice "go away" payday and then everyone will forget about it.
Yes
> Say I wanted to make my AI voice sound like Patrick Stewart
Don't tweet "engage" of "boldly go where no man has gone before" when you release the product and you should be ok.
Even if this did go down the way you suppose, once they realized the obvious similarities, the ethical thing to do was to not use the voice. It doesn’t matter if the intention was pure. It doesn’t matter if it was an accident.
Isla Fisher / Amy Adams
Margot Robbie / Samara Weaving / Jaime Pressly
Justin Long / Ezra Miller
Not to mention all of the token/stereotype characters where it hardly matters who the actor is at all. Need to fill the funny fat lady trope? If Rebel Wilson wasn't available, maybe they can get Melissa McCarthy.
The voice from Her isn't even the first voice I'd think of for a female computer voice. That trope has been around for decades. I'm sure OpenAI just wanted SJ specifically because she's currently one of the most popular celebrities in the world.
This is where your post breaks down. Many people say they don't think the voice sounds like SJ. Others do. But it appears you've made up your mind that they deliberately emulated her voice?
> The appellate court ruled that the voice of someone famous as a singer is distinctive to their person and image and therefore, as a part of their identity, it is unlawful to imitate their voice without express consent and approval. The appellate court reversed the district court's decision and ruled in favor of Midler, indicating her voice was protected against unauthorized use.
1. OpenAI wants to make a voice assistant.
2. They hire the voice actor.
3. Someone at OpenAI wonders why they would make a voice assistant that doesn't sound like the boss's favorite movie.
4. They reach out to SJ, who tells them to pound sand.
Accordingly, there is no misappropriation because there is no use.
But you need to understand that it does sound like ScarJo to a lot of people. Maybe 50% of the people who hear it.
Those kinds of coincidences are the things that make you lose in court.
Bryce Dallas Howard & Jessica Chastain
Selena Gomez & Lucy Hale
Amy Adams & Isla Fisher
Keira Knightley & Natalie Portman
Not to mention that anytime an actor ages out of the early adulthood age range, a lookalike begins to play the same roles.
If you can't think of any examples, it's probably because you haven't been able to tell them apart yourself.
All the reporting around this I’ve seen uses incredibly short clips. There are hours of recorded audio of SJ speaking and there are lots of examples of the Sky voice out there since it’s been around since September.
"We hold only that when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California."
Ford having multiple ads would not have changed the determination.
It doesn't even need to sound like the person. It's about the intent. Did OpenAI intend to imitate ScarJo?
Half the people in the world thinking it's ScarJo is strong evidence that it's not an accident.
Given that "Her" is Sam's favorite movie, and that he cryptically tweeted "her" the day it launched, and that he reached out to ScarJo to do the voice, and that the company reached out to her again to reconsider two days before the launch -
I personally think the situation is very obvious. I understand that some people strongly disagree - but then there are some people who think the Earth is flat. So.
For example, if someone anonymously used a celebrity's likeness to promote something, you wouldn't need to identify the person (which would be necessary to prove intent) in order to have the offending material removed or prevented from being distributed.
And yet so many people think it does. What a weird coincidence.
> there’s a fair amount of proof to back up the claim that it wasn’t meant to
Sam has said his favorite movie is "Her". Sam tweeted "her" the day it was released. Sam wrote to ScarJo to try to get her to do the voice. OpenAI wrote to her two days before the release to try to get her to change her mind. A large percentage of the people who hear the voice thinks it sounds like ScarJo.
I think we're just going to have to agree to disagree about what the evidence says. You take care now.
The closing statement of Midler v. Ford is:
"We hold only that when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California."
Deliberate is a synonym for intentional.
It might make sense for intent to be required in order to receive damages, but it would surprise me if you couldn't stop an inadvertent use of someone's likeness. In fact the Midler opinion cites the Motschenbacher case: 'The defendants were held to have invaded a "proprietary interest" of Motschenbacher in his own identity.' I think you can invade someone's "proprietary interest" inadvertently just as you can take someone's property inadvertently; and courts can impose a corrective in both cases, in the first by ordering the invasion of the proprietary interest to be stopped and in the second by returning the taken property.
Wait, which one is the "off-brand" here?
> Bryce Dallas Howard & Jessica Chastain
I must confess I used to confuse these two!
Another one you didn't mention: until relatively recently, I thought actors Elias Koteas (Exotica) and Christopher Meloni (Law & Order: Special Victims Unit) were the same person!
I'm not familiar with all the case law, but I assume that no case has been brought that directly speaks to the issue; people can and do discuss cases that don't yet have specific precedent.
Just seems like this area isn't that exotic.
And the legal universe is vast with new precedent case law being made every year so I don't think the corpus of undecided law is confined to well known legal paradoxes.
As for this case, it doesn't seem that odd to me that intent has never been at issue: I would expect that typically the intent would be obvious (as it is in the OpenAI case), so no one has ever had to decide whether it mattered.
I don't see much merit in continuing our discussion. You take care now.