- the original statement by Scarlett said she was approached months ago, and then approached again two days prior to the launch of GPT-4o
Because of the above, my immediate assumption was that OpenAI definitely did her dirty. But this report from WaPo debunks at least some of it, because the records they have seen show that the voice actor was contacted months before OpenAI contacted Scarlett for the first time. (It also goes to show just how many months in advance OpenAI works on projects.)
However, this does not dispel the fact that OpenAI did contact Scarlett, Sam Altman did post the tweet saying "her", and the voice has at least "some" resemblance to Scarlett's voice, at least enough for one group of people to say it does and another to say it does not.
To me, it sounds like they had the idea to make their AI sound like "her". For the initial version, they had a voice actor who sounds like the voice from the movie, as a proof of concept.
They still liked it, so it was time to contact the real star. In the end, it's not just the voice, it would have been the brand; just imagine the buzz they would have got if Scarlett J was the official voice of the company. She said no, and they were like, "too bad, we already decided what she will sound like, the only difference is whether it will be labelled as SJ or not".
In the end, someone probably felt it was a bit too dodgy since the resemblance was uncanny, so they gave it another go, probably ready to offer more money. She still refused, but in the end it didn't change a thing.
I don't think it's less malicious if they decided to copy her voice without her consent, but just didn't tell her until the project was underway, then continued even after she said no.
There's legal precedent that hiring a copycat is not OK, so it's not like proving it was a copycat salvages their situation.
I wouldn't be surprised if the real reason they hired a copycat early is because they realized they'd need far more of Johansson's time than she'd be willing to provide, and the plan was typical SV "ask forgiveness not permission, but do it anyway regardless."
That doesn't matter because it's an impersonation. Ford lost, even though they didn't use Bette Midler's voice either: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
"We believe that AI voices should not deliberately mimic a celebrity's distinctive voice — Sky's voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using *her own natural speaking voice*"
I think this possibility doesn't receive enough attention, there is a class of people who've figured out that they can say the most scandalous things online and it's a net positive because it generates so much exposure. (As a lowly middle class employee you can't do this - you just get fired and go broke - but at a certain level of wealth and power you're immune from that.) It is the old PT Barnum principle, "They can say whatever they want about me as long as they spell my name right." Guys like Trump and Musk know exactly what they're doing. Why wouldn't Sam?
Johansson's complaint is starting to look a little shaky, especially if you remove that "her" tweet from the equation. I wouldn't put this past Altman at all: he knows exactly what happened and what didn't inside OpenAI, so maybe he knew she didn't have a case and decided to play Sociopathic 3D Chess with her (and beat her in one round).
In order to sue, there need to be damages, and if they didn't copy the voice then the rest doesn't matter, which Sam and team clearly knew, and they were quick to work the news. I agree that smart people take advantage of what they can get away with, but this controversy couldn't have turned out better for increasing brand awareness, good or bad (as you say, just like Trump and Musk know how to do).
What if it wasn’t a computer voice model but rather a real-life voice actress that you could pay a few cents to try to imitate Scarlett Johansson’s voice as best as she could?
That’s effectively what’s happening here, and it isn’t illegal.
I guess it also leads to the bigger question: do celebrities own their particular frequency range? Is no one allowed to publicly sound like them? Feels like the AACS DVD encryption key controversy all over again.
That was just a few days before launch, right? What was their plan if she said yes at that point? Continue using the "not-her" voice but say it was her? Or did they also have her voice already cloned by then and just needed to flip a switch?
Consider when a company recasts a voice actor in something: e.g. the VAs for Rick and Morty have been replaced, and Robin Williams was not the voice of the Genie in Aladdin 2 or the animated series.
One or the other. It doesn't really matter, as SJ herself would not necessarily have been able to verify that it wasn't her voice, as opposed to a quirk of how the tech works with her voice.
It is more complicated than that. Check out Midler v. Ford Motor Co. or Waits v. Frito-Lay.
People are allowed to sound like other people. But if you go to actor 1 and say "we want to use your voice for our product", and they say no, and then you go to actor 2 and tell them "I want you to sound like actor 1 for our product", and then you release a statement along the lines of "hey, you know that popular movie where actor 1 just used their voice in a context extremely reminiscent of our product?!? Well, listen to what we got:" (actor 2's voice presented)
Then you may run into legal problems.
https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
on edit: assuming the reports I am reading are accurate, that the actress used for the voice work claims she was not instructed to sound like the vocal work in Her, it sounds like a suit would probably not be successful.
Copying a famous actor’s voice without any kind of agreement at all is something else.
When discovery happens and there's a trail of messages suggesting either getting ScarJo or finding someone who sounds enough like her, this isn't going to look good alongside all the other events in the timeline.
If it goes to court, they’ll settle.
(1) They cast the current actor to test the technology and have a fallback. The actor sounds somewhat different from Johansson but the delivery of the lines is similar.
(2) They then ask Johansson because they want to be the company that brought “Her” to life. She declines.
(3) They try again shortly before the event because they really want it to happen.
(4) They proceed with the original voice, and the “her” tweet happens because they want to be the ones that made it real.
Asking shortly before the release is the weakest link here. It's possible they already had a version trained or fine-tuned on her voice that they could swap in at the last minute. That could explain some of the caginess. Not saying it's what happened or is even likely, but it feels like a reasonable possibility.

If you listen to the imitation version linked from that Wikipedia article and the original 1958 recording, you'll hear that they didn't only find a singer who sounded like her, they copied the music and cadence from Bette's version.
I think that's way past whatever OpenAI did in this case. It would be analogous if they were publishing something that only regurgitated lines Scarlett Johansson is famous for having said in her movies.
But they're not doing that, they just found a person who sounds like Scarlett Johansson.
This would only be analogous to the Ford case if the cover artist in that case was forbidden from releasing any music, including original works, because her singing voice could be confused with Bette Midler's.
Now, would they have done this if Scarlett Johansson wasn't famous? No, but we also wouldn't have had a hundred grunge bands with singers playing up their resemblance to Kurt Cobain if Nirvana had never existed.
So wherever this case lands (likely in a boring private settlement) it's clearly in more of a gray area than the Ford case.
She's allowed to be a voice actor using her real voice.
You can point to the "Her" tweet, but it's a pretty flimsy argument.
> That’s effectively what’s happening here, and it isn’t illegal.
Profiting from someone else's likeness is illegal.
And besides, it sounds more like Rashida Jones anyway. It's clearly not an impersonation.
Nothing in this article changes the essence of her complaint.
The only real, though partial, rebuttal to her is that OpenAI copied a work product she did for a movie, and the movie was more than her voice, so it's not totally her own work. So maybe the movie team as a whole has a stronger complaint than the voice actor alone.
She didn't lose any game of wits. She just got done dirty by someone who got away with it. She doesn't need money from them. She has respect from the people who matter. Sam and OpenAI behaved badly, like big tech always does. If OpenAI permanently stops using the Johansson-like Sky voice, she'll have won what she wanted.
Of course, anyone whose voice an AI ends up sounding like has the unpleasantness of that experience, and a rich person is more able to endure it than a regular Johansson.
I'm not a lawyer, but this seems unfair to the voice actor they did use, and paid, who happens to sound like ScarJo (or vice versa!)
So if I sound like a famous person, then I can't monetize my own voice? Who's to say it isn't the other way around, perhaps it is ScarJo who sounds like me and I'm owed money?
e.g.
Vanna White vs Samsung - https://w.wiki/AAUR
Crispin Glover Back to the Future 2 lawsuit - https://w.wiki/AAUT#Back_to_the_Future_Part_II_lawsuit
It's not necessarily what will prove true at the end of the day but I think we owe people the presumption of innocence.
Cases are not a spell you can cast to win arguments, especially when the facts are substantially different.
Though admittedly, so does Johansson in "Her". I don't think the voices are very similar but the style is.
Just because one is a singer and the other is an actor isn’t the big difference you think it is. Actors do voice over work all the time. Actors in fact get cast for their voice all the time.
Yelling "Parody!" isn't some get-out-of-jail-free card, particularly where there is actual case law, even more particularly when there are actual laws to address this very act.
The problem here is that someone inside OpenAI wanted to create marketing buzz around a new product launch and capitalize on a movie. In order to do that they wanted a voice that sounded like that movie. They hired a voice actor who sounded enough like ScarJo to hedge against not actually getting the actress to do it. When she declined, they decided to implement their contingency plan.
Whether they're liable is for a jury to decide, but the case precedent that I've seen, along with the intent, wouldn't look good if I were on that jury.
That's not what she said happened. She said they released it anyway before she and Sam could connect, after Sam had reached out, for the second time, two days prior to the release.
But when it comes to specific questions that hinge on evidence, I think you have to maintain the typical presumption of innocence, just to balance out the possibilities of mob psychology getting out of control.
As I understand it, that's essentially OpenAI's defense here.
No, you do not owe that presumption to "corporations, especially those with a tendency, incentive, and history of being ruthless in this way."
Wise up, people.
> Who's to say it isn't the other way around, perhaps it is ScarJo who sounds like me and I'm owed money?
It seems like you don't get the fundamental principle underlying "right of publicity" laws if you are asking this question.
A more similar context would be: they ask Tom Hanks to create a voice similar to Woody, the cowboy from Toy Story. Tom Hanks says no, Disney says no. Then they ask you to voice their cowboy. It's obviously related: they tried the OG, failed, and went for a copycat after.
But if they had never approached Tom Hanks or Disney, then there would be room for deniability; without mentions of real names, it would require someone to judge whether it's an unauthorized copycat or just a random actor voicing a random cowboy.
It was a bad play on their part.
Seems she is prevented from doing work: if companies can get sued for hiring/using voice actors who sound like ScarJo, then any voice actor who sounds like ScarJo has effectively been de-platformed. Similarly, imagine I look very much like George Clooney -- if George Clooney can sue magazines for featuring my handsome photos, then I lose all ability to model for pay. (Strictly hypothetical, I am a developer, not a fashion model.)
>> It seems like you don't get the fundamental principle underlying "right of publicity" laws if you are asking this question.
Totally, I have no idea of the laws here, but very curious to understand what OpenAI did wrong here.
If her customers can get sued for using her voice, then this voice actor can never get another job and can never get paid again -- all because she happens to sound like ScarJo. That seems unfair to the voice actor.
>Totally, I have no idea of the laws here, but very curious to understand what OpenAI did wrong here.
It is illegal to profit off the likeness of others. If it wasn't, what's to stop any company from hiring any impersonator to promote that company as the person they are impersonating?
I need 1.5x speeds even if I have to use a worse voice. I am a TTS power user, listening to all online text this way since the 2010s. Maybe GPT-4o has a more flexible voice; perhaps you can just ask it to speak faster.
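For the "ask it to speak faster" part, I don't know whether the GPT-4o voice mode honors that, but the plain text-to-speech endpoint does expose a speed setting. A minimal sketch, assuming the openai Python SDK and its tts-1 model; the voice, input text, and file name are just illustrative:

    # Sketch only: standard OpenAI TTS endpoint with its documented `speed`
    # parameter, not the GPT-4o realtime voice mode.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.audio.speech.create(
        model="tts-1",
        voice="alloy",
        input="Long article text I want to listen to.",
        speed=1.5,  # 1.5x speech rate; the API accepts roughly 0.25 to 4.0
    )

    # Save the audio and play it with any MP3 player
    with open("article.mp3", "wb") as f:
        f.write(response.content)

That only covers pre-generated audio, of course; the live conversational voice is presumably a different pipeline.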
It's like: the people dropping leaflets in your physical mailbox are delivering spam, but you wouldn't automatically assume those same people are also trying to scam you and your neighbors by delivering physical letters meant to trick people into parting with their savings. In both cases the messages are spam, but one is legal, the other is not, and there's a huge gap between them.
Are they? Where did they advertise this? The voice doesn't even sound that much like ScarJo!
> Just because one is a singer and the other is an actor isn’t the big difference you think it is. Actors do voice over work all the time. Actors in fact get cast for their voice all the time.
It's a very big difference when the jurisprudence here rests on how substantial the voice is as a proportion of the brand, especially in the presence of the other disanalogies.
> Yelling "Parody!" isn't some get-out-of-jail-free card, particularly where there is actual case law, even more particularly when there are actual laws to address this very act.
Sure -- If you read that back, I'm clearly not doing that. An impression in a parody in the artist's unique style (Waits) was a case where it was a violation of publicity rights. This is radically different from that. It's not clear that Midler and Waits have much bearing on this case at all.
So in your opinion, if a movie needs a tall, skinny redhead, and they approach someone who has those qualities and the role is turned down, then it would be illegal to cast any other tall, skinny redhead?
That sounds absurd to me. If you have a role, obviously the role has qualities and requirements.
And even if person 1, who happens to have those qualities, turns you down, it is still valid to cast a different person who fulfils your original requirements.
If you had a voice like Scarlett, and you were hired to create the voice of an AI assistant, there's no legal problem - as long as the voice isn't marketed using references to Scarlett.
However, in this case, the voice is similar to Scarlett's, AND they referenced a popular movie where Scarlett voiced an AI assistant, and named the assistant in a way that is evocative of Scarlett's name, and reached out to Scarlett wanting to use her voice. It is those factors that make it legally questionable, as it appears that they knowingly capitalized on the voice's similarity to Scarlett's without her permission.
It is about intent, and how the voice is marketed. Voice sounds like a famous person = fine, voice sounds like a famous person and the voice is marketed as being similar to the famous person's = not fine.
It is not a clear-cut 'this is definitely illegal' in this case, it is a grey area that a court would have to decide on.
I'm not making arguments which are not already explicitly written in my post.
My argument is simple: jorvi commented that you can hire "a real-life voice actress" to "try to imitate Scarlett Johansson’s voice as best as she could", and that is not illegal.
I said that the legality of that is more complicated. What jorvi describes might or might not be illegal based on various factors. And I pointed them towards the two references to support my argument.
I explicitly didn't say in that comment anything about the OpenAI/ScarJo case. You are reacting as if you think that I have some opinion about it. You are wrong, and it would be better if you would not try to guess my state of mind. If I have some opinion about something you will know because I will explicitly state it.
But they are subject to right of publicity in many US jurisdictions.
Which, while more like trademark than copyright (the other thing that keeps getting raised as if it should dispose of this issue), is its own area of law, distinct from either trademark or copyright.
> The ONLY way they get in trouble is if they claim to be Morgan Freeman.
That’s…not true. Though such an explicit claim would definitely be a way that they could get in trouble.
Those of us accusing and talking about it have no power -- thus there is literally no harm, and possibly some good, in putting them on the defensive about this.
edit: In fact, the First Amendment of the Constitution essentially directly upholds the idea of "people saying whatever they want" in this regard.
People don't need to be careful just talking; in fact we generally support the idea of "people saying whatever" in the form of the First Amendment.
Same way you can't get someone who sounds like a famous celebrity to do a voice in a commercial and just let people think it's the famous celebrity when it's not.
https://www.inta.org/topics/right-of-publicity/#:~:text=In%2....
As people tend to look up to celebrities and admire them, they start associating these voices with good things, and I think this is why they adopted such styles for chatbots.
> Is it a crime for voice actors to sound similar to, say, James Earl Jones?
And the answer is, of course: it depends. For one thing, it depends on whether the company using the sound-alike's voice is in a business closely related to the theme of Star Wars, and whether they market whatever it is they're marketing by referring to Jones' iconic performance as Vader. ("<PANT> ... <PANT>") If they do that, then yes, it most likely is.