If they didn't and just cloned her voice, it's more disregard for creators and artists than I would have thought possible. What were they thinking?
Edit after reading the official story: I'm not sure I believe it. It seems disingenuous; at best they chose someone because she sounded really, really like Scarlett Johansson, and no one said it might be a problem.
https://openai.com/index/how-the-voices-for-chatgpt-were-cho...
Her - https://www.youtube.com/watch?v=Ij0ZmgG6wCA
If it were a soundalike then it would obviously be a legal issue, nothing to discuss.
Looking at the HN comments, they're told it sounds like Her, so that's what they believe. You can't trust the NPCs to decide; they just regurgitate media headlines.
How do you quantify it?
This is the case for all mobile and web apps and has been the norm for over a decade now. If you want control over the UI or functionality, use local software that doesn't check some server for feature flags.
Even if Scarlett Johansson would agree to license her voice, which is a big if, it's not her voice that's being removed. It's a different actress who some users think sounds like Scarlett.
https://openai.com/index/how-the-voices-for-chatgpt-were-cho...
https://x.com/sama/status/1790075827666796666
> her
And this one:
https://x.com/prafdhar/status/1790789900650037441
> @alex_conneau: came up with the vision of HER before anyone at OpenAI had, and executed relentlessly!
So for me, this voice had no impact. Sure, I noticed it seemed a bit "flirty", but that's not a thing that engages me in any way, as it feels equally fake when a human does it; if anything, I pattern-matched to the Pierson's Puppeteers in Ringworld. The original Alexa advert was moderately creepy, but I could see they were trying to mimic the computer in Star Trek. One example I do have of being disturbed by a product advert was the use of a cheerful, up-beat soundtrack for "The Robot Dog With A Flamethrower | Thermonator": https://www.youtube.com/watch?v=rj9JSkSpRlM
The cases brought by Marvin Gaye's family [1] showed that some judges will declare copyright infringement even if the melody, harmony, and rhythm are different. Note that the author's statement that he had reverse-engineered the song in question probably had something to do with it, so in the end intent and artistic perception will always remain factors that no computer function can compute.
[1] https://en.m.wikipedia.org/wiki/Pharrell_Williams_v._Bridgep...
https://openai.com/index/how-the-voices-for-chatgpt-were-cho...
Anyway, let’s say he was negatively affected by that relationship, IIRC.
Reminds me a bit of this: https://www.uniladtech.com/news/ai/man-married-hologram-no-l... , up to you whether you find people developing strong feelings for inanimate objects that couldn't care less dystopian or not.
But I'd also argue that flirtatiousness is a very good match for the character of Samantha in "Her".
2024-05-20T07:16:23.679Z
OpenAI to Pull Johansson Soundalike Sky's Voice From ChatGPT
OpenAI is working to pause the use of the Sky voice from an audible version of ChatGPT after users said that it sounded too much like actress Scarlett Johansson.
The company said that the voice, one of five available on ChatGPT, was from an actress and was not chosen to be an "imitation" of Johansson, according to a blog post.^1 Johansson played a fictional virtual assistant in the film Her, about a man who falls in love with an AI system.
The voices are part of OpenAI's updated GPT-4o, which debuted earlier this month and can reply to verbal questions from users with an audio response.
1. https://openai.com/index/how-the-voices-for-chatgpt-were-cho...
Maybe I have missed something. Is this really the fulltext of the "article"??
https://daringfireball.net/linked/2024/05/20/openai-johansso...
These rights should have their limits, but they also serve a very real purpose: such people should have some protection from others pretending to be or sound like them in porn, in ads for objectionable products or organizations, and in all of the above without compensation.
> Rather than write George out of the film, Zemeckis used previously filmed footage of Glover from the first film as well as new footage of actor Jeffrey Weissman, who wore prosthetics including a false chin, nose, and cheekbones to resemble Glover. [...]
> Unhappy with this, Glover filed a lawsuit against the producers of the film on the grounds that they neither owned his likeness nor had permission to use it. As a result of the suit, there are now clauses in the Screen Actors Guild collective bargaining agreements stating that producers and actors are not allowed to use such methods to reproduce the likeness of other actors.
> Glover's legal action, while resolved outside of the courts, has been considered as a key case in personality rights for actors with increasing use of improved special effects and digital techniques, in which actors may have agreed to appear in one part of a production but have their likenesses be used in another without their agreement.
https://en.wikipedia.org/wiki/Back_to_the_Future_Part_II#Rep...