zlacker

OpenAI pulls Johansson soundalike Sky’s voice from ChatGPT

submitted by helsin+(OP) on 2024-05-20 11:13:24 | 143 points 190 comments
[view article] [source] [go to bottom]

NOTE: showing posts with links only
1. helsin+4[view] [source] 2024-05-20 11:13:36
>>helsin+(OP)
https://archive.ph/EXl8X
3. RockyM+Z6[view] [source] 2024-05-20 12:22:30
>>helsin+(OP)
When I heard it, I thought, wow, they licensed Scarlett Johansson, what an amazing Easter egg.

If they didn't and just cloned her voice, it's more disregard for creators and artists than I would have thought possible. What were they thinking?

Edit after reading the official story... not sure I believe it; it seems disingenuous. At best they chose someone because she really, really sounded like Scarlett Johansson, and no one said it might be a problem.

https://openai.com/index/how-the-voices-for-chatgpt-were-cho...

◧◩
5. verdve+o7[view] [source] [discussion] 2024-05-20 12:26:10
>>chrisj+m5
https://youtube.com/shorts/eVMNvm67Y-A (daily show skit)
12. aaron6+l8[view] [source] 2024-05-20 12:32:54
>>helsin+(OP)
Doesn't sound anything like Scarlett Johansson - https://x.com/AngelinaPTech/status/1741450012314280431/video...

Her - https://www.youtube.com/watch?v=Ij0ZmgG6wCA

If it were a soundalike then it would obviously be a legal issue, nothing to discuss.

Looking at the HN comments, they are told it sounds like Her, so that's what they believe. So you can't trust the NPCs to decide; they just regurgitate media headlines.

How do you quantify it?
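
Purely to illustrate what "quantifying it" could even look like (a minimal sketch of my own; librosa, the clip filenames, and the mean-MFCC "fingerprint" are assumptions, not anything OpenAI or a court actually uses):

    # Crude voice-similarity sketch: summarize each clip as its mean MFCC
    # vector, then compare the two summaries with cosine similarity.
    import librosa
    import numpy as np

    def voice_summary(path: str) -> np.ndarray:
        # Load the clip at its native sample rate and average the MFCC
        # frames into a single 20-dimensional vector.
        y, sr = librosa.load(path, sr=None)
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
        return mfcc.mean(axis=1)

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Hypothetical local clips of the two voices being compared.
    sky = voice_summary("sky_demo_clip.wav")
    samantha = voice_summary("her_trailer_clip.wav")
    print(f"cosine similarity: {cosine_similarity(sky, samantha):.3f}")

Real speaker-verification systems use learned speaker embeddings rather than averaged MFCCs, and even a high score wouldn't settle whether a voice counts as an "imitation" in the legal sense.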

◧◩
13. sorenj+q8[view] [source] [discussion] 2024-05-20 12:33:39
>>luke-s+t7
> really gives end users a lack of control of the interface

This is the case for all mobile and web apps and has been the norm for over a decade now. If you want control over the UI or functionality, use local software that doesn't check some server for feature flags.

Even if Scarlett Johansson would agree to license her voice, which is a big if, it's not her voice that's being removed. It's a different actress who some users think sounds like Scarlett.

https://openai.com/index/how-the-voices-for-chatgpt-were-cho...

◧◩
33. ycombi+4b[view] [source] [discussion] 2024-05-20 12:52:35
>>embik+v8
Don't Create the Torment Nexus [0]

[0] https://knowyourmeme.com/memes/torment-nexus

45. qgin+9d[view] [source] 2024-05-20 13:09:57
>>helsin+(OP)
It’s hard to say it wasn’t intentional after this tweet:

https://x.com/sama/status/1790075827666796666

> her

And this one:

https://x.com/prafdhar/status/1790789900650037441

> @alex_conneau: came up with the vision of HER before anyone at OpenAI had, and executed relentlessly!

◧◩◪◨⬒
46. exitb+hd[view] [source] [discussion] 2024-05-20 13:11:26
>>TheCra+sb
Sam Altman and other OpenAI employees have been referencing the movie ahead of the presentation[1][2]. I think it'd be really challenging to prove that they didn't have that exact outcome in mind.

[1] https://x.com/sama/status/1790075827666796666

[2] https://x.com/alex_conneau

47. brianj+Cd[view] [source] 2024-05-20 13:13:05
>>helsin+(OP)
Archive: https://archive.is/EXl8X
◧◩
56. ben_w+pg[view] [source] [discussion] 2024-05-20 13:31:21
>>embik+v8
I'm not sure, but my guess is everyone's got a slightly different "uncanny valley" function.

So for me, this voice had no impact — sure I noticed it seemed a bit "flirty", but that's not a thing that engages me in any way as it feels equally fake when a human does it, and if anything I pattern-matched to the Pierson's Puppeteers in Ringworld; the original Alexa advert was moderately creepy, but I could see they were trying to mimic the computer in Star Trek; but one example I do have of being disturbed by a product advert was the use of a cheerful up-beat soundtrack for "The Robot Dog With A Flamethrower | Thermonator": https://www.youtube.com/watch?v=rj9JSkSpRlM

◧◩◪◨
70. probab+hn[view] [source] [discussion] 2024-05-20 14:15:21
>>Frustr+ec
IMHO a court would accept a waveform comparison that proved beyond all doubt that the two voices are similar, but I doubt you'd find a court that would settle the issue just because the comparison said no.

The cases brought forth by Marvin Gaye's family [1] showed that some judges will declare copyright infringement even if the melody, harmony and rhythm are different. Note that the author saying he had reverse-engineered the original song probably had something to do with it; in the end, intent and artistic perception will always remain factors that no computer function can compute.

[1] https://en.m.wikipedia.org/wiki/Pharrell_Williams_v._Bridgep...

◧◩◪
71. dbsmit+Ao[view] [source] [discussion] 2024-05-20 14:22:50
>>browni+mm
Possibly, but that is not what their statement suggests to me, unless those 'questions' came from her or her legal team. I suppose they could also be worried about future legal action from her. Maybe it just isn't worth getting sued, even if you technically did nothing wrong. But I also have a hard time believing they went through this careful process of voice selection and yet it never occurred to them that the voice sounded like her or that there might be legal repercussions.

https://openai.com/index/how-the-voices-for-chatgpt-were-cho...

74. ChrisA+nr[view] [source] 2024-05-20 14:35:52
>>helsin+(OP)
[dupe]

More discussion on official post:

>>40412903

◧◩◪◨⬒
75. aranel+5s[view] [source] [discussion] 2024-05-20 14:38:56
>>lucubr+Bb
I agree with you on a personal level, though again I'm sure if I Ctrl+F the Replika subreddit I'd find many people describing their emotions in similar words.

Anyway, IIRC he was negatively affected by that relationship.

Reminds me a bit of this: https://www.uniladtech.com/news/ai/man-married-hologram-no-l... ; it's up to you whether you find people developing strong feelings for inanimate objects that couldn't care less dystopian or not.

◧◩
91. qarl+Bp1[view] [source] [discussion] 2024-05-20 20:06:50
>>hacker+cU
I agree that the Sky voice is flirtatious.

But I'd also argue that flirtatiousness is a very good match for the character of Samantha in "Her":

https://www.youtube.com/watch?v=GV01B5kVsC0&t=84s

◧◩
100. fragme+D32[view] [source] [discussion] 2024-05-20 23:53:30
>>honest+or1
Apple shipped that as an accessibility feature in 2023

https://support.apple.com/en-us/104993

102. 1vuio0+J52[view] [source] 2024-05-21 00:07:23
>>helsin+(OP)
Works where archive.ph is blocked:

2024-05-20T07:16:23.679Z

OpenAI to Pull Johansson Soundalike Sky's Voice From ChatGPT

OpenAI is working to pause the use of the Sky voice from an audible version of ChatGPT after users said that it sounded too much like actress Scarlett Johansson.

The company said that the voice, one of five available on ChatGPT, was from an actress and was not chosen to be an "imitation" of Johansson, according to a blog post.^1 Johansson played a fictional virtual assistant in the film Her, about a man who falls in love with an AI system.

The voices are part of OpenAI's updated GPT-4o, which debuted earlier this month and can reply to verbal questions from users with an audio response.

1. https://openai.com/index/how-the-voices-for-chatgpt-were-cho...

Maybe I have missed something. Is this really the full text of the "article"??

◧◩◪◨
115. browni+Nc2[view] [source] [discussion] 2024-05-21 00:51:28
>>dbsmit+Ao
How about this statement from Scarlett Johansson herself?

https://daringfireball.net/linked/2024/05/20/openai-johansso...

◧◩◪
118. gregw2+cd2[view] [source] [discussion] 2024-05-21 00:54:08
>>LewisV+s92
Asking someone to license her voice, getting a refusal, asking her again two days before launch, releasing the product anyway without permission, and then tweeting post-launch that the product should remind you of a character in a movie whose rights they never got from the actress or the film company is all sketchy, and, if the voice is similar enough to the famous actress's, it is a violation of her personality rights under California law and in various other jurisdictions: https://en.m.wikipedia.org/wiki/Personality_rights

These rights should have their limits, but they also serve a very real purpose: such people should have some protection from others pretending to be or sound like them in porn, in ads for objectionable products or organizations, and so on, all of it without compensation.

◧◩◪◨⬒⬓⬔⧯▣
190. Captai+yXa[view] [source] [discussion] 2024-05-23 16:00:25
>>timsch+zW5
First time I've heard about it, but reading up on it, it seems that specific case actually changed the standard contract terms for actors to prevent similar issues?

> Rather than write George out of the film, Zemeckis used previously filmed footage of Glover from the first film as well as new footage of actor Jeffrey Weissman, who wore prosthetics including a false chin, nose, and cheekbones to resemble Glover. [...]

> Unhappy with this, Glover filed a lawsuit against the producers of the film on the grounds that they neither owned his likeness nor had permission to use it. As a result of the suit, there are now clauses in the Screen Actors Guild collective bargaining agreements stating that producers and actors are not allowed to use such methods to reproduce the likeness of other actors.

> Glover's legal action, while resolved outside of the courts, has been considered as a key case in personality rights for actors with increasing use of improved special effects and digital techniques, in which actors may have agreed to appear in one part of a production but have their likenesses be used in another without their agreement.

https://en.wikipedia.org/wiki/Back_to_the_Future_Part_II#Rep...

[go to top]