zlacker

[return to "OpenAI pulls Johansson soundalike Sky’s voice from ChatGPT"]
1. projec+DZ1[view] [source] 2024-05-20 23:30:20
>>helsin+(OP)
It is incredibly disheartening to watch what was born as a non-profit dedicated to guiding AI towards beneficial (or at the very least neutral) ends predictably fall into the well-worn SV groove of progressively shittier behavior in pursuit of additional billions. What is victory for OpenAI even supposed to look like by now?
◧◩
2. LewisV+s92[view] [source] 2024-05-21 00:31:21
>>projec+DZ1
What's shitty about this?

They approached Johansson and she said no. They found another voice actor who sounds slightly similar and paid her instead.

The movie industry does this all the time.

Johansson is probably suing them so they're forced to remove the Sky voice while the lawsuit is happening.

Nothing here is shitty.

◧◩◪
3. gregw2+cd2[view] [source] 2024-05-21 00:54:08
>>LewisV+s92
Asking someone to license their voice, getting a refusal, asking them again two days before launch, releasing the product anyway without permission, and then tweeting post-launch that the product should remind you of a character in a movie whose rights they never obtained from the actress or the film company is all sketchy. And if the voice is similar enough to the famous actress's, it is a violation of her personality rights under California law and in various other jurisdictions: https://en.m.wikipedia.org/wiki/Personality_rights

These rights should have their limits, but they also serve a very real purpose: such people should have some protection from others pretending to be or sound like them in porn, in ads for objectionable products or organizations, and so on, all without compensation.

◧◩◪◨
4. LewisV+4e2[view] [source] 2024-05-21 00:59:23
>>gregw2+cd2
I will agree with you if

- they used Johansson's actual voice to train the text-to-speech model

or

- a court finds that they violated Johansson's likeness rights.

From hearing the demo videos, I don't think the voice sounded that similar to Johansson.

But hiring another actor to replicate someone who refused your offer is not illegal, and Hollywood does it all the time.

◧◩◪◨⬒
5. Captai+U83[view] [source] 2024-05-21 10:00:07
>>LewisV+4e2
> But hiring another actor to replicate someone who refused your offer is not illegal, and Hollywood does it all the time.

This could indeed let them "win" (or rather, not lose) in a legal battle in the courts.

But doing so will easily make them lose in the PR/public sense, as it's a shitty thing to do to another person, and hopefully not everyone is completely emotionless.

◧◩◪◨⬒⬓
6. LewisV+Id4[view] [source] 2024-05-21 16:20:31
>>Captai+U83
> But doing so will easily make them lose in the PR/public sense, as it's a shitty thing to do to another person, and hopefully not everyone is completely emotionless.

If an actor says no and you have a certain creative vision, then what do you do?

Johansson doesn't own the idea of a "flirty female AI voice".

◧◩◪◨⬒⬓⬔
7. Captai+tn4[view] [source] 2024-05-21 17:08:52
>>LewisV+Id4
Find someone else? You think this is a new problem? Directors/producers frequently have a specific person in mind for casting in movies, but if the person says no, they'll have to find someone else. The solution is not to create a fictional avatar that "borrows" the non-consenting person's visual appearance.
◧◩◪◨⬒⬓⬔⧯
8. timsch+zW5[view] [source] 2024-05-22 03:44:31
>>Captai+tn4
> The solution is not to create a fictional avatar that "borrows" the non-consenting person's visual appearance.

That's exactly what was done when Jeffrey Weissman replaced Crispin Glover in Back to the Future Part II.

[go to top]