zlacker

[parent] [thread] 21 comments
1. LewisV+(OP)[view] [source] 2024-05-21 00:31:21
What's shitty about this?

They approached Johansson and she said no. They found another voice actor who sounds slightly similar and paid her instead.

The movie industry does this all the time.

Johansson is probably suing them so they're forced to remove the Sky voice while the lawsuit is happening.

Nothing here is shitty.

replies(4): >>0x6c6f+Y2 >>zarmin+s3 >>gregw2+K3 >>yumraj+de
2. 0x6c6f+Y2[view] [source] 2024-05-21 00:49:04
>>LewisV+(OP)
If they didn't use her voice at all, doesn't seem like there would be a case or even concern.

Also, they asked her for rights just 2 days before they demoed the Sky voice. If they truly hadn't used her voice in training at all, it would be quite a coincidence that they were still trying to get a sign-off from her.

replies(3): >>LewisV+f3 >>grutur+6M >>im3w1l+Gk1
3. LewisV+f3[view] [source] [discussion] 2024-05-21 00:50:57
>>0x6c6f+Y2
If they used her actual voice for training the model that shipped then I agree with you. It seems like they used the voice from another woman who sounds similar though.
replies(1): >>numpad+xs
4. zarmin+s3[view] [source] 2024-05-21 00:52:42
>>LewisV+(OP)
What I'm wondering is why are they doing that in the first place. Why is the best AI company in the world trying to stick a flirty voice into their product?
replies(2): >>ecjhdn+E5 >>Valodi+us
5. gregw2+K3[view] [source] 2024-05-21 00:54:08
>>LewisV+(OP)
Asking someone to license their voice, getting a refusal, asking again two days before launch, releasing the product without permission anyway, and then tweeting post-launch that the product should remind you of a character from a movie they didn't get rights to from the actress or the film company is all sketchy. And, if the voice is similar enough to the famous actress's, it is a violation of her personality rights under California law and in various other jurisdictions: https://en.m.wikipedia.org/wiki/Personality_rights

These rights should have their limits, but they serve a very real purpose: people should have some protection from others pretending to be them, or to sound like them, in porn, in ads for objectionable products or organizations, and so on, all without compensation.

replies(1): >>LewisV+C4
6. LewisV+C4[view] [source] [discussion] 2024-05-21 00:59:23
>>gregw2+K3
I will agree with you if

- they used Johansson's actual voice in training the text-to-speech model

or

- a court finds that they violated Johansson's likeness rights.

From hearing the demo videos, I don't think the voice sounded that similar to Johansson.

But hiring another actor to imitate someone who refused your offer is not illegal, and Hollywood does it all the time.

replies(1): >>Captai+sZ
7. ecjhdn+E5[view] [source] [discussion] 2024-05-21 01:08:35
>>zarmin+s3
It pains me to say it, but I really think it pays dividends to consider the very obvious possibility that the people who are doing this are in general just not socially well-adjusted.

Everything about OpenAI speaks of people who do not put great value on shared human connections, no?

Hey, I like that artist. I am going to train a computer to produce nearly identical work as if by them so I can have as many as I like, to meet my own wishes.

Why is it surprising that it didn't really cross their mind that a virtual girlfriend is not a good look?

This is not an organisation that has the feelings of people central to its mission. It's almost definitionally the opposite.

replies(1): >>zarmin+87
8. zarmin+87[view] [source] [discussion] 2024-05-21 01:21:52
>>ecjhdn+E5
Yes, it seems a lot of big names in tech have this same problem. Curious, that, isn't it?

I also think it is tipping their hand a bit. I know companies can do multiple things at once, but what might this flirty assistant focus suggest about how AGI is coming along?

replies(1): >>AutoDu+rr1
9. yumraj+de[view] [source] 2024-05-21 02:28:33
>>LewisV+(OP)
> The movie industry does this all the time.

Such as? Please give an example.

10. Valodi+us[view] [source] [discussion] 2024-05-21 04:53:34
>>zarmin+s3
...because human brains enjoy being talked to in a flirty voice, and they benefit from doing things that their customers like? Doesn't seem that mysterious
replies(1): >>zarmin+bu
11. numpad+xs[view] [source] [discussion] 2024-05-21 04:53:49
>>LewisV+f3
It doesn't just "seem like" in this instance; their claim is "no, that is not what we did, we commissioned someone else", without specifying who.

From a technical standpoint, a fine-tuned voice model can be built from just a few minutes of data and some GPU time on top of an existing voice model, much like how artist LoRAs are built for image models. So it is entirely possible that that is what happened.
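
Purely as an illustration of the LoRA-style adapter idea mentioned above (this says nothing about OpenAI's actual, undisclosed pipeline, and all sizes here are made-up), a minimal numpy sketch of why such fine-tunes are cheap: instead of retraining a large weight matrix W, you learn two small low-rank factors A and B and use W_eff = W + (alpha/r)·(B·A).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: a 512x512 weight matrix, adapter rank 8.
d_out, d_in, r, alpha = 512, 512, 8, 16

W = rng.standard_normal((d_out, d_in))     # frozen base-model weights
A = rng.standard_normal((r, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, r))                   # trainable; zero-init so W_eff == W at the start

def effective_weight(W, A, B, alpha, r):
    """Base weights plus the scaled low-rank update."""
    return W + (alpha / r) * (B @ A)

full_params = W.size              # what a full fine-tune would train
lora_params = A.size + B.size     # what the adapter actually trains
print(f"full fine-tune params: {full_params}, LoRA params: {lora_params}")
```

Only A and B are trained, a small fraction of the full matrix, which is why a few minutes of target data and modest GPU time can be enough to steer a large base model.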

12. zarmin+bu[view] [source] [discussion] 2024-05-21 05:11:49
>>Valodi+us
Guess you are their target market.
13. grutur+6M[view] [source] [discussion] 2024-05-21 08:20:05
>>0x6c6f+Y2
I guess it takes more than a couple of days to organize things with an A-list star, especially if there's a studio recording session involved rather than just using existing material.

This strongly suggests they weren't trying to get her voice until the last minute (would have been too late for the launch) but, rather, they had already used the other actress, and realized they were exposing themselves to a lawsuit due to how similar they were.

It was a CYA move, it failed, and now their ass is uncovered.

14. Captai+sZ[view] [source] [discussion] 2024-05-21 10:00:07
>>LewisV+C4
> But hiring another actor to replicate someone you refused your offer is not illegal and is done all the time by hollywood.

This could probably indeed let them "win" (or rather, not lose) in a legal battle.

But doing so will easily make them lose in the PR/public sense, as it's a shitty thing to do to another person, and hopefully not everyone is completely emotionless.

replies(1): >>LewisV+g42
15. im3w1l+Gk1[view] [source] [discussion] 2024-05-21 12:40:00
>>0x6c6f+Y2
Maybe despite not using her voice at all they wanted to give her some money as a gesture of good will and/or derisk the project.
replies(2): >>NekkoD+TG1 >>blitza+pH1
16. AutoDu+rr1[view] [source] [discussion] 2024-05-21 13:13:11
>>zarmin+87
Perhaps it is stumbling over the question of whether its creators are good people.
17. NekkoD+TG1[view] [source] [discussion] 2024-05-21 14:35:35
>>im3w1l+Gk1
Surely the company that has been gobbling up data and information without the rights to it or any form of compensation has suddenly turned over a new leaf and decided to try to pay an actress who isn't even involved.

Like, let's be real here. This wouldn't be the first time they used material without the rights to it, and I don't expect that to change any time soon without a major overhaul of EVERYTHING IN THE COMPANY, and even then it will probably only happen after lawsuits and fines.

18. blitza+pH1[view] [source] [discussion] 2024-05-21 14:37:06
>>im3w1l+Gk1
I would like to buy you a horse as a gesture of goodwill to derisk this flight attendant / passenger situation.
19. LewisV+g42[view] [source] [discussion] 2024-05-21 16:20:31
>>Captai+sZ
> But doing so will easily make them lose in the PR/public sense, as it's a shitty thing to do to another person, and hopefully not everyone is completely emotionless.

If an actor is saying no and you have a certain creative vision then what do you do?

Johansson doesn't own the idea of a "flirty female AI voice".

replies(1): >>Captai+1e2
20. Captai+1e2[view] [source] [discussion] 2024-05-21 17:08:52
>>LewisV+g42
Find someone else? You think this is a new problem? Directors/producers frequently have a specific person in mind for casting in movies, but if the person says no, they'll have to find someone else. The solution is not to create a fictional avatar that "borrows" the non-consenting person's visual appearance.
replies(1): >>timsch+7N3
21. timsch+7N3[view] [source] [discussion] 2024-05-22 03:44:31
>>Captai+1e2
> The solution is not to create a fictional avatar that "borrows" the non-consenting person's visual appearance.

That's exactly what was done when Jeffrey Weissman replaced Crispin Glover in Back to the Future Part II.

replies(1): >>Captai+6O8
22. Captai+6O8[view] [source] [discussion] 2024-05-23 16:00:25
>>timsch+7N3
First time I've heard of it, but reading about it, it seems that specific case actually changed the standard terms for actors to prevent similar issues?

> Rather than write George out of the film, Zemeckis used previously filmed footage of Glover from the first film as well as new footage of actor Jeffrey Weissman, who wore prosthetics including a false chin, nose, and cheekbones to resemble Glover. [...]

> Unhappy with this, Glover filed a lawsuit against the producers of the film on the grounds that they neither owned his likeness nor had permission to use it. As a result of the suit, there are now clauses in the Screen Actors Guild collective bargaining agreements stating that producers and actors are not allowed to use such methods to reproduce the likeness of other actors.

> Glover's legal action, while resolved outside of the courts, has been considered as a key case in personality rights for actors with increasing use of improved special effects and digital techniques, in which actors may have agreed to appear in one part of a production but have their likenesses be used in another without their agreement.

https://en.wikipedia.org/wiki/Back_to_the_Future_Part_II#Rep...
