zlacker

[parent] [thread] 33 comments
1. andrew+(OP)[view] [source] 2024-05-20 23:51:59
I thought it sounded like Jodie Foster.
replies(1): >>ncr100+H2
2. ncr100+H2[view] [source] 2024-05-21 00:09:19
>>andrew+(OP)
Scar Jo thought it sounded like herself, and so did people who knew her personally.

That is what matters. OWNERSHIP over her contributions to the world.

replies(5): >>mcphag+E4 >>smt88+ib >>minima+yb >>dyno12+ii >>parine+fl
◧◩
3. mcphag+E4[view] [source] [discussion] 2024-05-21 00:21:39
>>ncr100+H2
Clearly Sam Altman thought it sounded like ScarJo as well :-(
◧◩
4. smt88+ib[view] [source] [discussion] 2024-05-21 01:03:32
>>ncr100+H2
I mostly agree with you, but I actually don't think it matters if it sounded exactly like her or not. The crime is in the training: did they use her voice or not?

If someone licenses an impersonator's voice and it gets very close to the real thing, that feels like an impossible situation for a court to settle and it should probably just be legal (if repugnant).

replies(7): >>toomuc+Ac >>sangno+Yd >>randog+Df >>aseipp+Tf >>dv_dt+Hg >>XorNot+Xs >>jonath+Av
◧◩
5. minima+yb[view] [source] [discussion] 2024-05-21 01:06:39
>>ncr100+H2
More notably for legal purposes, there were several independent news reports corroborating the vocal similarity.
replies(1): >>sangno+Be
◧◩◪
6. toomuc+Ac[view] [source] [discussion] 2024-05-21 01:14:16
>>smt88+ib
https://en.wikipedia.org/wiki/Personality_rights
◧◩◪
7. sangno+Yd[view] [source] [discussion] 2024-05-21 01:28:19
>>smt88+ib
> The crime is in the training: did they use her voice or not?

This is a civil issue, and actors get broad rights to their likeness. Kim Kardashian sued Old Navy for using a look-alike actress in an ad; Old Navy chose to settle, which suggests that "the real actress wasn't involved in any way" may not be a perfect defense. The timeline makes it clear they wanted it to sound like Scarlett's voice; the actual mechanics of how they got the AI to sound like that are only part of the story.

◧◩◪
8. sangno+Be[view] [source] [discussion] 2024-05-21 01:32:56
>>minima+yb
...and sama's tweet referencing "Her"
◧◩◪
9. randog+Df[view] [source] [discussion] 2024-05-21 01:42:05
>>smt88+ib
> If someone licenses an impersonator's voice and it gets very close to the real thing, that feels like an impossible situation for a court to settle and it should probably just be legal (if repugnant).

Does that mean that if cosplayers dress up as some other character, a company can use that version of the character in their games/media? I think it should be equally simple to settle. It's different if it's the impersonator's natural voice. Even then, it raises the question of whether "doppelgangers" can be used legally.

◧◩◪
10. aseipp+Tf[view] [source] [discussion] 2024-05-21 01:44:15
>>smt88+ib
It is not an impossible situation; courts have settled it, and what you describe is not how the law works (despite what many computer engineers think to the contrary).
replies(1): >>smt88+qI
◧◩◪
11. dv_dt+Hg[view] [source] [discussion] 2024-05-21 01:50:25
>>smt88+ib
As I understand it (though I may be wrong), in music sampling cases it doesn’t matter whether the “sample” uses an actual clip from a recording or was recreated from scratch in a new medium (e.g. a direct MIDI sequence): if a song sampling another song is recognizable, it is still infringing.
replies(1): >>parine+Hl
◧◩
12. dyno12+ii[view] [source] [discussion] 2024-05-21 02:05:27
>>ncr100+H2
I'm not sure how much you currently legally own imitations of your own voice. There's a whole market for voice actors who can imitate particular famous voices.
replies(2): >>adolph+Ri >>stonog+xU2
◧◩◪
13. adolph+Ri[view] [source] [discussion] 2024-05-21 02:13:28
>>dyno12+ii
Should have renamed it

https://en.wikipedia.org/wiki/Sosumi

Or

https://www.reddit.com/r/todayilearned/comments/9n44b6/til_t...

◧◩
14. parine+fl[view] [source] [discussion] 2024-05-21 02:39:43
>>ncr100+H2
She doesn't own most (all probably) of her contributions to the world.

If the voice was only trained on the voice of the character she played in Her, would she have any standing in claiming some kind of infringement?

replies(1): >>lesost+Zx1
◧◩◪◨
15. parine+Hl[view] [source] [discussion] 2024-05-21 02:44:19
>>dv_dt+Hg
Sampling is not the same as duplication. Sampling is allowed as a derivative work, as long as it's substantially different from the original.

It's an "I know it when I see it" situation, so it's not clear cut.

replies(1): >>Findec+xO
◧◩◪
16. XorNot+Xs[view] [source] [discussion] 2024-05-21 03:55:14
>>smt88+ib
If OpenAI commissioned a voice actor to lend their voice to the Sky model, and cast on the basis of trying to find someone who sounds similar to Scarlett Johansson, but then did not advertise or otherwise use the resulting voice model while claiming it was Scarlett Johansson - then they're completely in the clear.

Because then the actual case would be fairly bizarre: an entirely separate person, selling the rights to their own likeness as they are entitled to do, is being prohibited from doing that by the courts because they sound too much like an already famous person.

EDIT: Also, up front, I'm not sure you can read much into the timelines given how fast this technology changes. We have voice cloning systems that can work with as little as 15 seconds of audio. So having a demo reel of what they wanted, ready on a few days' notice, isn't unrealistic - and training a model without using or releasing it also isn't illegal.

replies(2): >>cycoma+8z >>jeroje+2C
◧◩◪
17. jonath+Av[view] [source] [discussion] 2024-05-21 04:23:40
>>smt88+ib
This has been settled law for 34 years. See Tom Waits v Frito-Lay.

They literally hired an impersonator, and it cost them 2.5 million (~6 million today).

https://www.latimes.com/archives/la-xpm-1990-05-09-me-238-st...

replies(1): >>smt88+bJ
◧◩◪◨
18. cycoma+8z[view] [source] [discussion] 2024-05-21 04:59:11
>>XorNot+Xs
That's confidently incorrect. Many others have already posted that this has been settled case law for many years. I mean, would you argue that someone who built a MacBook lookalike, without using the same components, would be completely in the clear?
replies(1): >>XorNot+Oz
◧◩◪◨⬒
19. XorNot+Oz[view] [source] [discussion] 2024-05-21 05:06:40
>>cycoma+8z
Then I ask you: what do you call the Framework [1]? Or Dell's offerings [2]? Compared to the MacBook [3]?

They look kind of similar, right? Lots of familiar styling cues? What would take it from "similar" to actual infringement? Well, if you slapped an Apple logo on there, that would do it. Did OpenAI make an actual claim? Did they actually use Scarlett Johansson's public image and voice as samples for the system?

[1] https://images.prismic.io/frameworkmarketplace/25c9a15f-4374...

[2] https://i.dell.com/is/image/DellContent/content/dam/ss2/prod...

[3] https://cdn.arstechnica.net/wp-content/uploads/2023/06/IMG_1...

replies(2): >>mrbung+IE >>einher+9N
◧◩◪◨
20. jeroje+2C[view] [source] [discussion] 2024-05-21 05:28:00
>>XorNot+Xs
Well, Sam Altman tweeted "her", so it does seem to me like they're trying to claim a similarity to Scarlett Johansson.
◧◩◪◨⬒⬓
21. mrbung+IE[view] [source] [discussion] 2024-05-21 05:58:33
>>XorNot+Oz
You're not arguing your way out of jurisprudence, especially when the subject is a human and not a device nor IP. They (OpenAI) fucked up.
replies(1): >>XorNot+zI
◧◩◪◨
22. smt88+qI[view] [source] [discussion] 2024-05-21 06:39:09
>>aseipp+Tf
Courts have settled almost nothing related to AI. We don't even know if training AI on copyrighted works is a violation of copyright law.

Please point to a case where someone was successfully sued for sounding too much like a celebrity (while not using the celebrity's name or claiming to be them).

replies(2): >>davidg+lN >>ascorb+GP
◧◩◪◨⬒⬓⬔
23. XorNot+zI[view] [source] [discussion] 2024-05-21 06:40:37
>>mrbung+IE
There is not clear jurisprudence on this. They're only in trouble if they actually used ScarJo's voice samples to train the model, or if they intentionally tried to portray their imitation as her without her permission.

The biggest problem on that front (assuming the former is not true) is Altman's tweets, but court-wise that's defensible as a reference to the general concept of the movie (though, retracting what I wrote here previously - probably not easily defensible).

Because otherwise the situation you have is OpenAI seeking a particular style, hiring someone who can provide it, not trying to pass it off as that person (give or take the tweets), and the intended result effectively being: "random voice actress, you sound too much like an already rich and famous person. Good luck having any more work in your profession" - which would be the actual outcome.

The question entirely hinges on, did they include any data at all which includes ScarJo's voice samples in the training. And also whether it actually does sound similar enough - Frito-Lay went down because of intent and similarity. There's the hilarious outcome here that the act of trying to contact ScarJo is the actual problem they had.

EDIT 2: Of note also - to have a case, they actually have to show reputational harm. Of course on that front, the entire problem might also be Altman. Continuing the trend I suppose of billionaires not shutting up on Twitter being the main source of their legal issues.

replies(2): >>einher+gN >>zdp7+cH5
◧◩◪◨
24. smt88+bJ[view] [source] [discussion] 2024-05-21 06:46:11
>>jonath+Av
That case seems completely dissimilar to what OpenAI did.

Frito-Lay copied a song by Waits (with different lyrics) and had an impersonator sing it. Witnesses testified they thought Waits had sung the song.

If OpenAI were to anonymously copy someone's voice by training AI on an imitation, you wouldn't have:

- a recognizable singing voice

- music identified with a singer

- market confusion about whose voice it is (since it's novel audio coming from a machine)

I don't think any of this is ethical and think voice-cloning should be entirely illegal, but I also don't think we have good precedents for most AI issues.

replies(1): >>jonath+kW
◧◩◪◨⬒⬓
25. einher+9N[view] [source] [discussion] 2024-05-21 07:29:37
>>XorNot+Oz
Grey laptops that share some ideas in their outline while being distinct enough to not get lawyers from Cupertino on their necks?
◧◩◪◨⬒⬓⬔⧯
26. einher+gN[view] [source] [discussion] 2024-05-21 07:30:57
>>XorNot+zI
Are you a lawyer?
◧◩◪◨⬒
27. davidg+lN[view] [source] [discussion] 2024-05-21 07:31:28
>>smt88+qI
Multiple cases answering your question have already been posted in this thread.
◧◩◪◨⬒
28. Findec+xO[view] [source] [discussion] 2024-05-21 07:47:22
>>parine+Hl
Oh, the days when an artist could sample other artists without attribution and royalties are long gone. The music labels come down very hard on this now.
◧◩◪◨⬒
29. ascorb+GP[view] [source] [discussion] 2024-05-21 07:59:24
>>smt88+qI
Midler vs Ford: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
◧◩◪◨⬒
30. jonath+kW[view] [source] [discussion] 2024-05-21 08:55:40
>>smt88+bJ
Let me connect the dots for you.

Company identifies a celebrity voice they want. (Frito=Waits, OpenAI=ScarJo)

Company comes up with a novel thing for the voice to say. (Frito=song, OpenAI=ChatGPT)

Company decides they don’t need the celebrity they want (Frito=Waits, OpenAI=ScarJo) and instead hires an impersonator (Frito=singer, {OpenAI=impersonator or OpenAI=ScarJo-public-recordings}) to get what they want (Frito=a-facsimile-of-Tom-Waits’s-voice-in-a-commercial, OpenAI=a-facsimile-of-ScarJo’s-voice-in-their-chatbot).

When made public, people confuse the facsimile for the real thing.

I don’t see how you don’t see the parallel. It’s literally beat for beat the same, particularly the part about using an impersonator as an excuse.

◧◩◪
31. lesost+Zx1[view] [source] [discussion] 2024-05-21 13:15:32
>>parine+fl
IANAL, but – I think it's most likely to be an infringement on her right of publicity (i.e. the right to control the commercial value of your name, likeness, etc.)

She doesn't have to own anything to claim this right, if the value of her voice is recognizable.

◧◩◪
32. stonog+xU2[view] [source] [discussion] 2024-05-21 20:17:54
>>dyno12+ii
When those imitations are commercialized, there is a disclaimer that a celebrity voice is being impersonated, and parody is a legally protected form of speech. OpenAI is not parodying anything, and failed even the low bar of having a disclaimer.
◧◩◪◨⬒⬓⬔⧯
33. zdp7+cH5[view] [source] [discussion] 2024-05-22 18:05:09
>>XorNot+zI
A bit late here, but... you are ignoring a lot of evidence. SJ stated they asked for permission twice, the second request coming just days before they released it. The "her" tweet would seem to corroborate that it's meant to sound like her. They then took the voice down (presumably because they weren't confident they'd win and didn't want to be subjected to discovery). Because of the tweet, even if the voice actor's normal voice were identical to SJ's, it's pretty clear they were trying to profit off her voice.
replies(1): >>XorNot+uQ6
◧◩◪◨⬒⬓⬔⧯▣
34. XorNot+uQ6[view] [source] [discussion] 2024-05-22 23:54:09
>>zdp7+cH5
I repeatedly referred to the fact that they tried to contact SJ, and Sam Altman's tweet, as being the biggest problems.
[go to top]