zlacker

[parent] [thread] 103 comments
1. skille+(OP)[view] [source] 2024-05-23 06:13:55
The thing that worried me initially was that:

- the original report by Scarlett said she was approached months ago, and then two days prior to launch of GPT-4o she was approached again

Because of the above, my immediate assumption was that OpenAI definitely did her dirty. But this report from WaPo debunks at least some of it, because the records they have seen show that the voice actor was contacted months before OpenAI contacted Scarlett for the first time. (It also goes to show just how many months in advance OpenAI works on projects.)

However, this does not dispel the fact that OpenAI did contact Scarlett, that Sam Altman did post the tweet saying "her", and that the voice has at least "some" resemblance to Scarlett's voice, at least enough to split people into two camps: one saying that it does and the other saying that it does not.

replies(10): >>palad1+s >>stingr+51 >>serial+81 >>eps+qa >>user_7+fx >>ZooCow+kF >>JCM9+4H >>m3kw9+eU >>tmaly+CW >>beeboo+n21
2. palad1+s[view] [source] 2024-05-23 06:17:47
>>skille+(OP)
Yeah, sus af because of the call 2 days before they released it to the world. And they were just asking for it when they tweeted the frickin "her". I mean, come on.
3. stingr+51[view] [source] 2024-05-23 06:21:10
>>skille+(OP)
Yes, but it changes the narrative from “they couldn’t get Scarlett to record the voice, so they copied her voice” to something much less malicious. Contacting Scarlett, when you already have voice recordings ready but would prefer someone famous, isn’t that bad of a thing imho.
replies(2): >>tivert+G1 >>aprilt+h52
4. serial+81[view] [source] 2024-05-23 06:21:44
>>skille+(OP)
I don't know, to me, it just sounds like they know how to cover all their bases.

To me, it sounds like they had the idea to make their AI sound like "her". For the initial version, they had a voice actor that sounds like the movie, as a proof of concept.

They still liked it, so it was time to contact the real star. In the end, it's not just the voice, it would have been the brand; just imagine the buzz they would have got if Scarlett J was the official voice of the company. She said no, and they were like, "too bad, we already decided how she will sound, the only difference is whether it will be labelled as SJ or not".

Someone probably felt it was a bit too dodgy since the resemblance was uncanny, so they gave it another go, probably ready to offer more money. She still refused, but in the end, it didn't change a thing.

replies(8): >>gnicho+A1 >>dingcl+V9 >>jorvi+Af >>lupusr+Cg >>Terret+RC >>bengal+kM >>Action+MX >>freeja+a61
◧◩
5. gnicho+A1[view] [source] [discussion] 2024-05-23 06:26:42
>>serial+81
Agreed — seems like they had a plan, and probably talked extensively with Legal about how to develop and execute the plan to give themselves plausible deniability. The tweet was inadvisable, and undoubtedly not part of the actual plan (unless it was to get PR).
replies(2): >>safety+p7 >>visarg+Wa
◧◩
6. tivert+G1[view] [source] [discussion] 2024-05-23 06:28:07
>>stingr+51
> Yes, but it changes the narrative from “they couldn’t get Scarlett to record the voice, so they copied her voice” to something much less malicious.

I don't think it's less malicious if they decided to copy her voice without her consent, but just didn't tell her until the project was underway, then continued even after she said no.

There's legal precedent that hiring a copycat is not OK, so it's not like proving it was a copycat salvages their situation.

I wouldn't be surprised if the real reason they hired a copycat early is because they realized they'd need far more of Johansson's time than she'd be willing to provide, and the plan was typical SV "ask forgiveness not permission, but do it anyway regardless."

replies(1): >>MattGa+n2
◧◩◪
7. MattGa+n2[view] [source] [discussion] 2024-05-23 06:34:12
>>tivert+G1
They used a different person, so it is not her voice.
replies(1): >>tivert+H2
◧◩◪◨
8. tivert+H2[view] [source] [discussion] 2024-05-23 06:37:03
>>MattGa+n2
> They used a different person, so it is not her voice.

That doesn't matter because it's an impersonation. Ford lost, even though they didn't use Bette Midler's voice either: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.

replies(4): >>gnicho+F3 >>franka+R3 >>tptace+D4 >>avar+UF
◧◩◪◨⬒
9. gnicho+F3[view] [source] [discussion] 2024-05-23 06:45:43
>>tivert+H2
It seems like the key difference is that the advertisements in those cases involved people who sounded like particular musical artists, singing songs that those artists were well-known for singing. If you hired the woman who voiced Sky to say lines that Scarlett had in some of her movies, that would be similar. The fact that this is a chatbot makes it somewhat of an echo of those cases, but it strikes me (a former lawyer) as being a bridge too far. After all, you have to balance Scarlett's rights against the rights of someone who happens to have a voice that sounds like Scarlett's (it would be different if this were someone doing an impersonation of Scarlett, but whose natural voice sounds different).
replies(1): >>lupusr+hq1
◧◩◪◨⬒
10. franka+R3[view] [source] [discussion] 2024-05-23 06:47:47
>>tivert+H2
It's not an impersonation; it is the actor using their own natural voice.

"We believe that AI voices should not deliberately mimic a celebrity's distinctive voice — Sky's voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using *her own natural speaking voice*"

replies(1): >>mzl+Vd
◧◩◪◨⬒
11. tptace+D4[view] [source] [discussion] 2024-05-23 06:54:27
>>tivert+H2
The story addresses this as well.
◧◩◪
12. safety+p7[view] [source] [discussion] 2024-05-23 07:16:49
>>gnicho+A1
> unless it was to get PR

I think this possibility doesn't receive enough attention. There is a class of people who've figured out that they can say the most scandalous things online and it's a net positive because it generates so much exposure. (As a lowly middle-class employee you can't do this - you just get fired and go broke - but at a certain level of wealth and power you're immune from that.) It is the old PT Barnum principle: "They can say whatever they want about me as long as they spell my name right." Guys like Trump and Musk know exactly what they're doing. Why wouldn't Sam?

Johansson's complaint is starting to look a little shaky, especially if you remove that "her" tweet from the equation. I wouldn't put this past Altman at all; he knows exactly what happened and what didn't inside OpenAI, so maybe he knew she didn't have a case and decided to play Sociopathic 3D Chess with her (and beat her in one round).

replies(2): >>repeek+Pa >>lupire+lX
◧◩
13. dingcl+V9[view] [source] [discussion] 2024-05-23 07:38:09
>>serial+81
Sure, but Sky is still not SJ.
14. eps+qa[view] [source] 2024-05-23 07:41:47
>>skille+(OP)
Unless they can clearly demonstrate that the voice was reproduced from the raw voice actor recordings, this could be just a parallel construction to cover their asses for exactly this sort of case.
replies(2): >>jonath+Qc >>johnbe+Ky
◧◩◪◨
15. repeek+Pa[view] [source] [discussion] 2024-05-23 07:44:14
>>safety+p7
As a lowly middle-class employee, what you say could be interpreted externally as “representing the company”, which is why you see disclaimers like “all views are my own” on some social media profiles. Sam is the company, so they can’t get mad at him, and beyond that he’s a private individual saying whatever he wants on social media without lying.

In order to sue, there need to be damages, and if they didn’t copy the voice then the rest doesn’t matter, which Sam and team clearly knew and were fast to work with the news. I agree that smart people take advantage of what they can get away with, but this controversy couldn’t have turned out better for increasing brand awareness, good or bad (as you say, just like Trump and Musk know how to do).

◧◩◪
16. visarg+Wa[view] [source] [discussion] 2024-05-23 07:45:02
>>gnicho+A1
I am sure it was for free PR. Streisand effect trap for ScarJo.
replies(1): >>z7+xi1
◧◩
17. jonath+Qc[view] [source] [discussion] 2024-05-23 07:59:30
>>eps+qa
Doesn't matter. Waits v Frito Lay
replies(2): >>XorNot+lh >>Mattic+D11
◧◩◪◨⬒⬓
18. mzl+Vd[view] [source] [discussion] 2024-05-23 08:09:04
>>franka+R3
With the law, it is often about intent. If OpenAI had the intent to make a voice that sounded like Scarlett Johansson's in Her, then I think that might be problematic for OpenAI. I am not a lawyer though.
◧◩
19. jorvi+Af[view] [source] [discussion] 2024-05-23 08:23:55
>>serial+81
> In the end, someone probably felt like it's a bit too dodgy as it resemblance was uncanny

What if it wasn’t a computer voice model but rather a real-life voice actress that you could pay a few cents to try to imitate Scarlett Johansson’s voice as best as she could?

That’s effectively what’s happening here, and it isn’t illegal.

I guess it also leads to the bigger question: do celebrities own their particular frequency range? Is no one allowed to publicly sound like them? Feels like the AACS DVD encryption key controversy all over again.

replies(4): >>krisof+Dh >>bryanr+Nj >>nailer+1X >>Terrif+m11
◧◩
20. lupusr+Cg[view] [source] [discussion] 2024-05-23 08:33:21
>>serial+81
> In the end, someone probably felt like it's a bit too dodgy as it resemblance was uncanny, they gave it another go, probably ready to offer more money, she still refused,

That was just a few days before launch, right? What was their plan if she said yes at that point? Continue using the "not-her" voice but say it was her? Or did they also have her voice already cloned by then and just needed to flip a switch?

replies(1): >>prmous+Bh
◧◩◪
21. XorNot+lh[view] [source] [discussion] 2024-05-23 08:39:00
>>jonath+Qc
Which is not as similar as people keep saying, though: both that case and Bette Midler's involved singers, who perform as themselves and are their own brand.

Consider when a company recasts a voice actor in something: e.g. the VA for Rick and Morty has been replaced, and Robin Williams was not the voice of the Genie in Aladdin 2 or the animated series.

replies(1): >>bdowli+an
◧◩◪
22. prmous+Bh[view] [source] [discussion] 2024-05-23 08:40:29
>>lupusr+Cg
> Continue using the "not-her" voice but say it was her? Or did they also have her voice already cloned by then and just needed to flip a switch?

One or the other. It doesn't really matter, as SJ herself would not necessarily have been able to make sure it was not her, and not a glitch in how the tech works with her voice.

◧◩◪
23. krisof+Dh[view] [source] [discussion] 2024-05-23 08:40:35
>>jorvi+Af
> That’s effectively what’s happening here, and it isn’t illegal.

It is more complicated than that. Check out Midler v. Ford Motor Co. or Waits v. Frito-Lay.

replies(1): >>theult+eT
◧◩◪
24. bryanr+Nj[view] [source] [discussion] 2024-05-23 09:00:18
>>jorvi+Af
>guess it also leads to the bigger question

People are allowed to sound like other people. But if you go to actor 1 and say "we want to use your voice for our product", and they say no, and then you go to actor 2 and tell them "I want you to sound like actor 1 for our product", and then you release a statement of "hey, you know that popular movie by actor 1 that just used their voice in a context extremely reminiscent of our product?!? Well, listen to what we got:" (actor 2 voice presented)

Then you may run into legal problems.

https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.

on edit: assuming the reports I am reading are accurate, that the actress used for the voice work claims not to have been instructed to sound like the vocal work in Her, it sounds like a suit would probably not be successful.

replies(1): >>Tobu+Ei1
◧◩◪◨
25. bdowli+an[view] [source] [discussion] 2024-05-23 09:30:40
>>XorNot+lh
Recasting a voice actor when there was a contract with the prior actor (and such a contract would typically allow for recasting) is one thing.

Copying a famous actor’s voice without any kind of agreement at all is something else.

26. user_7+fx[view] [source] 2024-05-23 10:58:46
>>skille+(OP)
I'm not sure if that's enough to protect OAI. It feels like they wanted SJ, found a similar voice actor as a version 1, tried to "officially" get SJ's voice, and when that failed, instead of pulling it they continued on. It still feels like quite a deliberate move to use her likeness, and the "contact 2 days before" sounds like they really wanted to get her okay before using the other VA's voice.
◧◩
27. johnbe+Ky[view] [source] [discussion] 2024-05-23 11:12:52
>>eps+qa
Intent matters.

When discovery happens and there’s a trail of messages suggesting either getting ScarJo or finding someone that sounds enough like her, this isn’t going to look good with all the other events in the timeline.

If it goes to court, they’ll settle.

replies(1): >>Turing+x01
◧◩
28. Terret+RC[view] [source] [discussion] 2024-05-23 11:43:07
>>serial+81
Sky doesn't sound like the movie, much less "uncanny".
replies(1): >>wkat42+B31
29. ZooCow+kF[view] [source] 2024-05-23 12:00:18
>>skille+(OP)
A plausible alternative explanation for asking Johansson:

  (1) They cast the current actor to test the technology and have a fallback.  The actor sounds somewhat different from Johansson but the delivery of the lines is similar.  

  (2) They then ask Johansson because they want to be the company that brought “Her” to life.  She declines.  

  (3) They try again shortly before the event because they really want it to happen.

  (4) They proceed with the original voice, and the “her” tweet happens because they want to be the ones that made it real. 
Asking shortly before the release is the weakest link here. It’s possible they already had a version trained or fine tuned on her voice that they could swap in at the last minute. That could explain some of the caginess. Not saying it’s what happened or is even likely, but it feels like a reasonable possibility.
replies(1): >>pavon+Xs1
◧◩◪◨⬒
30. avar+UF[view] [source] [discussion] 2024-05-23 12:02:46
>>tivert+H2
Ford commissioned a cover of a 1958 song[1] using a singer that would clearly be mistaken for Bette Midler's existing cover of that song, as part of an advertisement campaign where they first tried to get the rights to the original songs.

If you listen to the imitation version linked from that Wikipedia article and the original 1958 recording, you'll hear that they didn't only find a singer who sounded like her, but copied the music and cadence from Bette's version.

I think that's way past whatever OpenAI did in this case. It would be analogous if they were publishing something that only regurgitated lines Scarlett Johansson is famous for having said in her movies.

But they're not doing that, they just found a person who sounds like Scarlett Johansson.

This would only be analogous to the Ford case if the cover artist in that case was forbidden from releasing any music, including original works, because her singing voice could be confused with Bette Midler's.

Now, would they have done this if Scarlett Johansson wasn't famous? No, but we also wouldn't have had a hundred grunge bands with singers playing up their resemblance to Kurt Cobain if Nirvana had never existed.

So wherever this case lands (likely in a boring private settlement) it's clearly in more of a gray area than the Ford case.

1. https://en.wikipedia.org/wiki/Do_You_Want_to_Dance

31. JCM9+4H[view] [source] 2024-05-23 12:10:37
>>skille+(OP)
Sounds more plausible that someone pointed out to them internally that they could be in a heap of trouble if Scarlett objected after they released it. It doesn’t matter if it was actually her voice or not; it matters if people think it was her voice. If someone pointed this out late in the process, then yeah, there would have been a mad scramble to get Scarlett to sign off. When she didn’t, that put them in a bad spot.
◧◩
32. bengal+kM[view] [source] [discussion] 2024-05-23 12:49:37
>>serial+81
A more charitable scenario might be that they hire the voice actor and it sounds a bit like her. Someone suggests why don't we just get Scarlett to do it properly, wouldn't that be cooler? They reach out and she says no. They decide to continue with the one that sounds a bit like her.
replies(2): >>jrm4+CX >>lupire+jY
◧◩◪◨
33. theult+eT[view] [source] [discussion] 2024-05-23 13:31:35
>>krisof+Dh
Ford hired impersonators, she's not an impersonator, that's her real voice.

She's allowed to be a voice actor using her real voice.

You can point to the "Her" tweet, but it's a pretty flimsy argument.

replies(3): >>senorr+DW >>Mattic+3X >>krisof+jJ1
34. m3kw9+eU[view] [source] 2024-05-23 13:35:47
>>skille+(OP)
We are just nit picking now because we are bored?
35. tmaly+CW[view] [source] 2024-05-23 13:49:04
>>skille+(OP)
Why would they have taken down the voice if they were operating on a level of truth in their favor?
replies(1): >>KHRZ+mn1
◧◩◪◨⬒
36. senorr+DW[view] [source] [discussion] 2024-05-23 13:49:06
>>theult+eT
Whether the actor was an impersonator or not is still up for debate. I can see an argument being made when you consider the entire context.
replies(1): >>theult+751
◧◩◪
37. nailer+1X[view] [source] [discussion] 2024-05-23 13:51:00
>>jorvi+Af
> What if it wasn’t a computer voice model but rather a real-life voice actress that you could pay a few cents to try to imitate Scarlett Johansson’s voice as best as she could?

> That’s effectively what’s happening here, and it isn’t illegal.

Profiting from someone else's likeness is illegal.

◧◩◪◨⬒
38. Mattic+3X[view] [source] [discussion] 2024-05-23 13:51:21
>>theult+eT
This is correct, and is very different from both the Midler and Waits cases. The courts are never going to tell a voice actor she can't use her real voice because she sounds too much like a famous person.

And besides, it sounds more like Rashida Jones anyway. It's clearly not an impersonation.

replies(1): >>BeefWe+cM2
◧◩◪◨
39. lupire+lX[view] [source] [discussion] 2024-05-23 13:53:30
>>safety+p7
Johansson might not win a lawsuit, but she isn't looking bad at all. She is totally standup in the Arts vs BigTech AI cultural battle. (See also Apple's recent "iPad crushes all artist material" commercial.)

Nothing in this article changes the essence of her complaint.

The only real, though partial, rebuttal to her is that OpenAI copied a work product she did for a movie, and the movie was more than her voice, so it's not totally her own work. So maybe the movie team as a whole has a stronger complaint than the voice actor alone.

She didn't lose any game of wits. She just got done dirty by someone who got away with it. She doesn't need money from them. She has respect from people who matter; Sam and OpenAI behaved badly like big tech always does. If OpenAI permanently stops using the Johansson-like Sky voice, she'll win what she wanted.

Of course, anyone whose voice an AI sounds like has the unpleasantness of that experience, and a rich person is more able to endure it than a regular person.

◧◩◪
40. jrm4+CX[view] [source] [discussion] 2024-05-23 13:54:21
>>bengal+kM
Genuine question;

Why in the world would one expect the more charitable scenario?

replies(4): >>glenst+s11 >>KHRZ+9l1 >>bengal+Pq2 >>alickz+6u3
◧◩
41. Action+MX[view] [source] [discussion] 2024-05-23 13:55:03
>>serial+81
This will be used as a template by the entertainment industry to screw over so many people.
replies(1): >>Uehrek+411
◧◩◪
42. lupire+jY[view] [source] [discussion] 2024-05-23 13:56:51
>>bengal+kM
That's the same thing, in fewer words. It doesn't change that the beginning and the end are still imitating the original, and this is a billion-dollar corporation, not an Elvis impersonator doing a little show.
◧◩◪
43. Turing+x01[view] [source] [discussion] 2024-05-23 14:07:24
>>johnbe+Ky
>> When discovery happens and there’s a trail of messages suggesting either getting ScarJo or finding someone that sounds enough like her this isn’t going to look good with all the other events in timeline.

I'm not a lawyer, but this seems unfair to the voice actor they did use, and paid, who happens to sound like ScarJo (or vice versa!)

So if I sound like a famous person, then I can't monetize my own voice? Who's to say it isn't the other way around; perhaps it is ScarJo who sounds like me and I'm owed money?

replies(2): >>johnbe+Z41 >>freeja+kf1
◧◩◪
44. Uehrek+411[view] [source] [discussion] 2024-05-23 14:09:40
>>Action+MX
How? This kind of thing is already illegal. If I’m producing a commercial for Joe’s Hot Dogs, and I hire a voice actor who sounds like Morgan Freeman, and he never says “I’m Morgan Freeman” but he’s the main voice in the commercial and the cartoon character he’s voicing looks like Morgan Freeman… well, many consumers will be confused into thinking Morgan Freeman likes Joe’s Hot Dogs, and that’s a violation of Morgan Freeman’s trademark.
replies(2): >>throwa+I41 >>nilamo+o71
◧◩◪
45. Terrif+m11[view] [source] [discussion] 2024-05-23 14:11:56
>>jorvi+Af
Right of publicity. Profiting off their image without their permission will get you sued. Even if you use an impersonator. If there is a chance the public will connect it with them, you are probably screwed.

e.g.

Vanna White vs Samsung - https://w.wiki/AAUR

Crispin Glover Back to the Future 2 lawsuit - https://w.wiki/AAUT#Back_to_the_Future_Part_II_lawsuit

◧◩◪◨
46. glenst+s11[view] [source] [discussion] 2024-05-23 14:12:16
>>jrm4+CX
It's just a best practice that serves as a healthy counterbalance to cognitive biases, that might otherwise urge us to convict without evidence.

It's not necessarily what will prove true at the end of the day but I think we owe people the presumption of innocence.

replies(3): >>ranger+ha1 >>jrm4+id1 >>ryandr+ie1
◧◩◪
47. Mattic+D11[view] [source] [discussion] 2024-05-23 14:12:59
>>jonath+Qc
That's an impersonation of a parody song in his style. This is a voice actor who has a voice that's kinda similar to ScarJo and kinda similar to Rashida Jones but not quite either one doing something different.

Cases are not a spell you can cast to win arguments, especially when the facts are substantially different.

replies(1): >>jonath+M41
48. beeboo+n21[view] [source] 2024-05-23 14:16:48
>>skille+(OP)
Is it a crime for voice actors to sound similar to, say, Darth Vader?
replies(1): >>CRConr+6e3
◧◩◪
49. wkat42+B31[view] [source] [discussion] 2024-05-23 14:24:41
>>Terret+RC
I think it sounds overly enthusiastic though, to the point that it sounds fake. Very overacted and dramatic. I wouldn't want to chat with that voice.

Though admittedly, so does Johansson in "Her". I don't think the voices are very similar but the style is.

replies(1): >>gg82+zV2
◧◩◪◨
50. throwa+I41[view] [source] [discussion] 2024-05-23 14:30:23
>>Uehrek+411
no, that's definitely not illegal. Voices are not trademarkable, only jingles (melody, words + tone), and of course specific recordings of voices are copyrighted. The ONLY way they get in trouble is if they claim to be Morgan Freeman.
replies(4): >>zerocr+K71 >>thehap+f91 >>fakeda+ef1 >>dragon+XS1
◧◩◪◨
51. jonath+M41[view] [source] [discussion] 2024-05-23 14:30:48
>>Mattic+D11
In both cases the companies are specifically trading on creating confusion of a celebrity’s likeness in an act that celebrity trades in, and with the motivation of circumventing that very celebrity’s explicit rejection of the offer for that very work.

Just because one is a singer and the other is an actor isn’t the big difference you think it is. Actors do voice over work all the time. Actors in fact get cast for their voice all the time.

Yelling “Parody!” isn’t some get-out-of-jail-free card, particularly where there is actual case law, even more particularly when there are actual laws to address this very act.

replies(1): >>Mattic+jz1
◧◩◪◨
52. johnbe+Z41[view] [source] [discussion] 2024-05-23 14:31:51
>>Turing+x01
There isn't an unfairness to the voice actor. She did her job and got paid.

The problem here is that someone inside of OpenAI wanted to create a marketing buzz around a new product launch and capitalize on a movie. In order to do that they wanted a voice that sounded like that movie. They hired a voice actor that sounded enough like ScarJo to hedge against actually getting the actor to do it. When she declined they decided to implement their contingency plan.

Whether they're liable is for a jury to decide, but the case precedent that I've seen, along with the intent, wouldn't look good if I were on that jury.

replies(1): >>Turing+1j1
◧◩◪◨⬒⬓
53. theult+751[view] [source] [discussion] 2024-05-23 14:32:27
>>senorr+DW
I don't see any argument considering the entire context, care to explain?
◧◩
54. freeja+a61[view] [source] [discussion] 2024-05-23 14:37:53
>>serial+81
> they gave it another go, probably ready to offer more money, she still refused, but in the end, it didn't change a thing.

That's not what she said happened. She said they released it anyway before she and Sam could connect, after Sam had reached out, for the second time, two days prior to the release.

◧◩◪◨
55. nilamo+o71[view] [source] [discussion] 2024-05-23 14:43:17
>>Uehrek+411
Which part of that is illegal? Because I don't see anything.
replies(1): >>Shrimp+Ga1
◧◩◪◨⬒
56. zerocr+K71[view] [source] [discussion] 2024-05-23 14:45:11
>>throwa+I41
"Trademark" is not the correct way to talk about it, but commercial use of an impersonation can be a breach of the right of publicity, even if they don't actually say they're the person.
◧◩◪◨⬒
57. thehap+f91[view] [source] [discussion] 2024-05-23 14:53:09
>>throwa+I41
A reasonable person would assume that the voice in the commercial is Morgan Freeman, which could be very problematic for the commercial maker.
replies(1): >>throwa+bi2
◧◩◪◨⬒
58. ranger+ha1[view] [source] [discussion] 2024-05-23 14:57:54
>>glenst+s11
Is it necessarily a bad bias to assume OpenAI is still behaving as it's been behaving during its entire history: recklessly taking other people's IP?
replies(2): >>glenst+hb1 >>TeMPOr+4z1
◧◩◪◨⬒
59. Shrimp+Ga1[view] [source] [discussion] 2024-05-23 14:59:54
>>nilamo+o71
It's in the very article:

> He compared Johansson’s case to one brought by the singer Bette Midler against the Ford Motor Company in the 1980s. Ford asked Midler to use her voice in ads. After she declined, Ford hired an impersonator. The U.S. appellate courts ruled in Midler’s favor, indicating her voice was protected against unauthorized use.
replies(2): >>m348e9+Ec1 >>nilamo+xd1
◧◩◪◨⬒⬓
60. glenst+hb1[view] [source] [discussion] 2024-05-23 15:02:49
>>ranger+ha1
I think the way I would split the difference here is that your point should inform how we think about regulation and investigation. How we write rules, how we decide to proceed in terms of investigating things.

But when it comes to specific questions that hinge on evidence, I think you have to maintain the typical presumption of innocence, just to balance out the possibilities of mob psychology getting out of control.

◧◩◪◨⬒⬓
61. m348e9+Ec1[view] [source] [discussion] 2024-05-23 15:11:08
>>Shrimp+Ga1
There's a gray area. If Ford Motor Company had hired an actor who happened to sound a lot like Bette Midler using their normal speaking voice, Ford would have had a much better chance of defending their case.

As I understand it, that's essentially OpenAI's defense here.

replies(1): >>CRConr+0d3
◧◩◪◨⬒
62. jrm4+id1[view] [source] [discussion] 2024-05-23 15:14:06
>>glenst+s11
Yes, you owe people that.

No, you do not owe "corporations, especially those with a tendency, incentive, and history of being ruthless in this way."

Wise up, people.

◧◩◪◨⬒⬓
63. nilamo+xd1[view] [source] [discussion] 2024-05-23 15:15:10
>>Shrimp+Ga1
So if I happen to sound like Tom Hanks, anyone recording me in passing would be breaking the law? How does anyone see that as reasonable?
replies(1): >>pnt12+Sh1
◧◩◪◨⬒
64. ryandr+ie1[view] [source] [discussion] 2024-05-23 15:18:26
>>glenst+s11
I think we owe people outside of a commercial environment the presumption of innocence and benefit of the doubt. But we owe profit-seeking corporations (or their officers) neither, and the assumption should be that they are simply amorally doing whatever maximizes profit. As soon as someone hangs their shingle out there as a business, our presumptions should change.
◧◩◪◨⬒
65. fakeda+ef1[view] [source] [discussion] 2024-05-23 15:23:06
>>throwa+I41
Voices may not be trademarkable but I'm pretty sure styles and intonations are.
◧◩◪◨
66. freeja+kf1[view] [source] [discussion] 2024-05-23 15:23:24
>>Turing+x01
How is it unfair to the voice actor? Is she getting sued? Is she paying damages? Is she being prevented from doing her work? No.

> Who's to say it isnt the other way around, perhaps it is ScarJo that sounds like me and i'm owed money?

It seems like you don't get the fundamental principle underlying "right of publicity" laws if you are asking this question.

replies(1): >>Turing+pi1
◧◩◪◨⬒⬓⬔
67. pnt12+Sh1[view] [source] [discussion] 2024-05-23 15:35:02
>>nilamo+xd1
That's a bit of a strawman: you're twisting the scope and arguing for it.

A more similar context would be: they ask Tom Hanks to create a voice similar to Woody, the cowboy from Toy Story. Tom Hanks says no, Disney says no. Then they ask you to voice their cowboy character. It's obviously related: they tried the OG, failed, and went for a copycat after.

But if they had never approached Tom Hanks or Disney, then there would be room for deniability - without mentions of real names, it would require someone to judge whether it's an unauthorized copycat or just a random actor voicing a random cowboy character.

It was a bad play on their part.

replies(3): >>nilamo+bl1 >>tpmone+Pp1 >>stale2+LA1
◧◩◪◨⬒
68. Turing+pi1[view] [source] [discussion] 2024-05-23 15:37:36
>>freeja+kf1
>> How is it unfair to the voice actor? Is she getting sued? Is she paying damages? Is she being prevented from doing her work? No.

Seems she is prevented from doing work: if companies can get sued for hiring/using voice actors who sound like ScarJo, then any voice actor who sounds like ScarJo has effectively been de-platformed. Similarly, imagine I look very much like George Clooney -- if George Clooney can sue magazines for featuring my handsome photos, then I lose all ability to model for pay. (Strictly hypothetical, I am a developer, not a fashion model.)

>> It seems like you don't get the fundamental principal underlying "right of publicity" laws if you are asking this question.

Totally, I have no idea of the laws here, but I am very curious to understand what OpenAI did wrong here.

replies(1): >>freeja+hp1
◧◩◪◨
69. z7+xi1[view] [source] [discussion] 2024-05-23 15:37:54
>>visarg+Wa
They removed ChatGPT's most popular voice in response, causing anger among many of their customers... for PR?
replies(1): >>visarg+yr1
◧◩◪◨
70. Tobu+Ei1[view] [source] [discussion] 2024-05-23 15:38:16
>>bryanr+Nj
The other actress wasn't the only one involved in the production; she provided input, but OpenAI building a voice model would involve a lot of data and work. They had to have a model ready to go when they asked Johansson for permission immediately before launching; possibly they had one built from Johansson, and another, legal-approved one that they had converged to be close to the first but that didn't include her as a direct source.
◧◩◪◨⬒
71. Turing+1j1[view] [source] [discussion] 2024-05-23 15:39:40
>>johnbe+Z41
>> There isn't an unfairness to the voice actor. She did her job and got paid.

If her customers can get sued for using her voice, then this voice actor can never get another job and can never get paid again -- all because she happens to sound like ScarJo. That seems unfair to the voice actor.

replies(1): >>Vegeno+VE1
◧◩◪◨
72. KHRZ+9l1[view] [source] [discussion] 2024-05-23 15:48:35
>>jrm4+CX
Because it follows the legal principle of innocent until proven guilty? Unlike the "OpenAI must have cloned Scarlett Johansson's voice" wild dystopia speculations.
replies(1): >>jrm4+032
◧◩◪◨⬒⬓⬔⧯
73. nilamo+bl1[view] [source] [discussion] 2024-05-23 15:48:43
>>pnt12+Sh1
You're describing a situation different from the one I replied to, though... >>40454969
◧◩
74. KHRZ+mn1[view] [source] [discussion] 2024-05-23 15:58:28
>>tmaly+CW
"out of respect" for the angry woman rather than argue with her, you never had this problem with a wife/girlfriend?
◧◩◪◨⬒⬓
75. freeja+hp1[view] [source] [discussion] 2024-05-23 16:06:37
>>Turing+pi1
You are skipping past intent and turning it into strict liability. That's not the case.

>Totally, i have no idea of the laws here, but very curious to understand what OpenAI did wrong here.

It is illegal to profit off the likeness of others. If it wasn't, what's to stop any company from hiring any impersonator to promote that company as the person they are impersonating?

◧◩◪◨⬒⬓⬔⧯
76. tpmone+Pp1[view] [source] [discussion] 2024-05-23 16:09:15
>>pnt12+Sh1
This sort of thing happens all the time though, doesn’t it? Take any animated film that gets a TV show spinoff; their voice actors get replaced all the time, and I really can’t imagine that spinoff TV shows are dependent on getting all the original voice actors to grant permission. How many different actors have voiced Looney Tunes characters over the years? Didn’t Ernie Hudson audition to play his own character for the Ghostbusters cartoon and lose out to someone else? And these are all cases where there is clear intent to sound as close to the original actor as possible.
replies(1): >>Uehrek+aB1
◧◩◪◨⬒⬓
77. lupusr+hq1[view] [source] [discussion] 2024-05-23 16:12:05
>>gnicho+F3
Tom Waits sued Frito Lay for using an impersonator to sing about Doritos.
◧◩◪◨⬒
78. visarg+yr1[view] [source] [discussion] 2024-05-23 16:19:06
>>z7+xi1
I can't say much about that particular voice because I almost never used chatGPT's voices. They are too slow.

I need 1.5x speeds even if I have to use a worse voice. I am a TTS power user, listening to all online text since the 2010s. Maybe GPT-4o has a more flexible voice; perhaps you can just ask it to speak faster.

replies(1): >>gnicho+eq2
◧◩
79. pavon+Xs1[view] [source] [discussion] 2024-05-23 16:26:27
>>ZooCow+kF
My unsubstantiated theory: They have a voice trained on Johansson's body of work ready to go, but didn't release it because they didn't get her permission. This explains why they were still asking her right up to the ChatGPT-4o release. Then people (including Johansson) associate this Sky voice with Johansson and Her. OpenAI realizes it looks bad, despite not being intentional, so they pull Sky for PR reasons.
◧◩◪◨⬒⬓
80. TeMPOr+4z1[view] [source] [discussion] 2024-05-23 16:56:16
>>ranger+ha1
Yes, because the courts have yet to decide whether OpenAI has been "recklessly taking other people's IP" in an illegal way. Right now, it's only something believed by people who wish it to be true; legally, it's not clear just yet. In contrast, actually doing SJ impersonation here would be a much clearer violation. There's a huge gap between the two deeds, and I don't see the reason to just assume OpenAI crossed it.

It's like, the people dropping leaflets in your physical mailbox are delivering spam, but you wouldn't automatically assume those same people are also trying to scam you and your neighbors by delivering physical letters meant to trick people into parting with their savings. In both cases, the messages are spam, but one is legal, the other is not, and there's a huge gap between them.

replies(1): >>jrm4+I22
◧◩◪◨⬒
81. Mattic+jz1[view] [source] [discussion] 2024-05-23 16:57:04
>>jonath+M41
> In both cases the companies are specifically trading on creating confusion of a celebrity’s likeness in an act that celebrity trades in, and with the motivation of circumventing that very celebrity’s explicit rejection of the offer for that very work.

Are they? Where did they advertise this? The voice doesn't even sound that much like ScarJo!

> Just because one is a singer and the other is an actor isn’t the big difference you think it is. Actors do voice over work all the time. Actors in fact get cast for their voice all the time.

It's a very big difference when the jurisprudence here rests on how substantial the voice is as a proportion of the brand, especially in the presence of the other disanalogies.

> Yelling, “Parody!” Isn’t some get out of jail free card, particularly where there is actual case law, even more particularly when there are actual laws to address this very act.

Sure -- If you read that back, I'm clearly not doing that. An impression in a parody in the artist's unique style (Waits) was a case where it was a violation of publicity rights. This is radically different from that. It's not clear that Midler and Waits have much bearing on this case at all.

◧◩◪◨⬒⬓⬔⧯
82. stale2+LA1[view] [source] [discussion] 2024-05-23 17:06:24
>>pnt12+Sh1
Ok got it.

So in your opinion, if a movie needs to have a tall, skinny redhead, and they approach someone who has those qualities and the role is turned down, then it would be illegal to get any other tall, skinny redhead.

That sounds absurd to me. If you have a role, obviously the role has qualities and requirements.

And even if person 1, who happens to have those qualities, turns you down, it is still valid to get a different person who fulfils your original requirements.

replies(1): >>Uehrek+XJ1
◧◩◪◨⬒⬓⬔⧯▣
83. Uehrek+aB1[view] [source] [discussion] 2024-05-23 17:08:18
>>tpmone+Pp1
I Am Not An Entertainment Lawyer either, so I can’t answer with too much certainty, but I’d suspect the actors have a clause in the contract they sign allowing the character to be played by someone else post-contract. Think about how many clauses were in the last employment contract you signed.
replies(1): >>tpmone+h62
◧◩◪◨⬒⬓
84. Vegeno+VE1[view] [source] [discussion] 2024-05-23 17:25:40
>>Turing+1j1
It is not that the voice is similar to Scarlett, it is that it appears that Scarlett's identity was intentionally capitalized on to market the voice.

If you had a voice like Scarlett, and you were hired to create the voice of an AI assistant, there's no legal problem - as long as the voice isn't marketed using references to Scarlett.

However, in this case, the voice is similar to Scarlett's, AND they referenced a popular movie where Scarlett voiced an AI assistant, and named the assistant in a way that is evocative of Scarlett's name, and reached out to Scarlett wanting to use her voice. It is those factors that make it legally questionable, as it appears that they knowingly capitalized on the voice's similarity to Scarlett's without her permission.

It is about intent, and how the voice is marketed. Voice sounds like a famous person = fine, voice sounds like a famous person and the voice is marketed as being similar to the famous person's = not fine.

It is not a clear-cut 'this is definitely illegal' in this case, it is a grey area that a court would have to decide on.

◧◩◪◨⬒
85. krisof+jJ1[view] [source] [discussion] 2024-05-23 17:47:58
>>theult+eT
> Your can point to the "Her" tweet, but it's a pretty flimsy argument.

I'm not making arguments which are not already explicitly written in my post.

My argument is simple: jorvi commented that you can hire "a real-life voice actress" to "try to imitate Scarlett Johansson’s voice as best as she could", and that is not illegal.

I said that the legality of that is more complicated. What jorvi describes might or might not be illegal based on various factors. And I pointed them towards the two references to support my argument.

I explicitly didn't say in that comment anything about the OpenAI/ScarJo case. You are reacting as if you think that I have some opinion about it. You are wrong, and it would be better if you would not try to guess my state of mind. If I have some opinion about something you will know because I will explicitly state it.

◧◩◪◨⬒⬓⬔⧯▣
86. Uehrek+XJ1[view] [source] [discussion] 2024-05-23 17:52:25
>>stale2+LA1
Yeah I think the person you’re responding to gave a bad example. I like to give examples involving commercials because they center the issue of celebrity endorsement of and association with a brand, which is the thing at issue in this OpenAI case (public corporate keynotes are essentially just multi-hour commercials).
◧◩◪◨⬒
87. dragon+XS1[view] [source] [discussion] 2024-05-23 18:43:55
>>throwa+I41
> Voices are not trademarkable

But they are subject to right of publicity in many US jurisdictions.

Which, while more like trademark than copyright (the other thing that keeps getting raised as if it should dispose of this issue), is its own area of law, distinct from either trademark or copyright.

> The ONLY way they get in trouble is if they claim to be Morgan Freeman.

That’s…not true. Though such an explicit claim would definitely be a way that they could get in trouble.

replies(1): >>throwa+ti2
◧◩◪◨⬒⬓⬔
88. jrm4+I22[view] [source] [discussion] 2024-05-23 19:32:58
>>TeMPOr+4z1
Exactly wrong; it's the job of the law to "be careful," not of the people.

Those of us accusing and talking about it have no power -- thus there is literally no harm, and possibly some good, in putting them on the defensive about this.

edit: In fact, the First Amendment of the Constitution essentially directly upholds the idea of "people saying whatever they want" in this regard.

◧◩◪◨⬒
89. jrm4+032[view] [source] [discussion] 2024-05-23 19:34:50
>>KHRZ+9l1
This is goofy. "The law" should be careful because the law has power to do real harm.

People don't need to be careful just talking; in fact we generally support the idea of "people saying whatever" in the form of the First Amendment.

◧◩
90. aprilt+h52[view] [source] [discussion] 2024-05-23 19:47:17
>>stingr+51
If the goal was to make the voice sound like the one from Her, then it's still illegal.

Same way you can't get someone who sounds like a famous celebrity to do the voice in a commercial and just let people think it's the famous celebrity when it's not.

◧◩◪◨⬒⬓⬔⧯▣▦
91. tpmone+h62[view] [source] [discussion] 2024-05-23 19:53:25
>>Uehrek+aB1
Sure, I would imagine such clauses exist to make this sort of thing a lot cleaner and easier. I also don't see how it could be reasonable to assert that in the absence of such a clause, an actor owns the rights to the voice of that character and no one else can portray that character with the same or similar voice without their express consent. Not everyone who creates something is signing Hollywood acting contracts with their hired actors, and I just can't see any court asserting that "CoolTubeProductions YouTube Channel" can't continue to produce more animations with their "Radical Rabbit" character just because they didn't have that sort of clause when they hired a voice actor at their college for the first year.
◧◩◪◨⬒⬓
92. throwa+bi2[view] [source] [discussion] 2024-05-23 21:01:57
>>thehap+f91
How do commercial Elvis impersonators (or commercial Elvis impersonators in commercials) get away with it?
replies(1): >>thehap+0W2
◧◩◪◨⬒⬓
93. throwa+ti2[view] [source] [discussion] 2024-05-23 21:03:25
>>dragon+XS1
> In the United States, no federal statute or case law recognizes the right of publicity, although federal unfair competition law recognizes a related statutory right to protection against false endorsement, association, or affiliation

https://www.inta.org/topics/right-of-publicity/#:~:text=In%2....

replies(1): >>dragon+u43
◧◩◪◨⬒⬓
94. gnicho+eq2[view] [source] [discussion] 2024-05-23 21:55:19
>>visarg+yr1
They also need to have a terse mode, to avoid all the throat-clearing that I've seen in videos.
◧◩◪◨
95. bengal+Pq2[view] [source] [discussion] 2024-05-23 22:00:04
>>jrm4+CX
Until proven otherwise I try not to assume malice in every action.
◧◩◪◨⬒⬓
96. BeefWe+cM2[view] [source] [discussion] 2024-05-24 00:52:32
>>Mattic+3X
They are unlikely to tell the voice actor anything, since OpenAI is the problematic party here.
◧◩◪◨
97. gg82+zV2[view] [source] [discussion] 2024-05-24 02:35:34
>>wkat42+B31
Just imagine, if these voice chatbots get popular... they will likely change how people talk!
replies(1): >>wkat42+153
◧◩◪◨⬒⬓⬔
98. thehap+0W2[view] [source] [discussion] 2024-05-24 02:43:01
>>throwa+bi2
Because no reasonable person thinks it’s actually Elvis.
◧◩◪◨⬒⬓⬔
99. dragon+u43[view] [source] [discussion] 2024-05-24 04:45:42
>>throwa+ti2
The important word in that quote is “federal”. In the US, right of publicity is a state law right in many states (often of particular note because of the concentration of tech and entertainment industries, it is a state law right in California.)
◧◩◪◨⬒
100. wkat42+153[view] [source] [discussion] 2024-05-24 04:53:16
>>gg82+zV2
I think at the moment it's the opposite: it seems based on the way many celebrities talk. When we watch them on TV, YouTube, TikTok, whatever, they're not real people but just playing a role. It's not how they really are in their real life. The overacted enthusiasm is like a marketing tool.

As people tend to look up to celebrities and admire them, they start associating this with good things, and I think this is why such styles were adopted for chatbots.

◧◩◪◨⬒⬓⬔
101. CRConr+0d3[view] [source] [discussion] 2024-05-24 06:23:48
>>m348e9+Ec1
Which they themselves have totally undermined.
◧◩
102. CRConr+6e3[view] [source] [discussion] 2024-05-24 06:35:10
>>beeboo+n21
ITYM

> Is it a crime for voice actors to sound similar to, say, James Earl Jones?

And the answer is, of course: It depends. For one thing, it depends on whether the company using the sound-alike's voice is in a business closely related to the theme of Star Wars, and whether they market whatever it is they're marketing by referring to Jones' iconic performance as Vader. ("<PANT> ... <PANT>") If they do that, then yes, it most likely is.

replies(1): >>beeboo+TQ3
◧◩◪◨
103. alickz+6u3[view] [source] [discussion] 2024-05-24 09:33:47
>>jrm4+CX
The Principle of Charity

https://en.wikipedia.org/wiki/Principle_of_charity

◧◩◪
104. beeboo+TQ3[view] [source] [discussion] 2024-05-24 13:11:49
>>CRConr+6e3
No, I specifically asked about Darth Vader, the fictional character that has been voiced by various voice actors (including the original trilogy, Clone Wars, etc). Presumably James Earl Jones does not sound like Darth Vader in his day-to-day life, but this is not about Jones, it is about the character.