zlacker

[return to "Statement from Scarlett Johansson on the OpenAI "Sky" voice"]
1. anon37+t5[view] [source] 2024-05-20 22:58:41
>>mjcl+(OP)
Well, that statement lays out a damning timeline:

- OpenAI approached Scarlett last fall, and she refused.

- Two days before the GPT-4o launch, they contacted her agent and asked that she reconsider. (Two days! This means they already had everything they needed to ship the product with Scarlett’s cloned voice.)

- Having received no response, OpenAI demoed the product anyway, with Sam tweeting “her” in reference to Scarlett’s film.

- When Scarlett’s counsel asked for an explanation of how the “Sky” voice was created, OpenAI yanked the voice from their product line.

Perhaps Sam’s next tweet should read “red-handed”.

◧◩
2. nickth+R7[view] [source] 2024-05-20 23:10:38
>>anon37+t5
This statement from Scarlett really changed my perspective. I used and loved the Sky voice, and I did feel it sounded a little like her, but above all it was the best of their voice offerings. I was mad when they removed it. But now I’m mad it was ever there to begin with. This timeline makes it clear that this wasn’t a coincidence, and maybe not even a hiring of an impressionist (which is where things get a little more wishy-washy for me).
◧◩◪
3. crimso+y9[view] [source] 2024-05-20 23:19:08
>>nickth+R7
But it's clearly not her voice, right? The version that's been on the app for a year just isn't. Like, it's clearly intended to be slightly reminiscent of her, but it's also very clearly not. Are we seriously saying we can't make voices that are similar to celebrities, when not using their actual voice?
◧◩◪◨
4. ncalla+jc[view] [source] 2024-05-20 23:38:23
>>crimso+y9
> Are we seriously saying we can't make voices that are similar to celebrities, when not using their actual voice?

They clearly thought it was close enough that they asked for permission, twice. And got two no’s. Going forward with it at that point was super fucked up.

It’s very bad to not ask permission when you should. It’s far worse to ask for permission and then ignore the response.

Totally ethically bankrupt.

◧◩◪◨⬒
5. ants_e+8k[view] [source] 2024-05-21 00:26:14
>>ncalla+jc
Yes, totally ethically bankrupt. But what bewilders me is that they yanked it as soon as they heard from her lawyers. I would have thought that if they made the decision to go ahead despite getting two "no"s, they at least had a legal position they thought was defensible and worth defending.

But it kind of looks like they released it knowing they couldn't defend it in court, which must seem pretty bonkers to investors.

◧◩◪◨⬒⬓
6. ethbr1+Mo[view] [source] 2024-05-21 00:54:16
>>ants_e+8k
> I would have thought that if they made the decision to go ahead despite getting two "no"s, they at least had a legal position they thought was defensible and worth defending.

They likely have a legal position which is defensible.

They're much more worried that they don't have a PR position which is defensible.

What's the point of winning the (legal) battle if you lose the war (of public opinion)?

Given that the rest of their product is built on apathy toward copyright, that they're actively being sued by creators, and that the general public is wary of GenAI taking human jobs...

... this isn't a great moment for OpenAI to initiate a long legal battle, against a movie actress / celebrity, in which they're arguing that her likeness isn't actually controlled by her.

Talk about optics!

(And I'd expect they quietly care much more about their continued ability to push creative output through their copyright launderer than about getting into a battle over likeness.)

◧◩◪◨⬒⬓⬔
7. XorNot+tI[view] [source] 2024-05-21 04:03:59
>>ethbr1+Mo
How is the PR position not defensible? One of the worst things you can generally do is admit fault, particularly if you have a complete defense.

Buckle in, go to court, and double down on the fact that the public's opinion of actors is pretty damn fickle at the best of times - particularly if what you released was in fact based on someone you signed a valid contract with who just sounds similar.

Of course, this is all dependent on actually having a complete defense - you absolutely would not want Scarlett Johansson voice samples turning up in file folders associated with the Sky model if it went to court.

◧◩◪◨⬒⬓⬔⧯
8. ethbr1+TK[view] [source] 2024-05-21 04:29:45
>>XorNot+tI
In what world does a majority of the public cheer for OpenAI "stealing"* an actress's voice?

People who hate Hollywood? Most of that crowd hates tech even more.

* Because it would only take the first news cycle for it to be branded as that

◧◩◪◨⬒⬓⬔⧯▣
9. XorNot+KW[view] [source] 2024-05-21 06:33:44
>>ethbr1+TK
It is wild to me that, on HackerNews of all places, you'd think people don't love an underdog story.

Which is what this would be in the not-stupid version of events: they hired a voice actress for the rights to create the voice, she was paid, and then she's basically told by the courts, "actually you're unhireable because you sound too much like an already rich and famous person".

The issue, of course, is that OpenAI's reactions so far don't seem to indicate that they're actually confident they can prove this, or that this is even the case. Because if it is, they're going about handling this in the dumbest possible way.
