zlacker

[parent] [thread] 98 comments
1. crimso+(OP)[view] [source] 2024-05-20 23:19:08
But it's clearly not her voice, right? The version that's been on the app for a year just isn't. Like, it's clearly intended to be slightly reminiscent of her, but it's also very clearly not. Are we seriously saying we can't make voices that are similar to celebrities, when not using their actual voice?
replies(9): >>gedy+u1 >>bigfis+w1 >>bobthe+x1 >>emmp+F1 >>dragon+c2 >>ncalla+L2 >>EasyMa+x3 >>callal+v8 >>visarg+Fx
2. gedy+u1[view] [source] 2024-05-20 23:29:45
>>crimso+(OP)
Normally I'd agree if this were some vague "artist style", but this was clearly an attempt to duplicate a living person, a media celebrity no less.
replies(3): >>threat+k2 >>__loam+y4 >>citize+fI
3. bigfis+w1[view] [source] 2024-05-20 23:30:01
>>crimso+(OP)
It could be trained on Scarlett's voice though, there's plenty of recorded samples for OpenAI to use. It's pretty damning for them to take down the voice right away like that
replies(1): >>branda+Jg
4. bobthe+x1[view] [source] 2024-05-20 23:30:07
>>crimso+(OP)
this is correct. in fact the fcc has already clarified this for the case of robocalls. https://www.fcc.gov/document/fcc-makes-ai-generated-voices-r...
5. emmp+F1[view] [source] 2024-05-20 23:30:38
>>crimso+(OP)
We can seriously say that, yes. The courts have been saying this in the US for over 30 years. See Midler v. Ford Motor Co.
replies(1): >>Avshal+A2
6. dragon+c2[view] [source] 2024-05-20 23:34:18
>>crimso+(OP)
If the purpose is to trade on the celebrity voice and perceived association, and it's subject to California right of publicity law, then, yes, we're saying that; it has been established law for decades.
replies(1): >>Last5D+Da
7. threat+k2[view] [source] [discussion] 2024-05-20 23:35:05
>>gedy+u1
Is this different from the various videos of the Harry Potter actors doing comedic high fashion ads? Because those were very well received.

https://www.youtube.com/watch?v=ipuqLy87-3A

replies(6): >>BadHum+14 >>nickle+C4 >>jacobo+05 >>jprete+g5 >>bottle+P7 >>tsimio+1a
8. Avshal+A2[view] [source] [discussion] 2024-05-20 23:36:55
>>emmp+F1
Tom Waits won a lawsuit against Doritos too.
9. ncalla+L2[view] [source] 2024-05-20 23:38:23
>>crimso+(OP)
> Are we seriously saying we can't make voices that are similar to celebrities, when not using their actual voice?

They clearly thought it was close enough that they asked for permission, twice. And got two no’s. Going forward with it at that point was super fucked up.

It’s very bad to not ask permission when you should. It’s far worse to ask for permission and then ignore the response.

Totally ethically bankrupt.

replies(5): >>nicce+44 >>avarun+B6 >>ants_e+Aa >>menset+mc >>gibbit+ph
10. EasyMa+x3[view] [source] 2024-05-20 23:43:14
>>crimso+(OP)
Sure, they could have taken her to court, but right now they don't want the bad publicity, especially since it would put everything else in the shadow of such a scandalous "story". Better to just back off, let S.J. win, move on, and start planning how they're gonna spend all that paper money they got with the announcement of a new, more advanced model. It's a financial decision and a fairly predictable one. I'm glad she won this time.
replies(2): >>__loam+i4 >>smugma+N4
11. BadHum+14[view] [source] [discussion] 2024-05-20 23:45:33
>>threat+k2
Is a billion dollar AI company utilizing someone's voice against their will in a flagship product after they said no twice different from a random Youtube channel making comedy videos?

I think so but that could just be me.

12. nicce+44[view] [source] [discussion] 2024-05-20 23:45:43
>>ncalla+L2
And they could have totally gotten away with it by never mentioning Scarlett's name. But of course, that is not what they wanted.

Edit: to clarify, since it is not an exactly identical voice, or even all that close, they can plausibly deny it, and we would never know what their intention was.

But in this case, they have clearly created the voice to represent Scarlett's voice to demonstrate the capabilities of their product in order to get marketing power.

replies(1): >>visarg+hw
13. __loam+i4[view] [source] [discussion] 2024-05-20 23:46:33
>>EasyMa+x3
Paper money from the model they're giving away for free?
replies(1): >>EasyMa+s6
14. __loam+y4[view] [source] [discussion] 2024-05-20 23:47:56
>>gedy+u1
Why do you have an issue with them taking someone's likeness to use in their product but not with them taking someone's work to use in their product?
replies(1): >>gedy+pa
15. nickle+C4[view] [source] [discussion] 2024-05-20 23:48:12
>>threat+k2
I think anti-deepfake legislation needs to consider fair use, especially when it comes to parody or other commentary on public figures. OpenAI's actions do not qualify as fair use.
replies(1): >>throww+Rb
16. smugma+N4[view] [source] [discussion] 2024-05-20 23:49:11
>>EasyMa+x3
She also won big against Disney. They backed down even though it appeared the contract was on their side. Iger apologized.

https://www.bbc.co.uk/news/business-58757748.amp

replies(1): >>mschus+pN
17. jacobo+05[view] [source] [discussion] 2024-05-20 23:50:57
>>threat+k2
One is a company with a nearly $100 billion valuation using someone's likeness for their own commercial purposes in a large-scale consumer product, which consumers would plausibly interpret as a paid endorsement, while the other seems to be an amateur hobbyist nobody has ever heard of making a parody demo as an art project, in a way that makes it clear that the original actors had nothing to do with it. The context seems pretty wildly different to me.

I'm guessing if any of the Harry Potter actors threatened the hobbyist with legal action the video would likely come down, though I doubt they would bother even if they didn't care for the video.

18. jprete+g5[view] [source] [discussion] 2024-05-20 23:52:02
>>threat+k2
Those are parodies and not meant at any point for you to believe the actual Harry Potter actors were involved.
19. EasyMa+s6[view] [source] [discussion] 2024-05-20 23:58:36
>>__loam+i4
I mean, if you don't think these kinds of positive announcements increase the value of the company or parent company, then I don't really know how to convince you, as it's a standard business principle.
replies(2): >>ml-ano+49 >>__loam+jr
20. avarun+B6[view] [source] [discussion] 2024-05-20 23:59:32
>>ncalla+L2
> They clearly thought it was close enough that they asked for permission, twice.

You seem to be misunderstanding the situation here. They wanted ScarJo to voice their voice assistant, and she refused twice. They also independently created a voice assistant which sounds very similar to her. That doesn't mean they thought they had to ask permission for the similar voice assistant.

replies(4): >>tomrod+5a >>chroma+La >>voltai+Oa >>ncalla+Rp
21. bottle+P7[view] [source] [discussion] 2024-05-21 00:08:39
>>threat+k2
There’s a big difference between a one off replica and person-as-a-service.
22. callal+v8[view] [source] 2024-05-21 00:12:28
>>crimso+(OP)
I think we should all be held to the standard of “Weird” Al Yankovic. In personal matters consent is important.
23. ml-ano+49[view] [source] [discussion] 2024-05-21 00:15:58
>>EasyMa+s6
There isn’t a positive announcement here, what is wrong with you?

This reads like "we got caught red-handed" and doing the bare minimum so it doesn't appear malicious and deliberate when the timeline is read out in court.

replies(1): >>EasyMa+2w3
24. tsimio+1a[view] [source] [discussion] 2024-05-21 00:22:38
>>threat+k2
That has a much better chance of falling under fair use (parody, non-commercial) if the actors ever tried to sue.

There is a major difference between parodying someone by imitating them while clearly and almost explicitly being an imitation; and deceptively imitating someone to suggest they are associated with your product in a serious manner.

25. tomrod+5a[view] [source] [discussion] 2024-05-21 00:23:31
>>avarun+B6
And... No. That is what OpenAI will assert, and good discovery by Scar Jo reps may prove or disprove.
26. gedy+pa[view] [source] [discussion] 2024-05-21 00:24:53
>>__loam+y4
Because this isn't training an audio model along with a million other voices to understand English, etc. It's clearly meant to sound exactly like that one celebrity.

I suspect a video avatar service that looked exactly like her would fail to qualify as fair use as well. Though an image gen that used some images of her (and many others) to train and spit out a generic "attractive blonde woman" is fair use in my opinion.

replies(2): >>numpad+Xk >>__loam+cr
27. ants_e+Aa[view] [source] [discussion] 2024-05-21 00:26:14
>>ncalla+L2
Yes, totally ethically bankrupt. But what bewilders me is that they yanked it as soon as they heard from their lawyers. I would have thought that if they made the decision to go ahead despite getting two "no"s, that they at least had a legal position they thought was defensible and worth defending.

But it kind of looks like they released it knowing they couldn't defend it in court which must seem pretty bonkers to investors.

replies(3): >>ethbr1+ef >>foobar+Um >>emsign+YC
28. Last5D+Da[view] [source] [discussion] 2024-05-21 00:26:25
>>dragon+c2
That's not the purpose though, clearly. If anything, you could make the argument that they're trading in on the association to the movie "Her", that's it. Neither Sky nor the new voice model sound particularly like ScarJo, unless you want to imply that her identity rights extend over 40% of all female voice types. People made the association because her voice was used in a movie that features a highly emotive voice assistant reminiscent of GPT-4o, which sama and others joked about.

I mean, why not actually compare the voices before forming an opinion?

https://www.youtube.com/watch?v=SamGnUqaOfU

https://www.youtube.com/watch?v=vgYi3Wr7v_g

-----

https://www.youtube.com/watch?v=iF9mrI9yoBU

https://www.youtube.com/watch?v=GV01B5kVsC0

replies(2): >>cowsup+Xg >>om2+5E
29. chroma+La[view] [source] [discussion] 2024-05-21 00:27:05
>>avarun+B6
So, what would they have done if she accepted? Claimed that the existing training of the Sky voice was voiced by her?
replies(4): >>famous+qd >>sangno+qk >>blacko+9v >>avarun+Hv3
30. voltai+Oa[view] [source] [discussion] 2024-05-21 00:27:20
>>avarun+B6
You seem to be misunderstanding the legalities at work here: reaching out to her multiple times beforehand, along with tweets intended to underline the similarity to her work on Her, demonstrates intention. If they didn’t think they needed permission, why ask for permission multiple times and then yank it when she noticed?

Answer: because they knew they needed permission, after working so hard to associate with Her, and they hoped, in traditional tech fashion, that if they moved fast and broke things enough, everyone would have to reshape around OpenAI's wants, rather than around the preexisting rights of the humans involved.

replies(3): >>KHRZ+Eg >>parine+hs >>munksb+aY1
31. throww+Rb[view] [source] [discussion] 2024-05-21 00:33:59
>>nickle+C4
The problem with that idea is that I can hide behind it while making videos of famous politicians doing really morally questionable things and distributing them on YouTube. The reason Fair Use works with regular parodies in my opinion is that everyone can tell that it is obviously fake. For example, Saturday Night Live routinely makes joking parody videos of elected officials doing things we think might be consistent with their character. And in those cases it's obvious that it's being portrayed by an actor and therefore a parody. If you use someone's likeness directly I think that it must never be fair use or we will quickly end up in a world where no video can be trusted.
replies(2): >>cjbgka+Ji >>nickle+f12
32. menset+mc[view] [source] [discussion] 2024-05-21 00:36:30
>>ncalla+L2
Effective altruism would posit that it is worth one voice theft to speed getting life-saving AI technology into the hands of everyone.
replies(2): >>ehnto+Ik >>ncalla+em
33. famous+qd[view] [source] [discussion] 2024-05-21 00:43:40
>>chroma+La
Voice cloning could be as simple as a few seconds of audio in the context window since GPT-4o is a speech to speech transformer. They wouldn't need to claim anything, just switch samples. They haven't launched the new voice mode yet, just demos.
34. ethbr1+ef[view] [source] [discussion] 2024-05-21 00:54:16
>>ants_e+Aa
> I would have thought that if they made the decision to go ahead despite getting two "no"s, that they at least had a legal position they thought was defensible and worth defending.

They likely have a legal position which is defensible.

They're much more worried that they don't have a PR position which is defensible.

What's the point of winning the (legal) battle if you lose the war (of public opinion)?

Given the rest of their product is built on apathy to copyright, they're actively being sued by creators, and the general public is uneasy about GenAI taking human jobs...

... this isn't a great moment for OpenAI to initiate a long legal battle, against a female movie actress / celebrity, in which they're arguing how her likeness isn't actually controlled by her.

Talk about optics!

(And I'd expect they quietly care much more about their continued ability to push creative output through their copyright launderer, than get into a battle over likeness)

replies(2): >>justin+er >>XorNot+Vy
35. KHRZ+Eg[view] [source] [discussion] 2024-05-21 01:05:12
>>voltai+Oa
You could also ask: If Scarlett has a legal case already, why does she want legislation passed?
replies(4): >>minima+fh >>ncalla+6r >>bradch+bv >>static+dd2
36. branda+Jg[view] [source] [discussion] 2024-05-21 01:06:12
>>bigfis+w1
Her statement claims the voice was taken down at her attorney's insistence.
37. cowsup+Xg[view] [source] [discussion] 2024-05-21 01:08:05
>>Last5D+Da
> People made the association because her voice was used in a movie that features a highly emotive voice assistant reminiscent of GPT-4o, which sama and others joked about.

Whether you think it sounds like her or not is a matter of opinion, I guess. I can see the resemblance, and I can also see the resemblance to Jennifer Lawrence and others.

What Johansson is alleging goes beyond this, though. She is alleging that Altman (or his team) reached out to her (or her team) asking her to lend her voice, she was not interested, and then she was asked again just two days before GPT-4o's announcement, and she declined again. Now there's a voice that, in her opinion, sounds a lot like her.

Luckily, the legal system is far more nuanced than just listening to a few voices and comparing them mentally to other voices individuals have heard over the years. They'll be able to figure out, as part of discovery, what led to the Sky voice sounding the way it does (intentionally using Johansson's likeness? coincidence? directly trained off her interviews/movies?), whether OpenAI were willing to slap Johansson's name onto the existing Sky during the presentation, whether the "her" tweet combined with the Sky voice was supposed to draw the subtle connection... This allegation is just the beginning.

replies(1): >>Last5D+Yj
38. minima+fh[view] [source] [discussion] 2024-05-21 01:09:50
>>KHRZ+Eg
To prevent it from happening again, with more legal authority than a legal precedent.
39. gibbit+ph[view] [source] [discussion] 2024-05-21 01:10:54
>>ncalla+L2
Are we surprised by this bankruptcy? As neat as AI is, it is only a thing because the corporate class sees it as a way to improve margins by replacing people with it. The whole concept is bankrupt.
replies(4): >>ecjhdn+Fj >>ncalla+ok >>emsign+DD >>Nasrud+mR1
40. cjbgka+Ji[view] [source] [discussion] 2024-05-21 01:23:38
>>throww+Rb
I'm guessing you're referring to people still thinking Sarah Palin said she could see Russia from her house; that was from an SNL skit and an amazing impression by Tina Fey. I agree, people have a hard time separating reality from obvious parody, so how could we expect them to make a distinction with intentional imitation? Society must draw a clear line that it is not ok to do this.
41. ecjhdn+Fj[view] [source] [discussion] 2024-05-21 01:31:55
>>gibbit+ph
100% this.

It’s shocking to me how people cannot see this.

The only surprise here is that they didn’t think she’d push back. That is what completes the multilayered cosmic and dramatic irony of this whole vignette. Honestly feels like Shakespeare or Arthur Miller might have written it.

42. Last5D+Yj[view] [source] [discussion] 2024-05-21 01:34:07
>>cowsup+Xg
I honestly don't think it is a matter of opinion, though. Her voice has a few very distinct characteristics, the most significant of which being the vocal fry / huskiness, that aren't present at all in either of the Sky models.

Asking for her vocal likeness is completely in line with just wanting the association with "Her" and the big PR hit that would come along with that. They developed voice models on two different occasions and hoped twice that Johansson would allow them to make that connection. Neither time did she accept, and neither time did they release a model that sounded like her. The two-day run-up isn't suspicious either, because we're talking about a general audio2audio transformer here. They could likely fine-tune it (if even that is necessary) on her voice in hours.

I don't think we're going to see this going to court. OpenAI simply has nothing to gain by fighting it. It would likely sour their relation to a bunch of media big-wigs and cause them bad press for years to come. Why bother when they can simply disable Sky until the new voice mode releases, allowing them to generate a million variations of highly-expressive female voices?

43. ncalla+ok[view] [source] [discussion] 2024-05-21 01:37:50
>>gibbit+ph
I don’t think any said anything about being surprised by it?
44. sangno+qk[view] [source] [discussion] 2024-05-21 01:38:20
>>chroma+La
> Claimed that the existing training of the Sky voice was voiced by her?

That claim could very well be true. The letter requested information on how the voice was trained - OpenAI may not want that can of worms opened lest other celebrities start paying closer attention to the other voices.

45. ehnto+Ik[view] [source] [discussion] 2024-05-21 01:40:38
>>menset+mc
It didn't require voice theft, they could have easily found a volunteer or paid for someone else.
46. numpad+Xk[view] [source] [discussion] 2024-05-21 01:42:56
>>gedy+pa
Chances are it is. It's basically the same as LoRA. One of the go-to tools for this literally uses a diffusion model and works on spectrograms as images.
47. ncalla+em[view] [source] [discussion] 2024-05-21 01:53:21
>>menset+mc
Effective Altruists are just shitty utilitarians that never take into account all the myriad ways that unmoderated utilitarianism has horrific failure modes.

Their hubris will walk them right into federal prison for fraud if they’re not careful.

If Effective Altruists want to speed the adoption of AI with the general public, they’d do well to avoid talking about it, lest the general public make a connection between EA and AI

I will say, when EA are talking about where they want to donate their money with the most efficacy, I have no problem with it. When they start talking about the utility of committing crimes or other moral wrongs because the ends justify the means, I tend to start assuming they’re bad at morality and ethics.

replies(5): >>parine+Lr >>comp_t+eB >>0xDEAF+uF >>Intral+Ey2 >>Intral+MA2
48. foobar+Um[view] [source] [discussion] 2024-05-21 01:59:26
>>ants_e+Aa
> But it kind of looks like they released it knowing they couldn't defend it in court which must seem pretty bonkers to investors.

That actually seems like there may be a few people involved and one of them is a cowboy PM who said fuck it, ship it to make the demo. And then damage control came in later. Possibly the PM didn't even know about the asks for permission?

replies(2): >>anytim+mx >>kubobl+BE
49. ncalla+Rp[view] [source] [discussion] 2024-05-21 02:32:02
>>avarun+B6
> You seem to be misunderstanding the situation here. They wanted ScarJo to voice their voice assistant, and she refused twice. They also independently created a voice assistant which sounds very similar to her.

And promoted it using a tweet naming the movie that Johansson performed in, for the role that prompted them to ask her in the first place.

You have to be almost deliberately naive to not see that they were attempting to use her vocal likeness in this situation. There's a reason they immediately walked it back after the situation was revealed.

Neither a judge, nor a jury, would be so willingly naive.

replies(1): >>munksb+EY1
50. ncalla+6r[view] [source] [discussion] 2024-05-21 02:46:44
>>KHRZ+Eg
Because under the current justice system and legislative framework, it would probably cost hundreds of thousands to millions of dollars to bring a case that requires discovery and a trial.

Maybe (maybe!) it’s worth it for someone like Johansson to take on the cost of that to vindicate her rights—but it’s certainly not the case for most people.

If your rights can only be defended from massive corporations by bringing lawsuits that cost hundreds of thousands to millions of dollars, then only the wealthy will have those rights.

So maybe she wants new legislative frameworks around these kind of issues to allow people to realistically enforce these rights that nominally exist.

For an example of updating a legislative framework to allow more easily vindicating existing rights, look up "anti-SLAPP legislation", which many states have passed to make it easier for a defendant of a meritless lawsuit seeking to chill speech to have the lawsuit dismissed. Anti-SLAPP legislation does almost nothing to change the actual rights that a defendant has to speak, but it makes it much more practical for a defendant to actually exercise those rights.

So, the assumption that a call for updated legislation implies that no legal protection currently exists is just a bad assumption that does not apply in this situation.

51. __loam+cr[view] [source] [discussion] 2024-05-21 02:47:21
>>gedy+pa
Okay so as long as we steal enough stuff then it's legal.
52. justin+er[view] [source] [discussion] 2024-05-21 02:47:34
>>ethbr1+ef
> They likely have a legal position which is defensible.

Doesn't sound like they have that either.

replies(1): >>bonton+0D2
53. __loam+jr[view] [source] [discussion] 2024-05-21 02:48:01
>>EasyMa+s6
I believe there's a difference between building a sustainable and profitable business and pumping the stock.
54. parine+Lr[view] [source] [discussion] 2024-05-21 02:52:24
>>ncalla+em
This is like attributing the crimes of a few fundamentalists to an entire religion.
replies(2): >>ncalla+ts >>ocodo+Jw
55. parine+hs[view] [source] [discussion] 2024-05-21 02:57:17
>>voltai+Oa
> If they didn’t think they needed permission, why ask for permission multiple times and then yank it when she noticed?

Many things that are legal are of questionable ethics. Asking permission could easily just be an effort for them to get better samples of her voice. Pulling the voice after debuting it is 100% a PR response. If there's a law that was broken, pulling the voice doesn't unbreak it.

replies(1): >>voltai+pu2
56. ncalla+ts[view] [source] [discussion] 2024-05-21 02:59:14
>>parine+Lr
I don’t think so. I’ve narrowed my comments specifically to Effective Altruists who are making utilitarian trade-offs to justify known moral wrongs.

> I will say, when EA are talking about where they want to donate their money with the most efficacy, I have no problem with it. When they start talking about the utility of committing crimes or other moral wrongs because the ends justify the means, I tend to start assuming they’re bad at morality and ethics.

Frankly, if you’re going to make an “ends justify the means” moral argument, you need to do a lot of work to address how those arguments have gone horrifically wrong in the past, and why the moral framework you’re using isn’t susceptible to those issues. I haven’t seen much of that from Effective Altruists.

I was responding to someone who was specifically saying an EA might argue why it’s acceptable to commit a moral wrong, because the ends justify it.

So, again, if someone is using EA to decide how to direct their charitable donations, volunteer their time, or otherwise decide between moral goods, I have no problem with it. That specifically wasn't the context I was responding to.

replies(1): >>parine+xT1
57. blacko+9v[view] [source] [discussion] 2024-05-21 03:24:08
>>chroma+La
Maybe they have a second model trained on her voice.
58. bradch+bv[view] [source] [discussion] 2024-05-21 03:24:14
>>KHRZ+Eg
She has a personal net worth of >$100m. She’s also married to a successful actor in his own right.

Her voice alone didn’t get her there — she did. That’s why celebrities are so protective about how their likeness is used: their personal brand is their asset.

There’s established legal precedent on exactly this—even in the case they didn’t train on her likeness, if it can reasonably be suspected by an unknowing observer that she personally has lent her voice to this, she has a strong case. Even OpenAI knew this, or they would not have asked in the first place.

59. visarg+hw[view] [source] [discussion] 2024-05-21 03:37:27
>>nicce+44
> since it is not exactly identical voice, or even not that close, they can plausibly deny it

When studios approach an actress A and she refuses, then another actress B takes the role, is that infringing on A's rights? Or should they just scrap the movie?

Maybe if they replicated a scene from A's movies, or there was a striking likeness between the voices... but not generally.

replies(1): >>nicce+FO
60. ocodo+Jw[view] [source] [discussion] 2024-05-21 03:42:16
>>parine+Lr
Effective Altruists are the fundamentalists though. So no, it's not.
61. anytim+mx[view] [source] [discussion] 2024-05-21 03:47:31
>>foobar+Um
The whole company behaves like rogue cowboys.

If a PM there didn’t say “fuck it ship it even without her permission” they’d probably be replaced with someone who would.

I expect the cost of any potential legal action/settlement was happily accepted in order to put on an impressive announcement.

62. visarg+Fx[view] [source] 2024-05-21 03:49:18
>>crimso+(OP)
> Are we seriously saying we can't make voices that are similar to celebrities, when not using their actual voice?

I think the copyright industry wants to grab new powers to counter the infinite capacity of AI to create variations. But that move would kneecap the creative industry first; newcomers have no place in a fully copyrighted space.

It reminds me of how NIMBYs block construction to keep prices up. Will the whole copyright space come to operate on NIMBY logic?

63. XorNot+Vy[view] [source] [discussion] 2024-05-21 04:03:59
>>ethbr1+ef
How is the PR position not defensible? One of the worst things you can generally do is admit fault, particularly if you have a complete defense.

Buckle in, go to court, and double-down on the fact that the public's opinion of actors is pretty damn fickle at the best of times - particularly if what you released was in fact based on someone you signed a valid contract with who just sounds similar.

Of course, this is all dependent on actually having a complete defense - you absolutely would not want to find Scarlett Johansson voice samples in file folders associated with the Sky model if it went to court.

replies(1): >>ethbr1+lB
64. comp_t+eB[view] [source] [discussion] 2024-05-21 04:28:33
>>ncalla+em
> When they start talking about the utility of committing crimes or other moral wrongs because the ends justify the means, I tend to start assuming they’re bad at morality and ethics.

Extremely reasonable position, and I'm glad that every time some idiot brings it up in the EA forum comments section they get overwhelmingly downvoted, because most EAs aren't idiots in that particular way.

I have no idea what the rest of your comment is talking about; EAs that have opinions about AI largely think that we should be slowing it down rather than speeding it up.

replies(2): >>ncalla+pC >>emsign+gE
65. ethbr1+lB[view] [source] [discussion] 2024-05-21 04:29:45
>>XorNot+Vy
In what world does a majority of the public cheer for OpenAI "stealing"* an actress's voice?

People who hate Hollywood? Most of that crowd hates tech even more.

* Because it would take the first news cycle to be branded as that

replies(1): >>XorNot+cN
66. ncalla+pC[view] [source] [discussion] 2024-05-21 04:39:30
>>comp_t+eB
In some sense I see a direct line between the EA argument being presented here, and the SBF consequentialist argument where he talks about being willing to flip a coin if it had a 50% chance to destroy the world and a 50% chance to make the world more than twice as good.

I did try to cabin my arguments to Effective Altruists that are making ends-justify-the-means arguments. I really don't have a problem with people that are attempting to use EA to decide between multiple good outcomes.

I'm definitely not engaged enough with the Effective Altruists to know where the plurality of thought lies, so I was trying to respond in the context of this argument being put forward on behalf of Effective Altruists.

The only part I’d say applies to all EA, is the brand taint that SBF has done in the public perception.

67. emsign+YC[view] [source] [discussion] 2024-05-21 04:45:29
>>ants_e+Aa
It looks really unprofessional at minimum if not a bit arrogant, which is actually more concerning as it hints at a deeper disrespect for artists and celebrities.
68. emsign+DD[view] [source] [discussion] 2024-05-21 04:51:51
>>gibbit+ph
Problem is they really believe we either can't tell the difference between a human and an AI model eventually, or they think we don't care. Don't they understand the meaning of art?
69. om2+5E[view] [source] [discussion] 2024-05-21 04:56:01
>>Last5D+Da
I hadn't heard the GPT-4o voice before. Comparing the video to the video of Johansson's voice in "her", it sounds pretty similar. Johansson's performance there sounds pretty different from her normal speaking voice in the interview - more intentional emotional inflection, bubbliness, generally higher pitch. The GPT-4o voice sounds a lot like it.

From elsewhere in the thread, likeness rights apparently do extend to intentionally using lookalikes / soundalikes to create the appearance of endorsement or association.

70. emsign+gE[view] [source] [discussion] 2024-05-21 04:57:41
>>comp_t+eB
The speed doesn't really matter if their end goal is morally wrong. A slower speed might give them an advantage to not overshoot and get backlash or it gives artists and the public more time to fight back against EA, but it doesn't hide their ill intentions.
71. kubobl+BE[view] [source] [discussion] 2024-05-21 05:03:01
>>foobar+Um
> a cowboy PM who said fuck it, ship it to make the demo.

Given the timeline it sounds like the PM was told "just go ahead with it, I'll get the permission".

replies(1): >>unrave+8X1
72. 0xDEAF+uF[view] [source] [discussion] 2024-05-21 05:11:19
>>ncalla+em
>Effective Altruists are just shitty utilitarians that never take into account all the myriad ways that unmoderated utilitarianism has horrific failure modes.

There's a fair amount of EA discussion of utilitarianism's problems. Here's EA founder Toby Ord on utilitarianism and why he ultimately doesn't endorse it:

https://forum.effectivealtruism.org/posts/YrXZ3pRvFuH8SJaay/...

>If Effective Altruists want to speed the adoption of AI with the general public, they’d do well to avoid talking about it, lest the general public make a connection between EA and AI

Very few in the EA community want to speed AI adoption. It's far more common to think that current AI companies are being reckless, and we need some sort of AI pause so we can do more research and ensure that AI systems are reliably beneficial.

>When they start talking about the utility of committing crimes or other moral wrongs because the ends justify the means, I tend to start assuming they’re bad at morality and ethics.

The all-time most upvoted post on the EA Forum condemns SBF: https://forum.effectivealtruism.org/allPosts?sortedBy=top&ti...

replies(1): >>ncalla+lI
◧◩
73. citize+fI[view] [source] [discussion] 2024-05-21 05:39:09
>>gedy+u1
An actress who specifically played the voice of an AI in a movie about AI, no less.
◧◩◪◨⬒
74. ncalla+lI[view] [source] [discussion] 2024-05-21 05:40:40
>>0xDEAF+uF
I’ve had to explain myself a few times on this, so clearly I communicated badly.

I probably should have said _those_ Effective Altruists are shitty utilitarians. I was attempting—and since I’ve had to clarify a few times clearly failed—to take aim at the effective altruists that would make the utilitarian trade off that the commenter mentioned.

In fact, there’s a paragraph from the Toby Ord blog post that I wholeheartedly endorse and I think rebuts the exact claim that was put forward that I was responding to.

> Don’t act without integrity. When something immensely important is at stake and others are dragging their feet, people feel licensed to do whatever it takes to succeed. We must never give in to such temptation. A single person acting without integrity could stain the whole cause and damage everything we hope to achieve.

So, my words were too broad. I don’t actually mean all effective altruists are shitty utilitarians. But the ones that would make the arguments I was responding to are.

I think Ord is a really smart guy, and has worked hard to put some awesome ideas out into the world. I think many others (and again, certainly not all) have interpreted and run with it as a framework for shitty utilitarianism.

◧◩◪◨⬒⬓
75. XorNot+cN[view] [source] [discussion] 2024-05-21 06:33:44
>>ethbr1+lB
It is wild to me that on HackerNews of all places, you'd think people don't love an underdog story.

Which is what this would be in the not-stupid version of events: they hired a voice actress for the rights to create the voice, she was paid, and then is basically told by the courts "actually you're unhireable because you sound too much like an already rich and famous person".

The issue, of course, is that OpenAI's reactions so far don't suggest they're actually confident they can prove this is the case. Because if it is, they're going about handling it in the dumbest possible way.

replies(3): >>ml-ano+hQ >>Sebb76+k91 >>jubalf+ji1
◧◩◪
76. mschus+pN[view] [source] [discussion] 2024-05-21 06:35:52
>>smugma+N4
Probably (and rightfully) feared that, had Disney stuck with their position, other MCU actors would be much, much harsher in new contract negotiations - or that some would go as far and say "nope, I quit".
◧◩◪◨
77. nicce+FO[view] [source] [discussion] 2024-05-21 06:49:35
>>visarg+hw
> When studios approach an actress A and she refuses, then another actress B takes the role, is that infringing on A's rights? Or should they just scrap the movie?

The scenario would have been that they approach none.

replies(1): >>r2_pil+Y53
◧◩◪◨⬒⬓⬔
78. ml-ano+hQ[view] [source] [discussion] 2024-05-21 07:02:05
>>XorNot+cN
It’s wild to me that there are people who think OpenAI is the underdog. An $80bn Microsoft vassal, what a plucky upstart.

You realise that multiple employees, including the CEO, publicly drew direct comparisons to the movie Her after having tried and failed twice to hire the actress who starred in it? There is no non-idiotic reading of this.

replies(1): >>XorNot+EU
◧◩◪◨⬒⬓⬔⧯
79. XorNot+EU[view] [source] [discussion] 2024-05-21 07:57:23
>>ml-ano+hQ
You're reading my statements as defending OpenAI. Put on your "I'm the PR department hat" and figure out what you'd do if you were OpenAI given various permutations of the possible facts here.

That's what I'm discussing.

Edit: which is to say, I think Sam Altman may have been a god damn idiot about this, but it's also wild anyone thought that ScarJo or anyone in Hollywood would agree - AI is currently the hot button issue there and you'd find yourself the much more local target of their ire.

replies(1): >>ml-ano+H51
◧◩◪◨⬒⬓⬔⧯▣
80. ml-ano+H51[view] [source] [discussion] 2024-05-21 09:22:23
>>XorNot+EU
Then why bother mentioning an "underdog story" at all?

Who is the underdog in this situation? In your comment it seems like you're framing OpenAI as the underdog (or perceived underdog) which is just bonkers.

Hacker News isn't a hivemind and there are those of us who work in GenAI who are firmly on the side of the creatives and gasp even rights holders.

◧◩◪◨⬒⬓⬔
81. Sebb76+k91[view] [source] [discussion] 2024-05-21 09:48:35
>>XorNot+cN
> they hired a voice actress for the rights to create the voice, she was paid, and then is basically told by the courts "actually you're unhireable because you sound too much like an already rich and famous person".

There are quite a few issues here: First, this assumes they actually hired a sound-alike, which is not confirmed. Second, they are not an underdog (the voice actress might be, but she's most likely pretty unaffected by this drama). Finally, they were clearly aiming to impersonate ScarJo (as confirmed by them asking her for permission and by sama's tweet), so this is quite a different issue from "accidentally" hiring someone who "just happens to" sound like ScarJo.

◧◩◪◨⬒⬓⬔
82. jubalf+ji1[view] [source] [discussion] 2024-05-21 11:07:59
>>XorNot+cN
an obnoxious, sleazy millionaire backed by Microsoft is by no means “an underdog”
◧◩◪
83. Nasrud+mR1[view] [source] [discussion] 2024-05-21 14:30:58
>>gibbit+ph
That is wrong on several levels. First off, it ignores the role of the massive number of researchers. Should we start de-automating processes to employ more people at the cost of worsening margins?
◧◩◪◨⬒⬓
84. parine+xT1[view] [source] [discussion] 2024-05-21 14:40:02
>>ncalla+ts
> I don’t think so. I’ve narrowed my comments specifically to Effective Altruists who are making utilitarian trade-offs to justify known moral wrongs.

Did you?

> Effective Altruists are just shitty utilitarians that never take into account all the myriad ways that unmoderated utilitarianism has horrific failure modes.

replies(1): >>ncalla+tp2
◧◩◪◨⬒
85. unrave+8X1[view] [source] [discussion] 2024-05-21 14:58:43
>>kubobl+BE
Ken Segall has a similar Steve Jobs story, he emails Jobs that the Apple legal team have just thrown a spanner in the works days before Ken's agency is set to launch Apple's big ad campaign and what should he do?

Jobs responds minutes later... "Fuck the lawyers."

◧◩◪◨
86. munksb+aY1[view] [source] [discussion] 2024-05-21 15:03:40
>>voltai+Oa
> If they didn’t think they needed permission, why ask for permission multiple times and then yank it when she noticed?

One very easy explanation is that they trained Sky using another voice (this is the claim, and there's no reason to doubt it's true), wanting to replicate the style of the voice in "Her", but would have preferred to use SJ's real voice for the PR impact that could have.

Yanking it could also easily be a pre-emptive response to avoid further PR drama.

You will obviously decide you don't believe those explanations, but to many of us they're quite plausible; in fact I'd even suggest likely.

(And none of this precludes Sam Altman and OpenAI being dodgy anyway)

replies(1): >>voltai+Nt2
◧◩◪◨
87. munksb+EY1[view] [source] [discussion] 2024-05-21 15:05:57
>>ncalla+Rp
This is a genuine question. If it turns out they trained Sky on someone else's voice to similarly replicate the style of the voice in "Her", would you be ok with that? If it were proven that the voice was just similar to SJ's, would that be ok?

My view is, of course it is ok. SJ doesn't own the right to a particular style of voice.

◧◩◪◨⬒
88. nickle+f12[view] [source] [discussion] 2024-05-21 15:19:05
>>throww+Rb
Let me start by saying I despise generative AI and I think most AI companies are basically crooked.

I thought about your comment for a while, and I agree that there is a fine line between "realistic parody" and "intentional deception" that makes deepfake AI almost impossible to defend. In particular I agree with your distinction:

- In matters involving human actors, human-created animations, etc, there should be great deference to the human impersonators, particularly when it involves notable public figures. One major difference is that, since it's virtually impossible for humans to precisely impersonate or draw one another, there is an element of caricature and artistic choice with highly "realistic" impersonations.

- AI should be held to a higher standard because it involves almost no human expression, and it can easily create mathematically-perfect impersonations which are engineered to fool people. The point of my comment is that fair use is a thin sliver of what you can do with the tech, but it shouldn't be stamped out entirely.

I am really thinking of, say, the Joe Rogan / Donald Trump comedic deepfakes. It might be fine under American constitutional law to say that those things must be made so that AI Rogan / AI Trump always refer to each other in those ways, to make it very clear to listeners. It is a distinctly non-libertarian solution, but it could be "necessary and proper" because of the threat to our social and political knowledge. But as a general principle, those comedic deepfakes are works of human political expression, aided by a fairly simple computer program that any CS graduate can understand, assuming they earned their degree honestly and are willing to do some math. It is constitutionally icky (legal term) to go after those people too harshly.

replies(1): >>throww+gb2
◧◩◪◨⬒⬓
89. throww+gb2[view] [source] [discussion] 2024-05-21 16:01:12
>>nickle+f12
I think that as long as a clear "mark of parody" is maintained such that a reasonable person could distinguish between the two, AI parodies are probably fine. The murkiness in my mind is expressly in the situation in the first episode of Black Mirror, where nobody could distinguish between the AI-generated video and the prime minister actually performing the act. Clearly that is not a parody, even if some people might find the situation humorous. But if we're not careful we give people making fake videos cover to hide behind fair use for parody.

I think you and I have the same concerns about balancing damage to the societal fabric against protecting honest speech.

◧◩◪◨⬒
90. static+dd2[view] [source] [discussion] 2024-05-21 16:09:18
>>KHRZ+Eg
Before Roe v. Wade was overturned, you might have asked: if abortion is legal, why do abortion-rights advocates want legislation passed?

The answer is without legislation you are far more subject to whether a judge feels like changing the law.

◧◩◪◨⬒⬓⬔
91. ncalla+tp2[view] [source] [discussion] 2024-05-21 17:09:07
>>parine+xT1
Sure, I should’ve said I tried to or I intended to:

You can see another comment here, where I acknowledge I communicated badly, since I’ve had to clarify multiple times what I was intending: >>40424566

This is the paragraph that was intended to narrow what I was talking about:

> I will say, when EA are talking about where they want to donate their money with the most efficacy, I have no problem with it. When they start talking about the utility of committing crimes or other moral wrongs because the ends justify the means, I tend to start assuming they’re bad at morality and ethics.

That said, I definitely should’ve said “those Effective Altruists” in the first paragraph to more clearly communicate my intent.

◧◩◪◨⬒
92. voltai+Nt2[view] [source] [discussion] 2024-05-21 17:31:52
>>munksb+aY1
I actually believe that’s quite plausible. The trouble is, by requesting permission in the first place, they demonstrated intent, which is legally significant. I think a lot of your confusion comes from applying pure logic to a legal issue. They are not the same thing, and the latter relies heavily on existing precedent, of which, it seems, you may be unaware.
◧◩◪◨⬒
93. voltai+pu2[view] [source] [discussion] 2024-05-21 17:36:05
>>parine+hs
They are trying to wriggle out of providing any insight into how that voice was derived at all (like Google with the 100% of damages check). It would really suck for OpenAI if, for example, Altman had at some point emailed his team to ensure the soundalike was “as indistinguishable from Scarlett’s performance in HER as possible.“

Public figures own their likeness and control its use. Not to mention that in this case OA is playing chicken with studios as well. Not a great time to do so, given their stated hopes of supplanting 99% of existing Hollywood creatives.

◧◩◪◨
94. Intral+Ey2[view] [source] [discussion] 2024-05-21 18:00:06
>>ncalla+em
Plus, describing this as "speed the rate of life saving ai technology in the hands of everyone" is… A Reach.
◧◩◪◨
95. Intral+MA2[view] [source] [discussion] 2024-05-21 18:11:44
>>ncalla+em
The central contention of Effective Altruism, at least in practice if not in principle, seems to be that the value of thinking, feeling persons can be and should be reduced to numbers and objects that you can do calculations on.

Maybe there's a way to do that right. I suppose like any other philosophy, it ends up reflecting the personalities and intentions of the individuals which are attracted to and end up adopting it. Are they actually motivated by identifying with and wanting to help other people most effectively? Or are they just incentivized to try to get rid of pesky deontological and virtue-based constraints like empathy and universal rights?

◧◩◪◨⬒
96. bonton+0D2[view] [source] [discussion] 2024-05-21 18:21:29
>>justin+er
Copilot still tells me I've committed a content policy violation if I ask it to generate an image "in Tim Burton's style". Tim Burton has been openly critical of generative AI.
◧◩◪◨⬒
97. r2_pil+Y53[view] [source] [discussion] 2024-05-21 20:50:17
>>nicce+FO
Your scenario leads to the disenfranchisement of lesser-known voice talent, since they cannot take on work that Top Talent has rejected if they happen to resemble them. (Nobody here has ever seen their doppelganger? How unique is a person? One in five million? Even at that rate there are dozens like you in the United States alone.) Perhaps it's time to re-evaluate some of the rules in society. It's a healthy thing to do periodically.
◧◩◪◨
98. avarun+Hv3[view] [source] [discussion] 2024-05-21 23:14:02
>>chroma+La
They would create a voice in partnership with her? Not sure why you’re assuming it’s some insanely laborious process. It would obviously have been incredible marketing for them to actually get her on board.
◧◩◪◨⬒
99. EasyMa+2w3[view] [source] [discussion] 2024-05-21 23:15:57
>>ml-ano+49
They announced a whole new model; that's a positive announcement.
[go to top]