zlacker

[parent] [thread] 281 comments
1. nickth+(OP)[view] [source] 2024-05-20 23:10:38
This statement from Scarlett really changed my perspective. I used and loved the Sky voice, and I did feel it sounded a little like her, but moreover it was the best of their voice offerings. I was mad when they removed it. But now I’m mad it was ever there to begin with. This timeline makes it clear that this wasn’t a coincidence and maybe not even a hiring of an impressionist (which is where things get a little more wishy-washy for me).
replies(11): >>ekam+a1 >>crimso+H1 >>kapild+s2 >>andrew+V6 >>wkat42+nf >>sneak+gj >>thaton+Mm >>windex+zs >>barbar+DI >>al_bor+RK >>belter+Iy1
2. ekam+a1[view] [source] 2024-05-20 23:16:03
>>nickth+(OP)
Same here and that voice really was the only good one. I don't know why they don't bring the voices from their API over, which are all much better, like Nova or Shimmer (https://platform.openai.com/docs/guides/text-to-speech)
replies(1): >>sanxiy+G1
◧◩
3. sanxiy+G1[view] [source] [discussion] 2024-05-20 23:18:54
>>ekam+a1
I think because it is not text-to-speech. It probably isn't simple to transfer.
replies(1): >>zwily+jw1
4. crimso+H1[view] [source] 2024-05-20 23:19:08
>>nickth+(OP)
But it's clearly not her voice, right? The version that's been on the app for a year just isn't. Like, it's clearly intended to be slightly reminiscent of her, but it's also very clearly not. Are we seriously saying we can't make voices that are similar to celebrities, when not using their actual voice?
replies(9): >>gedy+b3 >>bigfis+d3 >>bobthe+e3 >>emmp+m3 >>dragon+T3 >>ncalla+s4 >>EasyMa+e5 >>callal+ca >>visarg+mz
5. kapild+s2[view] [source] 2024-05-20 23:24:29
>>nickth+(OP)
I can still access the sky voice even though it is supposed to be "yanked".
replies(1): >>thorum+93
◧◩
6. thorum+93[view] [source] [discussion] 2024-05-20 23:29:23
>>kapild+s2
There’s still a Sky option but the actual voice has been changed.
◧◩
7. gedy+b3[view] [source] [discussion] 2024-05-20 23:29:45
>>crimso+H1
Normally I'd agree if this were some vague "artist style", but this was clearly an attempt to duplicate a living person, a media celebrity no less.
replies(3): >>threat+14 >>__loam+f6 >>citize+WJ
◧◩
8. bigfis+d3[view] [source] [discussion] 2024-05-20 23:30:01
>>crimso+H1
It could be trained on Scarlett's voice, though; there are plenty of recorded samples for OpenAI to use. It's pretty damning for them to take down the voice right away like that.
replies(1): >>branda+qi
◧◩
9. bobthe+e3[view] [source] [discussion] 2024-05-20 23:30:07
>>crimso+H1
This is correct. In fact, the FCC has already clarified this for the case of robocalls. https://www.fcc.gov/document/fcc-makes-ai-generated-voices-r...
◧◩
10. emmp+m3[view] [source] [discussion] 2024-05-20 23:30:38
>>crimso+H1
We can seriously say that, yes. The courts have been saying this in the US for over 30 years. See Midler v. Ford Motor Co.
replies(1): >>Avshal+h4
◧◩
11. dragon+T3[view] [source] [discussion] 2024-05-20 23:34:18
>>crimso+H1
If the purpose is to trade on the celebrity voice and perceived association, and it's subject to California right of personality law, then yes, we're saying that that has been established law for decades.
replies(1): >>Last5D+kc
◧◩◪
12. threat+14[view] [source] [discussion] 2024-05-20 23:35:05
>>gedy+b3
Is this different from the various videos of the Harry Potter actors doing comedic high fashion ads? Because those were very well received.

https://www.youtube.com/watch?v=ipuqLy87-3A

replies(6): >>BadHum+I5 >>nickle+j6 >>jacobo+H6 >>jprete+X6 >>bottle+w9 >>tsimio+Ib
◧◩◪
13. Avshal+h4[view] [source] [discussion] 2024-05-20 23:36:55
>>emmp+m3
Tom Waits won a lawsuit against Doritos too.
◧◩
14. ncalla+s4[view] [source] [discussion] 2024-05-20 23:38:23
>>crimso+H1
> Are we seriously saying we can't make voices that are similar to celebrities, when not using their actual voice?

They clearly thought it was close enough that they asked for permission, twice. And got two no’s. Going forward with it at that point was super fucked up.

It’s very bad to not ask permission when you should. It’s far worse to ask for permission and then ignore the response.

Totally ethically bankrupt.

replies(5): >>nicce+L5 >>avarun+i8 >>ants_e+hc >>menset+3e >>gibbit+6j
◧◩
15. EasyMa+e5[view] [source] [discussion] 2024-05-20 23:43:14
>>crimso+H1
Sure, they could have taken her to court, but right now they don't want the bad publicity, especially since it would put everything else in the shadow of such a scandalous "story". Better to just back off, let S.J. win, move on, and start planning how they're gonna spend all that paper money they got with the announcement of a new, more advanced model. It's a financial decision and a fairly predictable one. I'm glad she won this time.
replies(2): >>__loam+Z5 >>smugma+u6
◧◩◪◨
16. BadHum+I5[view] [source] [discussion] 2024-05-20 23:45:33
>>threat+14
Is a billion dollar AI company utilizing someone's voice against their will in a flagship product after they said no twice different from a random Youtube channel making comedy videos?

I think so but that could just be me.

◧◩◪
17. nicce+L5[view] [source] [discussion] 2024-05-20 23:45:43
>>ncalla+s4
And they could have totally gotten away with it by never mentioning the name of Scarlett. But of course, that is not what they wanted.

Edit: to clarify, since it is not exactly identical voice, or even not that close, they can plausibly deny it, and we never knew what their intention was.

But in this case, they have clearly created the voice to represent Scarlett's voice to demonstrate the capabilities of their product in order to get marketing power.

replies(1): >>visarg+Yx
◧◩◪
18. __loam+Z5[view] [source] [discussion] 2024-05-20 23:46:33
>>EasyMa+e5
Paper money from the model they're giving away for free?
replies(1): >>EasyMa+98
◧◩◪
19. __loam+f6[view] [source] [discussion] 2024-05-20 23:47:56
>>gedy+b3
Why do you have an issue with them taking someone's likeness to use in their product but not with them taking someone's work to use in their product?
replies(1): >>gedy+6c
◧◩◪◨
20. nickle+j6[view] [source] [discussion] 2024-05-20 23:48:12
>>threat+14
I think anti-deepfake legislation needs to consider fair use, especially when it comes to parody or other commentary on public figures. OpenAI's actions do not qualify as fair use.
replies(1): >>throww+yd
◧◩◪
21. smugma+u6[view] [source] [discussion] 2024-05-20 23:49:11
>>EasyMa+e5
She also won big against Disney. They backed down even though it appeared the contract was on their side. Iger apologized.

https://www.bbc.co.uk/news/business-58757748.amp

replies(1): >>mschus+6P
◧◩◪◨
22. jacobo+H6[view] [source] [discussion] 2024-05-20 23:50:57
>>threat+14
One is a company with a nearly $100 billion valuation using someone's likeness for their own commercial purposes in a large-scale consumer product, which consumers would plausibly interpret as a paid endorsement, while the other seems to be an amateur hobbyist nobody has ever heard of making a parody demo as an art project, in a way that makes it clear that the original actors had nothing to do with it. The context seems pretty wildly different to me.

I'm guessing if any of the Harry Potter actors threatened the hobbyist with legal action the video would likely come down, though I doubt they would bother even if they didn't care for the video.

23. andrew+V6[view] [source] 2024-05-20 23:51:59
>>nickth+(OP)
I thought it sounded like Jodie Foster.
replies(1): >>ncr100+C9
◧◩◪◨
24. jprete+X6[view] [source] [discussion] 2024-05-20 23:52:02
>>threat+14
Those are parodies and not meant at any point for you to believe the actual Harry Potter actors were involved.
◧◩◪◨
25. EasyMa+98[view] [source] [discussion] 2024-05-20 23:58:36
>>__loam+Z5
I mean, if you don't think these kinds of positive announcements increase the value of the company or parent company, then I don't really know how to convince you, as it's a standard business principle.
replies(2): >>ml-ano+La >>__loam+0t
◧◩◪
26. avarun+i8[view] [source] [discussion] 2024-05-20 23:59:32
>>ncalla+s4
> They clearly thought it was close enough that they asked for permission, twice.

You seem to be misunderstanding the situation here. They wanted ScarJo to voice their voice assistant, and she refused twice. They also independently created a voice assistant which sounds very similar to her. That doesn't mean they thought they had to ask permission for the similar voice assistant.

replies(4): >>tomrod+Mb >>chroma+sc >>voltai+vc >>ncalla+yr
◧◩◪◨
27. bottle+w9[view] [source] [discussion] 2024-05-21 00:08:39
>>threat+14
There’s a big difference between a one off replica and person-as-a-service.
◧◩
28. ncr100+C9[view] [source] [discussion] 2024-05-21 00:09:19
>>andrew+V6
Scar Jo thought it sounded like herself, and so did people who knew her personally.

That is what matters. OWNERSHIP over her contributions to the world.

replies(5): >>mcphag+zb >>smt88+di >>minima+ti >>dyno12+dp >>parine+as
◧◩
29. callal+ca[view] [source] [discussion] 2024-05-21 00:12:28
>>crimso+H1
I think we should all be held to the standard of “Weird” Al Yankovic. In personal matters consent is important.
◧◩◪◨⬒
30. ml-ano+La[view] [source] [discussion] 2024-05-21 00:15:58
>>EasyMa+98
There isn't a positive announcement here; what is wrong with you?

This reads like "we got caught red-handed", with them doing the bare minimum for it not to appear malicious and deliberate when the timeline is read out in court.

replies(1): >>EasyMa+Jx3
◧◩◪
31. mcphag+zb[view] [source] [discussion] 2024-05-21 00:21:39
>>ncr100+C9
Clearly Sam Altman thought it sounded like ScarJo as well :-(
◧◩◪◨
32. tsimio+Ib[view] [source] [discussion] 2024-05-21 00:22:38
>>threat+14
That has a much better chance of falling under fair use (parody, non-commercial) if the actors ever tried to sue.

There is a major difference between parodying someone by imitating them while clearly and almost explicitly being an imitation; and deceptively imitating someone to suggest they are associated with your product in a serious manner.

◧◩◪◨
33. tomrod+Mb[view] [source] [discussion] 2024-05-21 00:23:31
>>avarun+i8
And... No. That is what OpenAI will assert, and good discovery by Scar Jo reps may prove or disprove.
◧◩◪◨
34. gedy+6c[view] [source] [discussion] 2024-05-21 00:24:53
>>__loam+f6
Because this isn't training an audio model along with a million other voices to understand English, etc. It's clearly meant to sound exactly like that one celebrity.

I suspect a video avatar service that looked exactly like her would fall afoul of fair use as well. Though an image gen that used some images of her (and many others) to train and spit out generic "attractive blonde woman" is fair use in my opinion.

replies(2): >>numpad+Em >>__loam+Ts
◧◩◪
35. ants_e+hc[view] [source] [discussion] 2024-05-21 00:26:14
>>ncalla+s4
Yes, totally ethically bankrupt. But what bewilders me is that they yanked it as soon as they heard from their lawyers. I would have thought that if they made the decision to go ahead despite getting two "no"s, that they at least had a legal position they thought was defensible and worth defending.

But it kind of looks like they released it knowing they couldn't defend it in court which must seem pretty bonkers to investors.

replies(3): >>ethbr1+Vg >>foobar+Bo >>emsign+FE
◧◩◪
36. Last5D+kc[view] [source] [discussion] 2024-05-21 00:26:25
>>dragon+T3
That's not the purpose though, clearly. If anything, you could make the argument that they're trading in on the association to the movie "Her", that's it. Neither Sky nor the new voice model sound particularly like ScarJo, unless you want to imply that her identity rights extend over 40% of all female voice types. People made the association because her voice was used in a movie that features a highly emotive voice assistant reminiscent of GPT-4o, which sama and others joked about.

I mean, why not actually compare the voices before forming an opinion?

https://www.youtube.com/watch?v=SamGnUqaOfU

https://www.youtube.com/watch?v=vgYi3Wr7v_g

-----

https://www.youtube.com/watch?v=iF9mrI9yoBU

https://www.youtube.com/watch?v=GV01B5kVsC0

replies(2): >>cowsup+Ei >>om2+MF
◧◩◪◨
37. chroma+sc[view] [source] [discussion] 2024-05-21 00:27:05
>>avarun+i8
So, what would they have done if she accepted? Claimed that the existing training of the Sky voice was voiced by her?
replies(4): >>famous+7f >>sangno+7m >>blacko+Qw >>avarun+ox3
◧◩◪◨
38. voltai+vc[view] [source] [discussion] 2024-05-21 00:27:20
>>avarun+i8
You seem to be misunderstanding the legalities at work here: reaching out to her multiple times beforehand, along with tweets intended to underline the similarity to her work on Her, demonstrates intention. If they didn’t think they needed permission, why ask for permission multiple times and then yank it when she noticed?

Answer: because they knew they needed permission, after working so hard to associate with Her, and they hoped, in traditional tech fashion, that if they moved fast and broke things enough, everyone would have to reshape around OpenAI's wants rather than around the preexisting rights of the humans involved.

replies(3): >>KHRZ+li >>parine+Yt >>munksb+RZ1
◧◩◪◨⬒
39. throww+yd[view] [source] [discussion] 2024-05-21 00:33:59
>>nickle+j6
The problem with that idea is that I can hide behind it while making videos of famous politicians doing really morally questionable things and distributing them on YouTube. The reason Fair Use works with regular parodies in my opinion is that everyone can tell that it is obviously fake. For example, Saturday Night Live routinely makes joking parody videos of elected officials doing things we think might be consistent with their character. And in those cases it's obvious that it's being portrayed by an actor and therefore a parody. If you use someone's likeness directly I think that it must never be fair use or we will quickly end up in a world where no video can be trusted.
replies(2): >>cjbgka+qk >>nickle+W22
◧◩◪
40. menset+3e[view] [source] [discussion] 2024-05-21 00:36:30
>>ncalla+s4
Effective altruism would posit that it is worth one voice theft to help speed getting life-saving AI technology into the hands of everyone.
replies(2): >>ehnto+pm >>ncalla+Vn
◧◩◪◨⬒
41. famous+7f[view] [source] [discussion] 2024-05-21 00:43:40
>>chroma+sc
Voice cloning could be as simple as a few seconds of audio in the context window since GPT-4o is a speech to speech transformer. They wouldn't need to claim anything, just switch samples. They haven't launched the new voice mode yet, just demos.
42. wkat42+nf[view] [source] 2024-05-21 00:44:36
>>nickth+(OP)
> maybe not even a hiring of an impressionist

If they really hired someone who sounds just like her, it's fair game IMO. Johansson can't own the right to a similar voice, just like many people can have the same name. I think if there really was another actress and she just happens to sound like her, then it's really ok. And no, I'm not a fan of Altman (especially his Worldcoin, which I view as a privacy disaster)

I mean, imagine if I happened to have a similar voice to a famous actor, would that mean that I couldn't work as a voice actor without getting their OK just because they happen to be more famous? That would be ridiculous. Pretending to be them would be wrong, yes.

If they hired someone to change their voice to match hers, that'd be bad. Yeah. If they actually just AI-cloned her voice that's totally not OK. Also any references to the movies. Bad.

replies(2): >>confus+Uf >>101008+Pj
◧◩
43. confus+Uf[view] [source] [discussion] 2024-05-21 00:48:24
>>wkat42+nf
Discovery process will be interesting
◧◩◪◨
44. ethbr1+Vg[view] [source] [discussion] 2024-05-21 00:54:16
>>ants_e+hc
> I would have thought that if they made the decision to go ahead despite getting two "no"s, that they at least had a legal position they thought was defensible and worth defending.

They likely have a legal position which is defensible.

They're much more worried that they don't have a PR position which is defensible.

What's the point of winning the (legal) battle if you lose the war (of public opinion)?

Given the rest of their product is built on apathy to copyright, they're actively being sued by creators, and the general public is not sympathetic to GenAI taking human jobs...

... this isn't a great moment for OpenAI to initiate a long legal battle, against a movie actress / celebrity, in which they're arguing that her likeness isn't actually controlled by her.

Talk about optics!

(And I'd expect they quietly care much more about their continued ability to push creative output through their copyright launderer, than get into a battle over likeness)

replies(2): >>justin+Vs >>XorNot+CA
◧◩◪
45. smt88+di[view] [source] [discussion] 2024-05-21 01:03:32
>>ncr100+C9
I mostly agree with you, but I actually don't think it matters if it sounded exactly like her or not. The crime is in the training: did they use her voice or not?

If someone licenses an impersonator's voice and it gets very close to the real thing, that feels like an impossible situation for a court to settle and it should probably just be legal (if repugnant).

replies(7): >>toomuc+vj >>sangno+Tk >>randog+ym >>aseipp+Om >>dv_dt+Cn >>XorNot+Sz >>jonath+vC
◧◩◪◨⬒
46. KHRZ+li[view] [source] [discussion] 2024-05-21 01:05:12
>>voltai+vc
You could also ask: If Scarlett has a legal case already, why does she want legislation passed?
replies(4): >>minima+Wi >>ncalla+Ns >>bradch+Sw >>static+Ue2
◧◩◪
47. branda+qi[view] [source] [discussion] 2024-05-21 01:06:12
>>bigfis+d3
Her statement claims the voice was taken down at her attorney's insistence.
◧◩◪
48. minima+ti[view] [source] [discussion] 2024-05-21 01:06:39
>>ncr100+C9
More notably for legal purposes, there were several independent news reports corroborating the vocal similarity.
replies(1): >>sangno+wl
◧◩◪◨
49. cowsup+Ei[view] [source] [discussion] 2024-05-21 01:08:05
>>Last5D+kc
> People made the association because her voice was used in a movie that features a highly emotive voice assistant reminiscent of GPT-4o, which sama and others joked about.

Whether you think it sounds like her or not is a matter of opinion, I guess. I can see the resemblance, and I can also see the resemblance to Jennifer Lawrence and others.

What Johansson is alleging goes beyond this, though. She is alleging that Altman (or his team) reached out to her (or her team) asking her to lend her voice, she was not interested, and then she was asked again just two days before GPT-4o's announcement, and she declined again. Now there's a voice that, in her opinion, sounds a lot like her.

Luckily, the legal system is far more nuanced than just listening to a few voices and comparing them mentally to other voices individuals have heard over the years. They'll be able to figure out, as part of discovery, what led to the Sky voice sounding the way it does (intentionally using Johansson's likeness? coincidence? directly trained off her interviews/movies?), whether OpenAI were willing to slap Johansson's name onto the existing Sky during the presentation, whether the "her" tweet in combination with the Sky voice was supposed to draw the subtle connection... This allegation is just the beginning.

replies(1): >>Last5D+Fl
◧◩◪◨⬒⬓
50. minima+Wi[view] [source] [discussion] 2024-05-21 01:09:50
>>KHRZ+li
To prevent it from happening again, with more legal authority than a legal precedent.
◧◩◪
51. gibbit+6j[view] [source] [discussion] 2024-05-21 01:10:54
>>ncalla+s4
Are we surprised by this bankruptcy? As neat as AI is, it is only a thing because the corporate class see it as a way to cut costs by replacing people with it. The whole concept is bankrupt.
replies(4): >>ecjhdn+ml >>ncalla+5m >>emsign+kF >>Nasrud+3T1
52. sneak+gj[view] [source] 2024-05-21 01:12:38
>>nickth+(OP)
Why are you mad? We have no rights to the sound of our voice. There is nothing wrong with someone or something else making sounds that sound like us, even if we don’t want it to happen.

No one is harmed.

replies(2): >>mkehrt+tj >>elicas+xk
◧◩
53. mkehrt+tj[view] [source] [discussion] 2024-05-21 01:14:07
>>sneak+gj
Are you sure? You certainly have rights to your likeness--it can't be used commercially without permission. Do you know this doesn't cover your voice?
◧◩◪◨
54. toomuc+vj[view] [source] [discussion] 2024-05-21 01:14:16
>>smt88+di
https://en.wikipedia.org/wiki/Personality_rights
◧◩
55. 101008+Pj[view] [source] [discussion] 2024-05-21 01:17:57
>>wkat42+nf
But clearly they are advertising it as her (no pun intended), which is a gray area.
replies(1): >>wkat42+Ik
◧◩◪◨⬒⬓
56. cjbgka+qk[view] [source] [discussion] 2024-05-21 01:23:38
>>throww+yd
I'm guessing you're referring to people still thinking Sarah Palin said she could see Russia from her house; that was from an SNL skit and an amazing impression by Tina Fey. I agree, people have a hard time separating reality from obvious parody, so how could we expect them to make the distinction with intentional imitation? Society must draw a clear line that it is not ok to do this.
◧◩
57. elicas+xk[view] [source] [discussion] 2024-05-21 01:24:30
>>sneak+gj
The law can actually be interesting and nuanced on this: http://law2.umkc.edu/faculty/projects/ftrials/communications...
replies(1): >>ethbr1+kn
◧◩◪
58. wkat42+Ik[view] [source] [discussion] 2024-05-21 01:26:55
>>101008+Pj
Yeah that was the bad part. Agreed there.

I wonder if they deliberately steered towards this for more marketing buzz?

◧◩◪◨
59. sangno+Tk[view] [source] [discussion] 2024-05-21 01:28:19
>>smt88+di
> The crime is in the training: did they use her voice or not?

This is a civil issue, and actors get broad rights to their likeness. Kim Kardashian sued Old Navy for using a look-alike actress in an ad; Old Navy chose to settle, which makes it appear that "the real actress wasn't involved in any way" may not be a perfect defense. The timeline makes it clear they wanted it to sound like Scarlett's voice; the actual mechanics of how they got the AI to sound like that is only part of the story.

◧◩◪◨
60. ecjhdn+ml[view] [source] [discussion] 2024-05-21 01:31:55
>>gibbit+6j
100% this.

It’s shocking to me how people cannot see this.

The only surprise here is that they didn’t think she’d push back. That is what completes the multilayered cosmic and dramatic irony of this whole vignette. Honestly feels like Shakespeare or Arthur Miller might have written it.

◧◩◪◨
61. sangno+wl[view] [source] [discussion] 2024-05-21 01:32:56
>>minima+ti
...and sama's tweet referencing "Her"
◧◩◪◨⬒
62. Last5D+Fl[view] [source] [discussion] 2024-05-21 01:34:07
>>cowsup+Ei
I honestly don't think it is a matter of opinion, though. Her voice has a few very distinct characteristics, the most significant of which being the vocal fry / huskiness, that aren't present at all in either of the Sky models.

Asking for her vocal likeness is completely in line with just wanting the association with "Her" and the big PR hit that would come along with that. They developed voice models on two different occasions and hoped twice that Johannson would allow them to make that connection. Neither time did she accept, and neither time did they release a model that sounded like her. The two day run-up isn't suspicious either, because we're talking about a general audio2audio transformer here. They could likely fine-tune it (if even that is necessary) on her voice in hours.

I don't think we're going to see this going to court. OpenAI simply has nothing to gain by fighting it. It would likely sour their relationships with a bunch of media big-wigs and cause them bad press for years to come. Why bother when they can simply disable Sky until the new voice mode releases, allowing them to generate a million variations of highly expressive female voices?

◧◩◪◨
63. ncalla+5m[view] [source] [discussion] 2024-05-21 01:37:50
>>gibbit+6j
I don't think anyone said anything about being surprised by it?
◧◩◪◨⬒
64. sangno+7m[view] [source] [discussion] 2024-05-21 01:38:20
>>chroma+sc
> Claimed that the existing training of the Sky voice was voiced by her?

That claim could very well be true. The letter requested information on how the voice was trained - OpenAI may not want that can of worms opened lest other celebrities start paying closer attention to the other voices.

◧◩◪◨
65. ehnto+pm[view] [source] [discussion] 2024-05-21 01:40:38
>>menset+3e
It didn't require voice theft, they could have easily found a volunteer or paid for someone else.
◧◩◪◨
66. randog+ym[view] [source] [discussion] 2024-05-21 01:42:05
>>smt88+di
> If someone licenses an impersonator's voice and it gets very close to the real thing, that feels like an impossible situation for a court to settle and it should probably just be legal (if repugnant).

Does that mean if cosplayers dress up like some other character, they can use that version of the character in their games/media? I think it should be equally simple to settle. It's different if it's their natural voice. Even then, it brings into question whether they can use "doppelgangers" legally.

◧◩◪◨⬒
67. numpad+Em[view] [source] [discussion] 2024-05-21 01:42:56
>>gedy+6c
Chances are this is. It's basically the same as LoRA. One of the go-to tools for this literally uses a diffusion model and works on spectrograms as images.
68. thaton+Mm[view] [source] 2024-05-21 01:44:03
>>nickth+(OP)
At least in past court cases I'm familiar with, you can't use an impersonator and get people to think it's the real thing.

It's not like Tom Waits ever wanted to hawk chips

https://www.latimes.com/archives/la-xpm-1990-05-09-me-238-st...

replies(3): >>chii+Go >>acomje+DA >>splatz+l31
◧◩◪◨
69. aseipp+Om[view] [source] [discussion] 2024-05-21 01:44:15
>>smt88+di
It is not an impossible situation, courts have settled it, and what you describe is not how the law works (despite what many computer engineers think to the contrary).
replies(1): >>smt88+lP
◧◩◪
70. ethbr1+kn[view] [source] [discussion] 2024-05-21 01:47:58
>>elicas+xk
I think it's a different argument with respect to famous media celebrities* too.

If someone clones a random person's voice for commercial purposes, the public likely has no idea who the voice's identity is. Consequently, it's just the acoustic voice.

If someone clones a famous media celebrity's voice, the public has a much greater chance of recognizing the voice and associating it with a specific person.

Which then opens a different question of 'Is the commercial use of the voice appropriating the real person's fame for their own gain?'

Add in the facts that media celebrities' values are partially defined by how people see them, and that they are often paid for their endorsements, and it's a much clearer case that (a) the use potentially influenced the value of their public image & (b) the use was theft, because it was taking something which otherwise would have had value.

Neither consideration exists with 'random person's voice' (with deference to voice actors).

* Defined as 'someone for whom there is an expectation that the general public would recognize their voice or image'

◧◩◪◨
71. dv_dt+Cn[view] [source] [discussion] 2024-05-21 01:50:25
>>smt88+di
As I understand it (though I may be wrong), in music sampling cases it doesn't matter if the "sample" uses an actual clip from a recording or was recreated from scratch in a new medium (e.g. a direct MIDI sequence); if a song sampling another song is recognizable, it is still infringing.
replies(1): >>parine+Cs
◧◩◪◨
72. ncalla+Vn[view] [source] [discussion] 2024-05-21 01:53:21
>>menset+3e
Effective Altruists are just shitty utilitarians that never take into account all the myriad ways that unmoderated utilitarianism has horrific failure modes.

Their hubris will walk them right into federal prison for fraud if they’re not careful.

If Effective Altruists want to speed the adoption of AI with the general public, they’d do well to avoid talking about it, lest the general public make a connection between EA and AI

I will say, when EA are talking about where they want to donate their money with the most efficacy, I have no problem with it. When they start talking about the utility of committing crimes or other moral wrongs because the ends justify the means, I tend to start assuming they’re bad at morality and ethics.

replies(5): >>parine+st >>comp_t+VC >>0xDEAF+bH >>Intral+lA2 >>Intral+tC2
◧◩◪◨
73. foobar+Bo[view] [source] [discussion] 2024-05-21 01:59:26
>>ants_e+hc
> But it kind of looks like they released it knowing they couldn't defend it in court which must seem pretty bonkers to investors.

That actually seems like there may be a few people involved and one of them is a cowboy PM who said fuck it, ship it to make the demo. And then damage control came in later. Possibly the PM didn't even know about the asks for permission?

replies(2): >>anytim+3z >>kubobl+iG
◧◩
74. chii+Go[view] [source] [discussion] 2024-05-21 02:00:03
>>thaton+Mm
> get people to think it’s the real thing.

but did OpenAI make any claims about whose voice this is? Just because a voice sounds similar or familiar doesn't mean it's fraudulent.

replies(1): >>tedivm+pp
◧◩◪
75. dyno12+dp[view] [source] [discussion] 2024-05-21 02:05:27
>>ncr100+C9
I'm not sure how much you currently legally own imitations of your own voice. There's a whole market for voice actors who can imitate particular famous voices.
replies(2): >>adolph+Mp >>stonog+s13
◧◩◪
76. tedivm+pp[view] [source] [discussion] 2024-05-21 02:09:18
>>chii+Go
Just read the top post of the thread you're responding to-

> - Not receiving a response, OpenAI demos the product anyway, with Sam tweeting “her” in reference to Scarlett’s film.

replies(1): >>dzhiur+5r
◧◩◪◨
77. adolph+Mp[view] [source] [discussion] 2024-05-21 02:13:28
>>dyno12+dp
Should have renamed it

https://en.wikipedia.org/wiki/Sosumi

Or

https://www.reddit.com/r/todayilearned/comments/9n44b6/til_t...

◧◩◪◨
78. dzhiur+5r[view] [source] [discussion] 2024-05-21 02:26:14
>>tedivm+pp
To me the reference sounds more like it's towards omni than her voice
replies(2): >>altair+Os >>nickth+5u
◧◩◪◨
79. ncalla+yr[view] [source] [discussion] 2024-05-21 02:32:02
>>avarun+i8
> You seem to be misunderstanding the situation here. They wanted ScarJo to voice their voice assistant, and she refused twice. They also independently created a voice assistant which sounds very similar to her.

And promoted it using a tweet naming the movie that Johansson performed in, for the role that prompted them to ask her in the first place.

You have to be almost deliberately naive to not see that they were attempting to use her vocal likeness in this situation. There's a reason they immediately walked it back after the situation was revealed.

Neither a judge, nor a jury, would be so willingly naive.

replies(1): >>munksb+l02
◧◩◪
80. parine+as[view] [source] [discussion] 2024-05-21 02:39:43
>>ncr100+C9
She doesn't own most (all probably) of her contributions to the world.

If the voice was only trained on the voice of the character she played in Her, would she have any standing in claiming some kind of infringement?

replies(1): >>lesost+UE1
81. windex+zs[view] [source] 2024-05-21 02:43:47
>>nickth+(OP)
The thing about the situation is that Altman is willing to lie and steal a celebrity's voice for use in ChatGPT. What he did, the timeline, everything - is sleazy if, in fact, that's the story.

The really concerning part here is that Altman is, and wants to be, a large part of AI regulation [0]. Quite the public contradiction.

[0] https://www.businessinsider.com/sam-altman-openai-artificial...

replies(13): >>dvhh+9t >>startu+yt >>vasili+wv >>ocodo+ex >>beefnu+qJ >>choppa+AJ >>trustn+yM >>belter+9Y >>akudha+511 >>xinayd+U11 >>chx+o21 >>latexr+Cu1 >>Intral+eF2
◧◩◪◨⬒
82. parine+Cs[view] [source] [discussion] 2024-05-21 02:44:19
>>dv_dt+Cn
Sampling is not the same as duplication. Sampling is allowed as a derivative work as long as it's substantially different from the original.

It's an "I know it when I see it" situation, so it's not clear cut.

replies(1): >>Findec+sV
◧◩◪◨⬒⬓
83. ncalla+Ns[view] [source] [discussion] 2024-05-21 02:46:44
>>KHRZ+li
Because, under the current justice system and legislative framework, it would probably take hundreds of thousands to millions of dollars to bring a case that requires discovery and a trial.

Maybe (maybe!) it’s worth it for someone like Johansson to take on the cost of that to vindicate her rights—but it’s certainly not the case for most people.

If your rights can only be defended from massive corporations by bringing lawsuits that cost hundreds of thousands to millions of dollars, then only the wealthy will have those rights.

So maybe she wants new legislative frameworks around these kind of issues to allow people to realistically enforce these rights that nominally exist.

For an example of updating a legislative framework to allow more easily vindicating existing rights, look up "anti-SLAPP legislation", which many states have passed to make it easier for a defendant of a meritless lawsuit seeking to chill speech to have the lawsuit dismissed. Anti-SLAPP legislation does almost nothing to change the actual rights that a defendant has to speak, but it makes it much more practical for a defendant to actually exercise those rights.

So, the assumption that a call for updated legislation implies that no legal protection currently exists is just a bad assumption that does not apply in this situation.

◧◩◪◨⬒
84. altair+Os[view] [source] [discussion] 2024-05-21 02:46:47
>>dzhiur+5r
What’s “omni”?
replies(1): >>nickth+bu
◧◩◪◨⬒
85. __loam+Ts[view] [source] [discussion] 2024-05-21 02:47:21
>>gedy+6c
Okay so as long as we steal enough stuff then it's legal.
◧◩◪◨⬒
86. justin+Vs[view] [source] [discussion] 2024-05-21 02:47:34
>>ethbr1+Vg
> They likely have a legal position which is defensible.

Doesn't sound like they have that either.

replies(1): >>bonton+HE2
◧◩◪◨⬒
87. __loam+0t[view] [source] [discussion] 2024-05-21 02:48:01
>>EasyMa+98
I believe there's a difference between building a sustainable and profitable business and pumping the stock.
◧◩
88. dvhh+9t[view] [source] [discussion] 2024-05-21 02:48:55
>>windex+zs
Some people might see a parallel with SBF, and see how Altman would try to regulate competition without impeding OpenAI's progress
replies(3): >>viking+Pw >>numpad+9z >>choppa+eK
◧◩◪◨⬒
89. parine+st[view] [source] [discussion] 2024-05-21 02:52:24
>>ncalla+Vn
This is like attributing the crimes of a few fundamentalists to an entire religion.
replies(2): >>ncalla+au >>ocodo+qy
◧◩
90. startu+yt[view] [source] [discussion] 2024-05-21 02:53:28
>>windex+zs
Most likely it was an unforced error, as there's been a lot of chaos with cofounders and the board revolt; easy to lose track of something really minor.

Like some intern’s idea to train the voice on their favorite movie.

And then they’ve decided that this is acceptable risk/reward and not a big liability, so worth it.

This could be a well-planned opening move of a regulation gambit. But unlikely.

replies(6): >>mmastr+Rt >>windex+iu >>Always+0v >>Cheer2+gv >>mbrees+Qv >>kergon+LI
◧◩◪
91. mmastr+Rt[view] [source] [discussion] 2024-05-21 02:56:25
>>startu+yt
It makes a lot more sense that he was caught red-handed, likely hiring a similar voice actress and not realizing how strong identity protections are for celebs.
◧◩◪◨⬒
92. parine+Yt[view] [source] [discussion] 2024-05-21 02:57:17
>>voltai+vc
> If they didn’t think they needed permission, why ask for permission multiple times and then yank it when she noticed?

Many things that are legal are of questionable ethics. Asking permission could easily just be an effort for them to get better samples of her voice. Pulling the voice after debuting it is 100% a PR response. If there's a law that was broken, pulling the voice doesn't unbreak it.

replies(1): >>voltai+6w2
◧◩◪◨⬒
93. nickth+5u[view] [source] [discussion] 2024-05-21 02:58:29
>>dzhiur+5r
That’s not a gamble they are willing to take to court of law or public opinion.
◧◩◪◨⬒⬓
94. ncalla+au[view] [source] [discussion] 2024-05-21 02:59:14
>>parine+st
I don’t think so. I’ve narrowed my comments specifically to Effective Altruists who are making utilitarian trade-offs to justify known moral wrongs.

> I will say, when EA are talking about where they want to donate their money with the most efficacy, I have no problem with it. When they start talking about the utility of committing crimes or other moral wrongs because the ends justify the means, I tend to start assuming they’re bad at morality and ethics.

Frankly, if you’re going to make an “ends justify the means” moral argument, you need to do a lot of work to address how those arguments have gone horrifically wrong in the past, and why the moral framework you’re using isn’t susceptible to those issues. I haven’t seen much of that from Effective Altruists.

I was responding to someone who was specifically saying an EA might argue why it’s acceptable to commit a moral wrong, because the ends justify it.

So, again, if someone is using EA to decide how to direct their charitable donations, volunteer their time, or otherwise decide between moral goods, I have no problem with it. That specifically wasn't the context I was responding to.

replies(1): >>parine+eV1
◧◩◪◨⬒⬓
95. nickth+bu[view] [source] [discussion] 2024-05-21 02:59:18
>>altair+Os
GPT-4o is the new model; the o stands for Omni.
◧◩◪
96. windex+iu[view] [source] [discussion] 2024-05-21 03:00:52
>>startu+yt
I don't think this makes any sense, at all, quite honestly. Why would an "intern" be training one of ChatGPT's voices for a major release?

If in fact, that was the case, then OpenAI is not aligned with the statement they just put out about having utmost focus on rigor and careful considerations, in particular this line: "We know we can't imagine every possible future scenario. So we need to have a very tight feedback loop, rigorous testing, careful consideration at every step, world-class security, and harmony of safety and capabilities." [0]

[0] https://x.com/gdb/status/1791869138132218351

◧◩◪
97. Always+0v[view] [source] [discussion] 2024-05-21 03:08:04
>>startu+yt
At first I thought there may be a /s coming...
◧◩◪
98. Cheer2+gv[view] [source] [discussion] 2024-05-21 03:10:34
>>startu+yt
> easy to lose track of something really minor. Like some intern's idea

Yes, because we all know the high profile launch for a major new product is entirely run by the interns. Stop being an apologist.

◧◩
99. vasili+wv[view] [source] [discussion] 2024-05-21 03:12:32
>>windex+zs
If this account is true, Sam Altman is a deeply unethical human being. Given that he doesn't bring any technical know-how to the building of AGI, I just don't see the reason to have such a person in charge here. The new board should act.
replies(9): >>ornorn+dz >>jcranm+wC >>imjons+PC >>insane+6D >>silver+fF >>serial+8P >>latexr+yw1 >>gdilla+oc2 >>FireBe+IQ3
◧◩◪
100. mbrees+Qv[view] [source] [discussion] 2024-05-21 03:15:19
>>startu+yt
This is an unforced error, but it isn’t minor. It’s quite large and public.

The general public doesn’t understand the details and nuances of training an LLM, the various data sources required, and how to get them.

But the public does understand stealing someone’s voice. If you want to keep the public on your side, it’s best to not train a voice with a celebrity who hasn’t agreed to it.

replies(1): >>surfin+tP
◧◩◪
101. viking+Pw[view] [source] [discussion] 2024-05-21 03:23:57
>>dvhh+9t
I always mix up those two in my head and have to think which one is which
replies(1): >>ocodo+hx
◧◩◪◨⬒
102. blacko+Qw[view] [source] [discussion] 2024-05-21 03:24:08
>>chroma+sc
Maybe they have a second one trained on her voice.
◧◩◪◨⬒⬓
103. bradch+Sw[view] [source] [discussion] 2024-05-21 03:24:14
>>KHRZ+li
She has a personal net worth of >$100m. She’s also married to a successful actor in his own right.

Her voice alone didn’t get her there — she did. That’s why celebrities are so protective about how their likeness is used: their personal brand is their asset.

There's established legal precedent on exactly this: even if they didn't train on her likeness, if it can reasonably be suspected by an unknowing observer that she personally lent her voice to this, she has a strong case. Even OpenAI knew this, or they would not have asked in the first place.

◧◩
104. ocodo+ex[view] [source] [discussion] 2024-05-21 03:28:44
>>windex+zs
Altman has proven time and again that he is little more than a huckster wrt technology, and in business he is a stone cold shark.

Conman plain and simple.

replies(5): >>wrapti+3A >>lawn+OE >>mlindn+1V >>m000+b71 >>firebi+LP1
◧◩◪◨
105. ocodo+hx[view] [source] [discussion] 2024-05-21 03:29:43
>>viking+Pw
One is in jail, when it should be two are in jail
replies(1): >>garbth+RI
◧◩◪◨
106. visarg+Yx[view] [source] [discussion] 2024-05-21 03:37:27
>>nicce+L5
> since it is not exactly identical voice, or even not that close, they can plausibly deny it

When studios approach an actress A and she refuses, then another actress B takes the role, is that infringing on A's rights? Or should they just scrap the movie?

Maybe if they replicated a scene from A's movies or there was a striking likeness between the voices... but not generally.

replies(1): >>nicce+mQ
◧◩◪◨⬒⬓
107. ocodo+qy[view] [source] [discussion] 2024-05-21 03:42:16
>>parine+st
Effective Altruists are the fundamentalists though. So no, it's not.
◧◩◪◨⬒
108. anytim+3z[view] [source] [discussion] 2024-05-21 03:47:31
>>foobar+Bo
The whole company behaves like rogue cowboys.

If a PM there didn’t say “fuck it ship it even without her permission” they’d probably be replaced with someone who would.

I expect the cost of any potential legal action/settlement was happily accepted in order to put on an impressive announcement.

◧◩◪
109. numpad+9z[view] [source] [discussion] 2024-05-21 03:47:58
>>dvhh+9t
Maybe they were the rogue AGI escapes we found along the way
◧◩◪
110. ornorn+dz[view] [source] [discussion] 2024-05-21 03:48:23
>>vasili+wv
He has “The Vision”… It’s the modern entrepreneurship trope that lowly engineers won’t achieve anything if they weren’t rallied by a demi-god who has “The Vision” and makes it all happen.
replies(2): >>azinma+Rz >>parpfi+yG
◧◩
111. visarg+mz[view] [source] [discussion] 2024-05-21 03:49:18
>>crimso+H1
> Are we seriously saying we can't make voices that are similar to celebrities, when not using their actual voice?

I think the copyright industry wants to grab new powers to counter the infinite capacity of AI to create variations. But that move would kneecap the creative industry first; newcomers have no place in a fully copyrighted space.

It reminds me of how NIMBYs block construction to keep prices up. Will all of copyright space become operated on NIMBY logic?

◧◩◪◨
112. azinma+Rz[view] [source] [discussion] 2024-05-21 03:55:06
>>ornorn+dz
Probably not wrong. Lots and lots of examples of that being true.
replies(1): >>safety+IR
◧◩◪◨
113. XorNot+Sz[view] [source] [discussion] 2024-05-21 03:55:14
>>smt88+di
If OpenAI commissioned a voice actor to lend their voice to the Sky model, and cast on the basis of trying to get someone who sounds similar to Scarlett Johansson, but then did not advertise or otherwise use the voice model created to claim it was Scarlett Johansson, then they're completely in the clear.

Because then the actual case would be fairly bizarre: an entirely separate person, selling the rights to their own likeness as they are entitled to do, is being prohibited from doing that by the courts because they sound too much like an already famous person.

EDIT: Also, up front, I'm not sure you can read much into timelines for changing out technology here. We have voice cloning systems that can do it with as little as 15 seconds of audio. So having a demo reel of what they wanted to do that they could've used on a few days' notice isn't unrealistic - and training a model and not using or releasing it also isn't illegal.

replies(2): >>cycoma+3G >>jeroje+XI
◧◩◪
114. wrapti+3A[view] [source] [discussion] 2024-05-21 03:56:46
>>ocodo+ex
Not going to lie, he had me. He appeared very genuine and fair in almost all of his media appearances, like podcasts, but many of his actions are just so hard to justify.
replies(3): >>svacha+2C >>kristi+0K >>JohnFe+ln2
◧◩◪◨⬒
115. XorNot+CA[view] [source] [discussion] 2024-05-21 04:03:59
>>ethbr1+Vg
How is the PR position not defensible? One of the worst things you can generally do is admit fault, particularly if you have a complete defense.

Buckle in, go to court, and double down on the fact that the public's opinion of actors is pretty damn fickle at the best of times - particularly if what you released was in fact based on someone you signed a valid contract with who just sounds similar.

Of course, this is all dependent on actually having a complete defense: you absolutely would not want to find Scarlett Johansson voice samples in file folders associated with the Sky model if it went to court.

replies(1): >>ethbr1+2D
◧◩
116. acomje+DA[view] [source] [discussion] 2024-05-21 04:04:11
>>thaton+Mm
Or Bette Midler singing for Ford. She turned them down, they used a sound-alike, she sued and won.

https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.

replies(1): >>dewbri+oF
◧◩◪◨
117. svacha+2C[view] [source] [discussion] 2024-05-21 04:19:39
>>wrapti+3A
He has a certain charm and seeming sincerity when he talks. But the more I see of him, the more disturbing I find him -- he combines the Mark Zuckerberg stare with the Elizabeth Holmes vocal fry.
replies(2): >>jester+xM >>poloti+5R
◧◩◪◨
118. jonath+vC[view] [source] [discussion] 2024-05-21 04:23:40
>>smt88+di
This has been settled law for 34 years. See Tom Waits v Frito-Lay.

They literally hired an impersonator, and it cost them $2.5 million (~$6 million today).

https://www.latimes.com/archives/la-xpm-1990-05-09-me-238-st...

replies(1): >>smt88+6Q
◧◩◪
119. jcranm+wC[view] [source] [discussion] 2024-05-21 04:23:54
>>vasili+wv
I mean, there have already been some yellow flags with Altman. He founded Worldcoin, whose plan is to airdrop free money in exchange for retinal scans. And the board of OpenAI fired him for (if I've got this right) lying to the board about conversations he'd had with individual board members.
replies(1): >>JohnFe+io2
◧◩◪
120. imjons+PC[view] [source] [discussion] 2024-05-21 04:27:22
>>vasili+wv
He rubs elbows with very powerful people including CEOs, heads of state and sheiks. They probably want 'one of them' in charge of the company that has the best chances of getting close to AGI. So it's not his technical chops and not even 'vision' in the Jobs sense that keeps him there.
replies(1): >>dontup+ko1
◧◩◪◨⬒
121. comp_t+VC[view] [source] [discussion] 2024-05-21 04:28:33
>>ncalla+Vn
> When they start talking about the utility of committing crimes or other moral wrongs because the ends justify the means, I tend to start assuming they’re bad at morality and ethics.

Extremely reasonable position, and I'm glad that every time some idiot brings it up in the EA forum comments section they get overwhelmingly downvoted, because most EAs aren't idiots in that particular way.

I have no idea what the rest of your comment is talking about; EAs that have opinions about AI largely think that we should be slowing it down rather than speeding it up.

replies(2): >>ncalla+6E >>emsign+XF
◧◩◪◨⬒⬓
122. ethbr1+2D[view] [source] [discussion] 2024-05-21 04:29:45
>>XorNot+CA
In what world does a majority of the public cheer for OpenAI "stealing"* an actress's voice?

People who hate Hollywood? Most of that crowd hates tech even more.

* Because it would take the first news cycle to be branded as that

replies(1): >>XorNot+TO
◧◩◪
123. insane+6D[view] [source] [discussion] 2024-05-21 04:30:00
>>vasili+wv
I thought we had already established this when the previous board tried to oust him for failing to stick to OpenAI’s charter. This is just further confirmation.

> The new board should act

You mean like the last board tried? Besides the board was picked to be on Altman’s side. The independent members were forced out.

replies(1): >>frank_+EG5
◧◩◪◨⬒⬓
124. ncalla+6E[view] [source] [discussion] 2024-05-21 04:39:30
>>comp_t+VC
In some sense I see a direct line between the EA argument being presented here, and the SBF consequentialist argument where he talks about being willing to flip a coin if it had a 50% chance to destroy the world and a 50% chance to make the world more than twice as good.

I did try to cabin my arguments to Effective Altruists that are making ends-justify-the-means arguments. I really don't have a problem with people that are attempting to use EA to decide between multiple good outcomes.

I'm definitely not engaged enough with the Effective Altruists to know where the plurality of thought lies, so I was trying to respond in the context of this argument being put forward on behalf of Effective Altruists.

The only part I’d say applies to all EA, is the brand taint that SBF has done in the public perception.

◧◩◪◨
125. emsign+FE[view] [source] [discussion] 2024-05-21 04:45:29
>>ants_e+hc
It looks really unprofessional at minimum if not a bit arrogant, which is actually more concerning as it hints at a deeper disrespect for artists and celebrities.
◧◩◪
126. lawn+OE[view] [source] [discussion] 2024-05-21 04:46:53
>>ocodo+ex
You'd think that Worldcoin would be enough proof of what he is but I guess people missed that memo.
replies(3): >>JoRyGu+MK >>ben_w+sM >>Intral+cD2
◧◩◪
127. silver+fF[view] [source] [discussion] 2024-05-21 04:50:46
>>vasili+wv
It shouldn’t be forgotten that his sister has publicly accused him and his brother of sexually abusing her as a child.
replies(1): >>verisi+TR
◧◩◪◨
128. emsign+kF[view] [source] [discussion] 2024-05-21 04:51:51
>>gibbit+6j
The problem is they really believe that eventually we either won't be able to tell the difference between a human and an AI model, or that we won't care. Don't they understand the meaning of art?
◧◩◪
129. dewbri+oF[view] [source] [discussion] 2024-05-21 04:52:12
>>acomje+DA
They used a sound-alike and had her sing one of her songs. I believe that's a different precedent, in that it's leveraging her fame.

Imo Sky's voice is distinct enough from Scarlett's, and it wasn't implied to _be_ her.

Sam's "Her" tweet could be interpreted as such, but defending the tweet as referencing the concept of "Her", rather than the voice itself, is plausible.

replies(1): >>sleepy+QS
◧◩◪◨
130. om2+MF[view] [source] [discussion] 2024-05-21 04:56:01
>>Last5D+kc
I hadn't heard the GPT-4o voice before. Comparing the video to the video of Johansson's voice in "her", it sounds pretty similar. Johansson's performance there sounds pretty different from her normal speaking voice in the interview - more intentional emotional inflection, bubbliness, generally higher pitch. The GPT-4o voice sounds a lot like it.

From elsewhere in the thread, likeness rights apparently do extend to intentionally using lookalikes / soundalikes to create the appearance of endorsement or association.

◧◩◪◨⬒⬓
131. emsign+XF[view] [source] [discussion] 2024-05-21 04:57:41
>>comp_t+VC
The speed doesn't really matter if their end goal is morally wrong. A slower speed might give them an advantage to not overshoot and get backlash or it gives artists and the public more time to fight back against EA, but it doesn't hide their ill intentions.
◧◩◪◨⬒
132. cycoma+3G[view] [source] [discussion] 2024-05-21 04:59:11
>>XorNot+Sz
That's confidently incorrect. Many others have already posted that this has been settled case law for many years. I mean, would you argue that if someone built a MacBook lookalike, but not using the same components, they would be completely in the clear?
replies(1): >>XorNot+JG
◧◩◪◨⬒
133. kubobl+iG[view] [source] [discussion] 2024-05-21 05:03:01
>>foobar+Bo
> a cowboy PM who said fuck it, ship it to make the demo.

Given the timeline it sounds like the PM was told "just go ahead with it, I'll get the permission".

replies(1): >>unrave+PY1
◧◩◪◨
134. parpfi+yG[view] [source] [discussion] 2024-05-21 05:04:42
>>ornorn+dz
I roll my eyes when somebody says that they’re “the idea person” or that they have “the vision”.

I’d wager that most senior+ engineers or product people also have equally compelling “the vision”s.

The difference is that they need to do actual work all day so they don’t get to sit around pontificating.

replies(1): >>JohnFe+2o2
◧◩◪◨⬒⬓
135. XorNot+JG[view] [source] [discussion] 2024-05-21 05:06:40
>>cycoma+3G
I ask you: what do you call the Framework [1]? Or Dell's offerings? [2] Compared to the MacBook? [3]

They look kind of similar, right? Lots of familiar styling cues? What would take it from "similar" to actual infringement? Well, if you slapped an Apple logo on there, that would do it. Did OpenAI make an actual claim? Did they actually use Scarlett Johansson's public image and voice as sampling for the system?

[1] https://images.prismic.io/frameworkmarketplace/25c9a15f-4374...

[2] https://i.dell.com/is/image/DellContent/content/dam/ss2/prod...

[3] https://cdn.arstechnica.net/wp-content/uploads/2023/06/IMG_1...

replies(2): >>mrbung+DL >>einher+4U
◧◩◪◨⬒
136. 0xDEAF+bH[view] [source] [discussion] 2024-05-21 05:11:19
>>ncalla+Vn
>Effective Altruists are just shitty utilitarians that never take into account all the myriad ways that unmoderated utilitarianism has horrific failure modes.

There's a fair amount of EA discussion of utilitarianism's problems. Here's EA founder Toby Ord on utilitarianism and why he ultimately doesn't endorse it:

https://forum.effectivealtruism.org/posts/YrXZ3pRvFuH8SJaay/...

>If Effective Altruists want to speed the adoption of AI with the general public, they’d do well to avoid talking about it, lest the general public make a connection between EA and AI

Very few in the EA community want to speed AI adoption. It's far more common to think that current AI companies are being reckless, and we need some sort of AI pause so we can do more research and ensure that AI systems are reliably beneficial.

>When they start talking about the utility of committing crimes or other moral wrongs because the ends justify the means, I tend to start assuming they’re bad at morality and ethics.

The all-time most upvoted post on the EA Forum condemns SBF: https://forum.effectivealtruism.org/allPosts?sortedBy=top&ti...

replies(1): >>ncalla+2K
137. barbar+DI[view] [source] 2024-05-21 05:24:17
>>nickth+(OP)
Everyone is so mad about them stealing a beloved celebrity’s voice. What about the millions of authors and other creators whose copyrighted works they stole to create works that resemble and replace those people? Not famous enough to generate the same outrage?
replies(3): >>surfin+OP >>creato+XT >>Quantu+tc3
◧◩◪
138. kergon+LI[view] [source] [discussion] 2024-05-21 05:25:40
>>startu+yt
> Like some intern’s idea to train the voice on their favorite movie.

Ah, the famous rogue engineer.

The thing is, even if it were the case, this intern would have been supervised by someone, who themselves would have been managed by someone, all the way to the top. The moment Altman makes a demo using it, he owns the problem. Such a public fuckup is embarrassing.

> And then they’ve decided that this is acceptable risk/reward and not a big liability, so worth it.

You mean, they were reckless and tried to wing it? Yes, that’s exactly what’s wrong with them.

> This could be a well-planned opening move of a regulation gambit. But unlikely.

LOL. ROFL, even. This was a gambit all right. They just expected her to cave and not ask questions. Altman has a common thing with Musk: he does not play 3D chess.

◧◩◪◨⬒
139. garbth+RI[view] [source] [discussion] 2024-05-21 05:26:34
>>ocodo+hx
I don't like Sam, but he moves way smarter than people like SBF or Elizabeth Holmes. He actually has a product close to the reported specs, albeit still far away from the ultimate goal of AGI.

I don't see why he should be in jail.

replies(2): >>choppa+HK >>Findec+HU
◧◩◪◨⬒
140. jeroje+XI[view] [source] [discussion] 2024-05-21 05:28:00
>>XorNot+Sz
Well, Sam Altman tweeted "her", so that does seem to me like they're trying to claim a similarity to Scarlett Johansson.
◧◩
141. beefnu+qJ[view] [source] [discussion] 2024-05-21 05:32:36
>>windex+zs
the whole technology is based on fucking over artists, who didn't expect this exact thing?
replies(1): >>surfin+YO
◧◩
142. choppa+AJ[view] [source] [discussion] 2024-05-21 05:34:26
>>windex+zs
Altman doesn't want to be part of regulation. sama wants to be the next tk. He wants to be above regulation, and he wants to spend Microsoft's money getting there.

E.g. flying Congress to Lake Como for an off-the-record "discussion" https://freebeacon.com/politics/how-the-aspen-institute-help...

◧◩◪
143. citize+WJ[view] [source] [discussion] 2024-05-21 05:39:09
>>gedy+b3
An actress who specifically played the voice of an AI in a movie about AI, no less.
◧◩◪◨
144. kristi+0K[view] [source] [discussion] 2024-05-21 05:40:17
>>wrapti+3A
I have exactly the same feeling as I think you do. When you reach the level of success he has, there will always be people screaming that you are incompetent, evil, and every other negative adjective under the sun, but he genuinely seemed to care about doing the right thing. This, though, is so lacking in basic morals that I have to conclude I was wrong, at least to an extent.
replies(2): >>wrapti+mR >>gdilla+5c2
◧◩◪◨⬒⬓
145. ncalla+2K[view] [source] [discussion] 2024-05-21 05:40:40
>>0xDEAF+bH
I’ve had to explain myself a few times on this, so clearly I communicated badly.

I probably should have said _those_ Effective Altruists are shitty utilitarians. I was attempting—and since I’ve had to clarify a few times clearly failed—to take aim at the effective altruists that would make the utilitarian trade off that the commenter mentioned.

In fact, there’s a paragraph from the Toby Ord blog post that I wholeheartedly endorse and I think rebuts the exact claim that was put forward that I was responding to.

> Don’t act without integrity. When something immensely important is at stake and others are dragging their feet, people feel licensed to do whatever it takes to succeed. We must never give in to such temptation. A single person acting without integrity could stain the whole cause and damage everything we hope to achieve.

So, my words were too broad. I don’t actually mean all effective altruists are shitty utilitarians. But the ones that would make the arguments I was responding to are.

I think Ord is a really smart guy, and has worked hard to put some awesome ideas out into the world. I think many others (and again, certainly not all) have interpreted and run with it as a framework for shitty utilitarianism.

◧◩◪
146. choppa+eK[view] [source] [discussion] 2024-05-21 05:42:46
>>dvhh+9t
Sama gets to farm out much of the lobbying to Microsoft's already very powerful team, which spends a mere $10m, but that money gets magnified by MS's gov and DoD contracts. That's a huge safety net for him: he gets to steal and lie (as demonstrated w/ Scarlett) and yet the MS lobbying machine will continue unfazed.

https://www.opensecrets.org/federal-lobbying/clients/summary...

◧◩◪◨⬒⬓
147. choppa+HK[view] [source] [discussion] 2024-05-21 05:46:28
>>garbth+RI
He should be in jail for Worldcoin, which has robbed people of their biometric identity. I guess you could literally delete Worldcoin and in theory make people whole, but that company treats humans like vegetables that have no rights.
◧◩◪◨
148. JoRyGu+MK[view] [source] [discussion] 2024-05-21 05:47:10
>>lawn+OE
Because of course he's got a crypto grift going. Shocking.
149. al_bor+RK[view] [source] 2024-05-21 05:48:18
>>nickth+(OP)
I had to go look at what voice I picked once I heard the news, it was Sky. I listened to them all and thought it sounded the best. I didn’t make any connection to her (Scar Jo or the movie) when going through the voices, but I wasn’t listening for either. I don’t think I know her voice well enough to pick it out of a group like that.

Maybe I liked it best because it felt familiar, even if I didn’t know why. I’m a bit disappointed now that she didn’t sign on officially, but my guess is that Altman just burned his bridge to half of Hollywood if he is looking for a plan B.

◧◩◪◨⬒⬓⬔
150. mrbung+DL[view] [source] [discussion] 2024-05-21 05:58:33
>>XorNot+JG
You're not arguing your way out of jurisprudence, especially when the subject is a human and not a device or IP. They (OpenAI) fucked up.
replies(1): >>XorNot+uP
◧◩◪◨
151. ben_w+sM[view] [source] [discussion] 2024-05-21 06:09:57
>>lawn+OE
Much as I dislike crypto, that's more of "having no sense of other people's privacy" (and hubris) than general scamminess.

It's a Musk-error not an SBF-error. (Of course, I do realise many will say all three are the same, but I think it's worth separating the types of mistakes everyone makes, because everyone makes mistakes, and only two of these three also did useful things).

replies(3): >>pwdiss+IN >>lawn+zO >>latexr+Vv1
◧◩◪◨⬒
152. jester+xM[view] [source] [discussion] 2024-05-21 06:10:17
>>svacha+2C
So do all psychopaths, don't they?
replies(2): >>johnny+tN >>cutemo+7J2
◧◩
153. trustn+yM[view] [source] [discussion] 2024-05-21 06:10:20
>>windex+zs
Altman wants to be a part of AI regulation in the same way Bankman-Fried wanted to be a part of cryptocurrency regulation.
replies(5): >>gds44+qP >>askl+s31 >>dqft+cQ1 >>Andrex+DY3 >>graves+t24
◧◩◪◨⬒⬓
154. johnny+tN[view] [source] [discussion] 2024-05-21 06:20:59
>>jester+xM
Studies have found that CEOs have a disproportionately high rate of psychopathy, so there's some correlation. You don't get to the top of a company in this kind of society without having some inherent charm (assuming you aren't simply inheriting billions from a previous generation).
replies(1): >>jester+jI7
◧◩◪◨⬒
155. pwdiss+IN[view] [source] [discussion] 2024-05-21 06:23:09
>>ben_w+sM
> that's more of "having no sense of other people's privacy"

Sufficiently advanced incompetence is indistinguishable from malice.

replies(1): >>ben_w+C01
◧◩◪◨⬒
156. lawn+zO[view] [source] [discussion] 2024-05-21 06:31:51
>>ben_w+sM
It's not just about privacy either.

Worldcoin is centrally controlled making it a classic "scam coin". Decentralization is the _only_ unique thing about cryptocurrencies, when you abandon decentralization all that's left is general scamminess.

(Yes, there's nuance to decentralization too but that's not what's going on with Worldcoin.)

replies(1): >>ben_w+c21
◧◩◪◨⬒⬓⬔
157. XorNot+TO[view] [source] [discussion] 2024-05-21 06:33:44
>>ethbr1+2D
It is wild to me that on HackerNews of all places, you'd think people don't love an underdog story.

Which is what this would be in the not-stupid version of events: they hired a voice actress for the rights to create the voice, she was paid, and then she is basically told by the courts "actually, you're unhireable because you sound too much like an already rich and famous person".

The issue, of course, is that OpenAI's reactions so far don't seem to indicate that they're actually confident they can prove this or that this is the case. Because if this is actually the case, they're going about handling it in the dumbest possible way.

replies(3): >>ml-ano+YR >>Sebb76+1b1 >>jubalf+0k1
◧◩◪
158. surfin+YO[view] [source] [discussion] 2024-05-21 06:34:25
>>beefnu+qJ
It's not just the artists; anything you do in the digital realm and anything that can be digitised is fair game. In the UK, NHS GP practices refuse to register you to see a doctor even when it's urgent and tell you to use a third-party app to book an appointment. You have to use your phone to take photos of the affected area and provide personal info. I fully expect that data to be fed into some AI and sold without me knowing, and without a process for removal of the data should the company go bust. It is preying on the vulnerable when they need help.
replies(3): >>Kineti+oR >>4ndrew+aT >>jjgree+w11
◧◩◪◨
159. mschus+6P[view] [source] [discussion] 2024-05-21 06:35:52
>>smugma+u6
Probably (and rightfully) feared that, had Disney stuck with their position, other MCU actors would be much, much harsher in new contract negotiations - or that some would go as far as to say "nope, I quit".
◧◩◪
160. serial+8P[view] [source] [discussion] 2024-05-21 06:36:01
>>vasili+wv
He must be bringing something to the table, as they tried to get rid of him and failed spectacularly. Business is not only about technical know-how.
replies(1): >>surfin+lR
◧◩◪◨⬒
161. smt88+lP[view] [source] [discussion] 2024-05-21 06:39:09
>>aseipp+Om
Courts have settled almost nothing related to AI. We don't even know if training AI using copyrighted works is a violation of copyright law.

Please point to a case where someone was successfully sued for sounding too much like a celebrity (while not using the celebrity's name or claiming to be them).

replies(2): >>davidg+gU >>ascorb+BW
◧◩◪
162. gds44+qP[view] [source] [discussion] 2024-05-21 06:39:57
>>trustn+yM
What's really interesting about our timeline is that, when you look at the history of market capture in Big Oil, Telco, Pharma, Real Estate, Banks, Tobacco, etc., all the lobbying, bribing, and competition-killing used to be done behind the scenes within elite circles.

The public hardly heard from or saw the management of these firms in the media until shit hit the fan.

Today it feels like management is in the media every 3 hours trying to capture the attention of prospective customers, investors, employees, etc., or they lose out to whoever is out there capturing more attention.

So false and contradictory signalling is easy to see. Hopefully out of all this chaos we get a better class of leaders, not a better class of panderers.

replies(2): >>hoseja+GY >>detour+cH6
◧◩◪◨
163. surfin+tP[view] [source] [discussion] 2024-05-21 06:40:34
>>mbrees+Qv
I had a conversation with someone responsible for introducing LLMs into a process that involves personal information. That person rejected my concern about one person's data appearing in the report on another person; he told me it would be possible to train the AI to avoid that. The rest of the conversation convinced me that AI is seen as magic that can do anything. It seems to me that we are seeing a split between those who don't understand it and fear it, and those who don't understand it but want to align themselves with it. The latter are the ones I fear the most.
replies(1): >>komboo+201
◧◩◪◨⬒⬓⬔⧯
164. XorNot+uP[view] [source] [discussion] 2024-05-21 06:40:37
>>mrbung+DL
There is no clear jurisprudence on this. They're only in trouble if they actually used ScarJo's voice samples to train the model, or if they intentionally tried to pass their imitation off as her without her permission.

The biggest problem on that front (assuming the former is not true) is Altman's tweets, but court-wise that's defensible (though I retract what I had here previously - probably not easily) as a reference to the general concept of the movie.

Because otherwise the situation you have is OpenAI seeking a particular style, hiring someone who can provide it, not trying to pass it off as that person (give or take the tweets), and the intended result effectively being: "random voice actress, you sound too much like an already rich and famous person. Good luck having no more work in your profession" - which would be the actual outcome.

The question entirely hinges on whether they included any data at all containing ScarJo's voice samples in the training, and also whether it actually does sound similar enough - Frito-Lay went down because of intent and similarity. There's the hilarious outcome here that the act of trying to contact ScarJo is the actual problem they had.

EDIT 2: Of note also - to have a case, they actually have to show reputational harm. Of course on that front, the entire problem might also be Altman. Continuing the trend I suppose of billionaires not shutting up on Twitter being the main source of their legal issues.

replies(2): >>einher+bU >>zdp7+7O5
◧◩
165. surfin+OP[view] [source] [discussion] 2024-05-21 06:43:45
>>barbar+DI
Welcome to the world where the "fuck the creatives" brigade wants everything for free.
◧◩◪◨⬒
166. smt88+6Q[view] [source] [discussion] 2024-05-21 06:46:11
>>jonath+vC
That case seems completely dissimilar to what OpenAI did.

Frito-Lay copied a song by Waits (with different lyrics) and had an impersonator sing it. Witnesses testified they thought Waits had sung the song.

If OpenAI were to anonymously copy someone's voice by training AI on an imitation, you wouldn't have:

- a recognizable singing voice

- music identified with a singer

- market confusion about whose voice it is (since it's novel audio coming from a machine)

I don't think any of this is ethical and think voice-cloning should be entirely illegal, but I also don't think we have good precedents for most AI issues.

replies(1): >>jonath+f31
◧◩◪◨⬒
167. nicce+mQ[view] [source] [discussion] 2024-05-21 06:49:35
>>visarg+Yx
> When studios approach an actress A and she refuses, then another actress B takes the role, is that infringing on A's rights? Or should they just scrap the movie?

The scenario would have been that they approach none.

replies(1): >>r2_pil+F73
◧◩◪◨⬒
168. poloti+5R[view] [source] [discussion] 2024-05-21 06:55:59
>>svacha+2C
Do you have a link to a video of Altman's voice shifting from controlled deep to nasal? The videos of Elizabeth Holmes not being able to keep up with the faked deep tone are textbook-worthy...
◧◩◪◨
169. surfin+lR[view] [source] [discussion] 2024-05-21 06:58:03
>>serial+8P
Microsoft. They are protecting their investment.
◧◩◪◨⬒
170. wrapti+mR[view] [source] [discussion] 2024-05-21 06:58:13
>>kristi+0K
I feel that this is a classic tale of success getting to you. It almost feels like it's impossible to be successful at this level and remain true. At least, I haven't seen it yet.
replies(1): >>Intral+Sz2
◧◩◪◨
171. Kineti+oR[view] [source] [discussion] 2024-05-21 06:58:32
>>surfin+YO
Last time I booked a blood test it was via the official NHS app, not a third party.
replies(1): >>surfin+mS
◧◩◪◨⬒
172. safety+IR[view] [source] [discussion] 2024-05-21 07:00:32
>>azinma+Rz
There is something to it. Someone has to identify the intersection between what the engineering can do and what the market actually wants, then articulate that to a broad enough audience. Engineers constantly undervalue this very fuzzy and very human-centric part of the work.

I don't think the issue is that vision doesn't matter. I think the issue is that Sam doesn't have it. Gates and Jobs had clear, well-defined visions for how the PC was going to change the world, then rallied engineering talent around them and turned those visions into reality; that's how their billions and those lasting empires were born. Maybe someone like Elon Musk is a contemporary example. I just don't see anything like that from SamA. We see him in the media, talking a lot about AI, rubbing shoulders with power brokers, being cutthroat, but where's the vision of a better future? And if he comes up with one, does he really understand the engineering well enough to ground it in reality?

replies(1): >>azinma+j12
◧◩◪◨
173. verisi+TR[view] [source] [discussion] 2024-05-21 07:01:41
>>silver+fF
I didn't know about that, strange:

https://www.lesswrong.com/posts/QDczBduZorG4dxZiW/sam-altman...

replies(1): >>lrvick+cW
◧◩◪◨⬒⬓⬔⧯
174. ml-ano+YR[view] [source] [discussion] 2024-05-21 07:02:05
>>XorNot+TO
It's wild to me that there are people who think that OpenAI are the underdog. An 80Bn Microsoft vassal, what a plucky upstart.

You realise that there are multiple employees, including the CEO, publicly drawing direct comparisons to the movie Her after having tried and failed twice to hire the actress who starred in the movie? There is no non-idiotic reading of this.

replies(1): >>XorNot+lW
◧◩◪◨⬒
175. surfin+mS[view] [source] [discussion] 2024-05-21 07:06:35
>>Kineti+oR
https://www.patientaccess.com/
◧◩◪◨
176. sleepy+QS[view] [source] [discussion] 2024-05-21 07:13:20
>>dewbri+oF
Doesn't the 'her' tweet give away the game? As you said, it was a Midler impersonator singing one of Midler's songs. In this case they have a voice for their AI assistant/phone sex toy that is very much like the actress who played a famous AI assistant/phone sex toy. Even if he is taken as meaning the concept, it's very, very damning. If they had instead mimicked the voice of another famous actor who hasn't played a robot/AI/whatever and used that, would that really be any better though? Christopher Walken, say, or hell, Bette Midler?
◧◩◪◨
177. 4ndrew+aT[view] [source] [discussion] 2024-05-21 07:16:36
>>surfin+YO
Important to note that "The NHS" is not a single entity, and the GP practice is likely a private entity owned in partnership by the doctors. There are a number of reasons why individual practices can refuse to register you.

Take your point about LLMs though.

replies(1): >>surfin+GT
◧◩◪◨⬒
178. surfin+GT[view] [source] [discussion] 2024-05-21 07:22:14
>>4ndrew+aT
I went to see my GP and the lady at reception told me they no longer book visits at the reception and I had to use the app. Here's the privacy policy: https://support.patientaccess.com/privacy-policy They reserve the right to pass your data to third-party contractors and to use it for marketing purposes. There is the obligatory clause regarding the right to be forgotten, but the AI companies claim it is impossible to implement.
replies(1): >>4ndrew+qU
◧◩
179. creato+XT[view] [source] [discussion] 2024-05-21 07:26:33
>>barbar+DI
I think the unique thing about this case is not specifically the "voice theft", but that OpenAI specifically asked for permission and were denied, which eliminates most of the usual plausible deniability that gets trotted out in these cases.
replies(1): >>Lia_th+9V2
◧◩◪◨⬒⬓⬔
180. einher+4U[view] [source] [discussion] 2024-05-21 07:29:37
>>XorNot+JG
Grey laptops that share some ideas in their outline while being distinct enough to not get lawyers from Cupertino on their necks?
◧◩◪◨⬒⬓⬔⧯▣
181. einher+bU[view] [source] [discussion] 2024-05-21 07:30:57
>>XorNot+uP
Are you a lawyer?
◧◩◪◨⬒⬓
182. davidg+gU[view] [source] [discussion] 2024-05-21 07:31:28
>>smt88+lP
Multiple cases already answering your question in this thread.
◧◩◪◨⬒⬓
183. 4ndrew+qU[view] [source] [discussion] 2024-05-21 07:34:13
>>surfin+GT
I didn't read that as reserving the right - it looks like a standard DPIA that is opt-in and limited.

However, GP practices are essentially privatised - so you do have the right to register at another practice.

◧◩◪◨⬒⬓
184. Findec+HU[view] [source] [discussion] 2024-05-21 07:38:47
>>garbth+RI
If his sister's words about sexually abusing her are true, he should be in jail.
replies(1): >>bryanr+gZ
◧◩◪
185. mlindn+1V[view] [source] [discussion] 2024-05-21 07:42:22
>>ocodo+ex
I'm glad more people are thinking this. It's amazing that he got his way back into OpenAI somehow. I said he shouldn't go back to OpenAI and got downvoted universally, both here and on Reddit.
replies(1): >>renega+gm5
◧◩◪◨⬒⬓
186. Findec+sV[view] [source] [discussion] 2024-05-21 07:47:22
>>parine+Cs
Oh, the day when an artist could sample other artists without attribution and royalties is long gone. The music labels are very hard on this these days.
◧◩◪◨⬒
187. lrvick+cW[view] [source] [discussion] 2024-05-21 07:55:44
>>verisi+TR
"Some commenters on Hacker News claim that a post regarding Annie's claims that Sam sexually assaulted her at age 4 has been being repeatedly removed."

Whelp. Let us see if this one sticks.

◧◩◪◨⬒⬓⬔⧯▣
188. XorNot+lW[view] [source] [discussion] 2024-05-21 07:57:23
>>ml-ano+YR
You're reading my statements as defending OpenAI. Put on your "I'm the PR department" hat and figure out what you'd do if you were OpenAI given various permutations of the possible facts here.

That's what I'm discussing.

Edit: which is to say, I think Sam Altman may have been a god damn idiot about this, but it's also wild anyone thought that ScarJo or anyone in Hollywood would agree - AI is currently the hot-button issue there and you'd find yourself a much closer target of their ire.

replies(1): >>ml-ano+o71
◧◩◪◨⬒⬓
189. ascorb+BW[view] [source] [discussion] 2024-05-21 07:59:24
>>smt88+lP
Midler vs Ford: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
◧◩
190. belter+9Y[view] [source] [discussion] 2024-05-21 08:11:27
>>windex+zs
This whole exchange from 1:04:53 to 1:10:22 takes on a whole different meaning....

https://youtu.be/P_ACcQxJIsg?t=3891

◧◩◪◨
191. hoseja+GY[view] [source] [discussion] 2024-05-21 08:15:17
>>gds44+qP
So great to have twitter so the narcissistic psychopaths can't resist revealing themselves for clout.
replies(1): >>Toucan+af2
◧◩◪◨⬒⬓⬔
192. bryanr+gZ[view] [source] [discussion] 2024-05-21 08:20:49
>>Findec+HU
No, in that case he should have been in the juvenile incarceration system - unless the argument is that he should have been charged as an adult, or that juvenile abusers should always be charged and sentenced as adults, or that juvenile sex offenders who were not charged as juveniles should be charged as adults.

Which one?

On edit: this is based on the American legal system; you may come from a legal system with different rules.

◧◩◪◨⬒
193. komboo+201[view] [source] [discussion] 2024-05-21 08:26:30
>>surfin+tP
The "AI is magic and we should simply believe" attitude is even being actively promoted, because all these VC hucksters need it.

Any criticism of AI is being met with "but if we all just hype AI harder, it will get so good that your criticisms won't matter" or flat out denied. You've got tech that's deeply flawed with no obvious way to get unflawed, and the current AI 'leaders' run companies with no clear way to turn a profit other than being relentlessly hyped on proposed future growth.

It's becoming an extremely apparent bubble.

replies(1): >>surfin+Z01
◧◩◪◨⬒⬓
194. ben_w+C01[view] [source] [discussion] 2024-05-21 08:33:06
>>pwdiss+IN
It's not particularly advanced; it's the same thing that means the supermajority of websites have opted for "click here to consent to our 1200 partners processing everything you do on our website" rather than "why do we need 1200 partners anyway?"

It's still bad, don't get me wrong, it's just something I can distinguish.

replies(2): >>Intral+oz2 >>CRConr+iSk
◧◩◪◨⬒⬓
195. surfin+Z01[view] [source] [discussion] 2024-05-21 08:36:30
>>komboo+201
On the plus side, lots of cheap Nvidia cards heading for eBay once it bursts.
◧◩
196. akudha+511[view] [source] [discussion] 2024-05-21 08:36:51
>>windex+zs
What is so special about her voice? They could’ve found a college student with a sweet voice and offered to pay her tuition in exchange for using her voice, no? Or a voice actor?

Why be so cartoonishly stupid, such a cartoonish arsehole, and steal a celebrity's voice? Did he think Scarlett wouldn't find out? Or object?

I don’t understand these rich people. Is it their hobby to be a dick to as many people as they can, for no reason other than their amusement? Just plain weirdos

replies(2): >>meat_m+O11 >>sage76+Sd1
◧◩◪◨
197. jjgree+w11[view] [source] [discussion] 2024-05-21 08:42:12
>>surfin+YO
App? What's an app?

It's a thing you put on your phone

I don't have a phone

Well, we can't register you

You don't accept people who don't have phones? Could I have that in writing please, ..., oh, your signature on that please ...

◧◩◪
198. meat_m+O11[view] [source] [discussion] 2024-05-21 08:45:16
>>akudha+511
Scarlett voiced Samantha, an AI in the movie "Her"

Considering the movie's 11 years old, it's surprisingly on-point with depictions of AI/human interactions, relations, and societal acceptance. It does get a bit speculative and imaginative at the end though...

But I imagine that movie did/does spark the imagination of many people, and I guess Sam just couldn't let it go.

replies(2): >>mike_h+7d1 >>themad+iP6
◧◩
199. xinayd+U11[view] [source] [discussion] 2024-05-21 08:45:58
>>windex+zs
> The thing about the situation is that Altman is willing to lie and steal a celebrity's voice for use in ChatGPT. What he did, the timeline, everything - is sleazy if, in fact, that's the story.

Correction: the thing about this whole situation with OpenAI is that they are willing to steal everything for use in ChatGPT. They trained their model with copyrighted data, and for some reason they won't delete the millions of protected works they used to train the AI model.

replies(1): >>JohnFe+op2
◧◩◪◨⬒⬓
200. ben_w+c21[view] [source] [discussion] 2024-05-21 08:47:57
>>lawn+zO
True decentralisation is part of the problem with cryptocurrencies and why they can't work the way the advocates want them to.

Decentralisation allows trust-less assurance that money is sent; it's just that that's not useful, because the goods or services for which the money is transferred still need either trust or a centralised system that can undo the transaction when fraud happens.

That's where smart contracts come in, which I also think are a terrible idea, but do at least deserve a "you tried!" badge, because they're as dumb as saying "I will write bug-free code" rather than as dumb as "let's build a Dyson swarm to mine exactly the same amount of cryptocurrency as we would have if we did nothing".

replies(1): >>lawn+hd1
◧◩
201. chx+o21[view] [source] [discussion] 2024-05-21 08:49:06
>>windex+zs
Altman is a known conman. Surely you are aware of Yishan Wong describing how Sam Altman and the Reddit founders conned Conde Nast https://reddit.com/r/AskReddit/comments/3cs78i/whats_the_bes...
replies(1): >>sirsin+o51
◧◩◪◨⬒⬓
202. jonath+f31[view] [source] [discussion] 2024-05-21 08:55:40
>>smt88+6Q
Let me connect the dots for you.

Company identifies a celebrity voice they want. (Frito=Waits, OpenAI=ScarJo)

Company comes up with a novel thing for the voice to say. (Frito=song, OpenAI=ChatGPT)

Company decides they don't need the celebrity they want (Frito=Waits, OpenAI=ScarJo) and instead hires an impersonator (Frito=singer, {OpenAI=impersonator or OpenAI=ScarJo-public-recordings}) to get what they want (Frito=a-facsimile-of-Tom-Waits's-voice-in-a-commercial, OpenAI=a-facsimile-of-ScarJo's-voice-in-their-chatbot).

When made public, people confuse the facsimile with the real thing.

I don't see how you don't see a parallel. It's literally beat for beat the same, particularly around the part about using an impersonator as an excuse.

◧◩
203. splatz+l31[view] [source] [discussion] 2024-05-21 08:56:01
>>thaton+Mm
There’s a nice YouTube doc telling the story of this, and Tom Waits’ hatred of advertising - https://youtu.be/W7J01e-OIMA?si=57IJooNwg5oTfh62
◧◩◪
204. askl+s31[view] [source] [discussion] 2024-05-21 08:56:42
>>trustn+yM
I always had trouble telling apart those two Sams. Turns out they're the same person.
◧◩◪
205. sirsin+o51[view] [source] [discussion] 2024-05-21 09:09:03
>>chx+o21
Wow, Altman in the replies there:

> Cool story bro.

> Except I could never have predicted the part where you resigned on the spot :)

> Other than that, child's play for me.

>Thanks for the help. I mean, thanks for your service as CEO.

◧◩◪
206. m000+b71[view] [source] [discussion] 2024-05-21 09:20:38
>>ocodo+ex
Aspiring technofeudalist.
◧◩◪◨⬒⬓⬔⧯▣▦
207. ml-ano+o71[view] [source] [discussion] 2024-05-21 09:22:23
>>XorNot+lW
Then why bother mentioning an "underdog story" at all?

Who is the underdog in this situation? In your comment it seems like you're framing OpenAI as the underdog (or perceived underdog), which is just bonkers.

Hacker News isn't a hivemind, and there are those of us who work in GenAI who are firmly on the side of the creatives and, gasp, even rights holders.

◧◩◪◨⬒⬓⬔⧯
208. Sebb76+1b1[view] [source] [discussion] 2024-05-21 09:48:35
>>XorNot+TO
> they hired a voice actress for the rights to create the voice, she was paid, and then is basically told by the courts "actually you're unhireable because you sound too much like an already rich and famous person".

There are quite a few issues here: First, this is assuming they actually hired a voice-alike person, which is not confirmed. Second, they are not an underdog (the voice actress might be, but she's most likely pretty unaffected by this drama). Finally, they were clearly aiming to impersonate ScarJo (as confirmed by them asking for permission and sama's tweet), so this is quite a different issue than "accidentally" hiring someone who "just happens to" sound like ScarJo.

◧◩◪◨
209. mike_h+7d1[view] [source] [discussion] 2024-05-21 10:03:47
>>meat_m+O11
It's not just that. Originally the AI voice in Her was played by someone else, but Spike Jonze felt strongly that the movie wasn't working and recast the part with Johansson. The movie immediately worked much better and became a sleeper hit. Johansson just has a much better-fitting voice and higher skill in voice acting for this kind of role, to the extent that it may have been a make-or-break choice for the movie. It isn't a surprise that, after having created the exact tech from the movie, OpenAI wanted it to have the same success that Jonze had with his character.

It's funny that just seven days ago I was speculating that they deliberately picked someone whose voice is very close to Scarlett's and was told right here on HN, by someone who works in AI, that the Sky voice doesn't sound anything like Scarlett and it is just a generic female voice:

https://news.ycombinator.com/item?id=40343950#40345807

Apparently .... not.

◧◩◪◨⬒⬓⬔
210. lawn+hd1[view] [source] [discussion] 2024-05-21 10:05:16
>>ben_w+c21
> Decentralisation allows trust-less assurance that money is sent

That is indeed something it does.

But it also gives you the assurance that a single entity can't print unlimited money out of thin air, which is the case with a centrally controlled currency like Worldcoin.

They can just shrug their shoulders and claim that all that money is for the poor and gullible Africans that had their eyeballs scanned.

replies(1): >>ben_w+pf1
◧◩◪
211. sage76+Sd1[view] [source] [discussion] 2024-05-21 10:12:17
>>akudha+511
> Is it their hobby to be a dick to as many people as they can, for no reason other than their amusement? Just plain weirdos

They seem to love "testing" how much they can bully someone.

I remember a few experiences where someone responded by being an even bigger dick, and they disappeared fast.

◧◩◪◨⬒⬓⬔⧯
212. ben_w+pf1[view] [source] [discussion] 2024-05-21 10:25:59
>>lawn+hd1
> But it also gives you the assurance that a single entity can't print unlimited money out of thin air, which is the case with a centrally controlled currency like Worldcoin.

Sure, but the inability to do that when needed is also a bad thing.

Also, single world currencies are (currently) a bad thing, because when your bit of the world needs to devalue its currency is generally different to when mine needs to do that.

But this is why economics is its own specialty and not something that software nerds should jump into as if our experience with numbers counts for much :D

replies(1): >>bavell+rR1
◧◩◪◨⬒⬓⬔⧯
213. jubalf+0k1[view] [source] [discussion] 2024-05-21 11:07:59
>>XorNot+TO
An obnoxious, sleazy millionaire backed by Microsoft is by no means "an underdog".
◧◩◪◨
214. dontup+ko1[view] [source] [discussion] 2024-05-21 11:46:34
>>imjons+PC
Are they really the ones with the best chance now though?

They're basically owned by Microsoft, they're bleeding tech/ethical talent and credibility, and most importantly Microsoft Research itself is no slouch (especially post-DeepMind poaching) - things like Phi are breaking ground on planets that OpenAI hasn't even touched.

At this point I'm thinking they're destined to become nothing but a premium marketing brand for Microsoft's technology.

◧◩
215. latexr+Cu1[view] [source] [discussion] 2024-05-21 12:22:10
>>windex+zs
> The thing about the situation is that Altman is willing to lie and steal a celebrity's voice for use in ChatGPT.

He lies and steals much more than that. He’s the scammer behind Worldcoin.

https://www.technologyreview.com/2022/04/06/1048981/worldcoi...

https://www.buzzfeednews.com/article/richardnieva/worldcoin-...

> Altman is, and wants to be, a large part of AI regulation. Quite the public contradiction.

That’s as much of a contradiction as a thief wanting to be a large part of lock regulation. What better way to ensure your sleazy plans benefit you, and preferably only you but not the competition, than being an active participant in the inevitable regulation while it’s being written?

replies(1): >>ben_w+Mx1
◧◩◪◨⬒
216. latexr+Vv1[view] [source] [discussion] 2024-05-21 12:29:25
>>ben_w+sM
> that's more of "having no sense of other people's privacy" (and hubris) than general scamminess.

It’s both.

>>40427454

◧◩◪
217. zwily+jw1[view] [source] [discussion] 2024-05-21 12:31:38
>>sanxiy+G1
The previous ChatGPT voice mode uses text-to-speech, and has different voices from the OpenAI API. Seemed weird to me too.
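For reference, a minimal sketch of how a named voice is requested from the API's speech endpoint (assuming the current openai Python SDK and an API key in the environment; "nova" is one of the documented API voice names, which are separate from the app's built-in voices):

    # Minimal sketch, assuming the openai Python SDK (v1+) and OPENAI_API_KEY set.
    from openai import OpenAI

    client = OpenAI()

    # The API exposes a fixed set of named voices (e.g. "nova", "shimmer");
    # the consumer app's voice mode has historically shipped its own, separate set.
    response = client.audio.speech.create(
        model="tts-1",
        voice="nova",
        input="Hello from the text-to-speech API.",
    )
    response.stream_to_file("hello.mp3")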
◧◩◪
218. latexr+yw1[view] [source] [discussion] 2024-05-21 12:33:11
>>vasili+wv
> if this account is true, Sam Altman is a deeply unethical human being.

This isn’t even close to the most unethical thing he has done. This is peanuts compared to the Worldcoin scam.

>>40427454

◧◩◪
219. ben_w+Mx1[view] [source] [discussion] 2024-05-21 12:40:00
>>latexr+Cu1
> That’s as much of a contradiction as a thief wanting to be a large part of lock regulation.

Based on what I see in the videos from The Lockpicking Lawyer, that would be a massive improvement.

Now, the NSA and crypto standards, that would have worked as a metaphor for your point.

(I don't think it's correct, but that's an independent claim, and I am not only willing to discover that I'm wrong about their sincerity, I think everyone writing that legislation should actively assume the worst while they do so).

replies(3): >>latexr+MC1 >>maeln+vD1 >>lawn+WH1
220. belter+Iy1[view] [source] 2024-05-21 12:45:05
>>nickth+(OP)
https://x.com/jam3scampbell/status/1791338109709287511
◧◩◪◨
221. latexr+MC1[view] [source] [discussion] 2024-05-21 13:03:51
>>ben_w+Mx1
> Based on what I see in the videos from The Lockpicking Lawyer

The Lockpicking Lawyer is not a thief, so I don’t get your desire to incorrectly nitpick. Especially when you clearly understood the point.

replies(1): >>ben_w+vF1
◧◩◪◨
222. maeln+vD1[view] [source] [discussion] 2024-05-21 13:06:59
>>ben_w+Mx1
> > That’s as much of a contradiction as a thief wanting to be a large part of lock regulation.

> Based on what I see in the videos from The Lockpicking Lawyer, that would be a massive improvement.

A thief is not a lock picker and they don't have the same incentive. A thief in a position to dictate lock regulation would try to have a legal backdoor on every lock in the world. One that only he has the master key for. Something something NSA & cryptography :)

replies(1): >>ben_w+6U1
◧◩◪◨
223. lesost+UE1[view] [source] [discussion] 2024-05-21 13:15:32
>>parine+as
IANAL, but – I think it's most likely to be an infringement on her right of publicity (i.e. the right to control the commercial value of your name, likeness, etc.)

She doesn't have to own anything to claim this right, if the value of her voice is recognizable.

◧◩◪◨⬒
224. ben_w+vF1[view] [source] [discussion] 2024-05-21 13:19:02
>>latexr+MC1
You noticed your confusion but still went on the aggressive, huh. Ah well.

"A is demonstrating a proof of B" does not require "A is a clause in B".

A being TLPL, B being that the entire lock industry is bad, so bad that anyone with experience would be a massive improvement, for example a thief.

replies(1): >>latexr+AI1
◧◩◪◨
225. lawn+WH1[view] [source] [discussion] 2024-05-21 13:31:02
>>ben_w+Mx1
> Based on what I see in the videos from The Lockpocking Lawyer, that would be a massive improvement.

If you've watched his videos then surely you should know that lockpicking isn't even on the radar for thieves as there are much easier and faster methods such as breaking the door or breaking a window.

replies(1): >>ben_w+DU1
◧◩◪◨⬒⬓
226. latexr+AI1[view] [source] [discussion] 2024-05-21 13:35:23
>>ben_w+vF1
I'm not confused, and my reply was not aggressive. I don't think it will be a good use of time to continue this conversation, because discussions should get more substantive as they go on and this was an irrelevant tangent I have no desire to get sucked into.

Other people have commented to further explain the point in other words. I recommend you read those; perhaps they'll make you understand.

>>40428005

>>40428280

◧◩◪
227. firebi+LP1[view] [source] [discussion] 2024-05-21 14:14:06
>>ocodo+ex
Almost every tech CEO is like that. I could list many examples. It's an effect of capitalism.
◧◩◪
228. dqft+cQ1[view] [source] [discussion] 2024-05-21 14:17:12
>>trustn+yM
(AI)tman tries to be, (Bank)man fried to be; who is letting Kojima name all these villains?
replies(1): >>tmaly+Yr6
◧◩◪◨⬒⬓⬔⧯▣
229. bavell+rR1[view] [source] [discussion] 2024-05-21 14:23:52
>>ben_w+pf1
> Sure, but the inability to do that when needed is also a bad thing.

When and why would BTC or ETH need to print unlimited money and devalue themselves?

replies(2): >>ben_w+jT1 >>petter+Dn3
◧◩◪◨
230. Nasrud+3T1[view] [source] [discussion] 2024-05-21 14:30:58
>>gibbit+6j
That is wrong on several levels. First off, it ignores the role of massive researchers. Should we start de-automating processes to employ more people at the cost of worsening margins?
◧◩◪◨⬒⬓⬔⧯▣▦
231. ben_w+jT1[view] [source] [discussion] 2024-05-21 14:32:26
>>bavell+rR1
Wrong framing: currencies don't have agency. You should be asking when you would need your currency to be devalued, regardless of what it's called or made from.

And the answer to that is all the reasons governments do just that, except for the times where the government is being particularly stupid and doing hyperinflation.

replies(1): >>bavell+CU1
◧◩◪◨⬒
232. ben_w+6U1[view] [source] [discussion] 2024-05-21 14:35:52
>>maeln+vD1
Indeed, but locks are so bad (as demonstrated by LPL) that even a thief would make them better.

> Something something NSA & cryptography :)

Indeed, as I said :)

◧◩◪◨⬒⬓⬔⧯▣▦▧
233. bavell+CU1[view] [source] [discussion] 2024-05-21 14:37:30
>>ben_w+jT1
Not a very convincing answer at all.
replies(3): >>ben_w+VU1 >>freeja+AL2 >>CRConr+6Rk
◧◩◪◨⬒
234. ben_w+DU1[view] [source] [discussion] 2024-05-21 14:37:35
>>lawn+WH1
Surely that just makes the thief/locks metaphor even worse? Or have I missed something?
◧◩◪◨⬒⬓⬔⧯▣▦▧▨
235. ben_w+VU1[view] [source] [discussion] 2024-05-21 14:38:23
>>bavell+CU1
What would a convincing answer look like?
◧◩◪◨⬒⬓⬔
236. parine+eV1[view] [source] [discussion] 2024-05-21 14:40:02
>>ncalla+au
> I don’t think so. I’ve narrowed my comments specifically to Effective Altruists who are making utilitarian trade-offs to justify known moral wrongs.

Did you?

> Effective Altruists are just shitty utilitarians that never take into account all the myriad ways that unmoderated utilitarianism has horrific failure modes.

replies(1): >>ncalla+ar2
◧◩◪◨⬒⬓
237. unrave+PY1[view] [source] [discussion] 2024-05-21 14:58:43
>>kubobl+iG
Ken Segall has a similar Steve Jobs story: he emails Jobs that the Apple legal team has just thrown a spanner in the works days before Ken's agency is set to launch Apple's big ad campaign, and asks what he should do.

Jobs responds minutes later... "Fuck the lawyers."

◧◩◪◨⬒
238. munksb+RZ1[view] [source] [discussion] 2024-05-21 15:03:40
>>voltai+vc
> If they didn’t think they needed permission, why ask for permission multiple times and then yank it when she noticed?

One very easy explanation is that they trained Sky using another voice (this is the claim, and there's no reason to doubt it's true), wanting to replicate the style of the voice in "Her", but would have preferred to use SJ's real voice for the PR impact that could have had.

Yanking it could also easily be a pre-emptive response to avoid further PR drama.

You will obviously decide you don't believe those explanations, but to many of us they're quite plausible; in fact, I'd even suggest likely.

(And none of this precludes Sam Altman and OpenAI being dodgy anyway)

replies(1): >>voltai+uv2
◧◩◪◨⬒
239. munksb+l02[view] [source] [discussion] 2024-05-21 15:05:57
>>ncalla+yr
This is a genuine question. If it turns out they trained Sky on someone else's voice to similarly replicate the style of the voice in "Her", would you be ok with that? If it was proven that the voice was just similar to SJ's, would that be ok?

My view is, of course it is ok. SJ doesn't own the right to a particular style of voice.

◧◩◪◨⬒⬓
240. azinma+j12[view] [source] [discussion] 2024-05-21 15:11:18
>>safety+IR
I don't know enough about him or his vision. He doesn't seem as clear as, say, Jobs was in the past. But I do look at all the amazing things OpenAI has done in a short period of time, and at the fact that the employees overwhelmingly backed him during the whole board chaos. He has also fundraised a lot of money for the company. It appears he's doing more right than wrong, and OpenAI pulled everyone else's pants down.
◧◩◪◨⬒⬓
241. nickle+W22[view] [source] [discussion] 2024-05-21 15:19:05
>>throww+yd
Let me start by saying I despise generative AI and I think most AI companies are basically crooked.

I thought about your comment for a while, and I agree that there is a fine line between "realistic parody" and "intentional deception" that makes deepfake AI almost impossible to defend. In particular I agree with your distinction:

- In matters involving human actors, human-created animations, etc, there should be great deference to the human impersonators, particularly when it involves notable public figures. One major difference is that, since it's virtually impossible for humans to precisely impersonate or draw one another, there is an element of caricature and artistic choice with highly "realistic" impersonations.

- AI should be held to a higher standard because it involves almost no human expression, and it can easily create mathematically-perfect impersonations which are engineered to fool people. The point of my comment is that fair use is a thin sliver of what you can do with the tech, but it shouldn't be stamped out entirely.

I am really thinking of, say, the Joe Rogan / Donald Trump comedic deepfakes. It might be fine under American constitutional law to say that those things must be made so that AI Rogan / AI Trump always refer to each other in those ways, to make it very clear to listeners. It is a distinctly non-libertarian solution, but it could be "necessary and proper" because of the threat to our social and political knowledge. But as a general principle, those comedic deepfakes are works of human political expression, aided by a fairly simple computer program that any CS graduate can understand, assuming they earned their degree honestly and are willing to do some math. It is constitutionally icky (legal term) to go after those people too harshly.

replies(1): >>throww+Xc2
◧◩◪◨⬒
242. gdilla+5c2[view] [source] [discussion] 2024-05-21 15:58:09
>>kristi+0K
He just, to his credit, understands his public persona has to be the non-douchy-tech-bro and the media will eat it up. Much like a politician. He doesn't want to be like Elon or Travis K in public (though he probably agrees with them more than his public persona would imply).
◧◩◪
243. gdilla+oc2[view] [source] [discussion] 2024-05-21 15:59:08
>>vasili+wv
Because so many people ran cover for him, from Paul Graham to the who's who of Silicon Valley.
◧◩◪◨⬒⬓⬔
244. throww+Xc2[view] [source] [discussion] 2024-05-21 16:01:12
>>nickle+W22
I think that as long as a clear "mark of parody" is maintained such that a reasonable person could distinguish between the two, AI parodies are probably fine. The murkiness in my mind is expressly in the situation in the first episode of Black Mirror, where nobody could distinguish between the AI-generated video and the prime minister actually performing the act. Clearly that is not a parody, even if some people might find the situation humorous. But if we're not careful we give people making fake videos cover to hide behind fair use for parody.

I think you and I have the same concerns about balancing damage to the societal fabric against protecting honest speech.

◧◩◪◨⬒⬓
245. static+Ue2[view] [source] [discussion] 2024-05-21 16:09:18
>>KHRZ+li
Before Roe vs Wade was overturned you might have asked if abortion is legal why do abortion rights advocates want legislation passed?

The answer is without legislation you are far more subject to whether a judge feels like changing the law.

◧◩◪◨⬒
246. Toucan+af2[view] [source] [discussion] 2024-05-21 16:10:41
>>hoseja+GY
I mean, to whatever extent it matters. All these outrageously rich morons still have tons of economic and social clout. They still have pages upon pages of fans foaming at the mouth for the opportunity to harass people asking basic questions. They still carry undue influence in our society and in our industry no matter how many times they are "outed."

What does being outed even mean anymore? It's just free advertising from all the outlets that feel they can derive revenue off your name being in their headlines. Nothing happens to them. SBF and Holmes being the notable exceptions, but that's because they stole from rich people.

◧◩◪◨
247. JohnFe+ln2[view] [source] [discussion] 2024-05-21 16:50:38
>>wrapti+3A
This is why it's a mistake to go by "vibes" of a person when they're speaking to an audience. Pay attention to what they do, not what they say.
◧◩◪◨⬒
248. JohnFe+2o2[view] [source] [discussion] 2024-05-21 16:54:52
>>parpfi+yG
Ideas are a dime a dozen. The value of "idea men" isn't their ideas, it's their ability to rally people around them. It's the exact same skill that con men use for nefarious purposes.
◧◩◪◨
249. JohnFe+io2[view] [source] [discussion] 2024-05-21 16:56:20
>>jcranm+wC
WorldCoin is how I first heard of him, and it's what made me think he was a bad actor. I think of it as a red flag, not yellow.
◧◩◪
250. JohnFe+op2[view] [source] [discussion] 2024-05-21 17:00:52
>>xinayd+U11
Using other people's data for training without their permission is the "original sin" of LLMs[1]. That will, at best, be a shadow over the entire field for an extremely long time.

[1] Just to head off people saying that such a use is not a copyright violation -- I'm not saying it is. I'm just saying that it's extremely sketchy and, in my view, ethically unsupportable.

◧◩◪◨⬒⬓⬔⧯
251. ncalla+ar2[view] [source] [discussion] 2024-05-21 17:09:07
>>parine+eV1
Sure, I should’ve said I tried to or I intended to:

You can see another comment here, where I acknowledge I communicated badly, since I've had to clarify multiple times what I was intending: >>40424566

This is the paragraph that was intended to narrow what I was talking about:

> I will say, when EA are talking about where they want to donate their money with the most efficacy, I have no problem with it. When they start talking about the utility of committing crimes or other moral wrongs because the ends justify the means, I tend to start assuming they’re bad at morality and ethics.

That said, I definitely should’ve said “those Effective Altruists” in the first paragraph to more clearly communicate my intent.

◧◩◪◨⬒⬓
252. voltai+uv2[view] [source] [discussion] 2024-05-21 17:31:52
>>munksb+RZ1
I actually believe that's quite plausible. The trouble is, by requesting permission in the first place, they demonstrated intent, which is legally significant. I think a lot of your confusion comes from attempting to apply pure logic to a legal issue. They are not the same thing, and the latter relies heavily on existing precedent, of which you may, it seems, be unaware.
◧◩◪◨⬒⬓
253. voltai+6w2[view] [source] [discussion] 2024-05-21 17:36:05
>>parine+Yt
They are trying to wriggle out of providing insight into how that voice was derived at all (like Google with the 100% of damages check). It would really suck for OpenAI if, for example, Altman had at some point emailed his team to ensure the soundalike was "as indistinguishable from Scarlett's performance in HER as possible."

Public figures own their likeness and control its use. Not to mention that in this case OA is playing chicken with studios as well. Not a great time to do so, given their stated hopes of supplanting 99% of existing Hollywood creatives.

◧◩◪◨⬒⬓⬔
254. Intral+oz2[view] [source] [discussion] 2024-05-21 17:55:46
>>ben_w+C01
If it fools billions of people and does significant damage to the lives of people, then it's plenty advanced to me, even if it happens through a more simple or savant-like process than something that looks obviously deliberate.

I don't think the cookies thing is a good example. That's passive incompetence, to avoid the work of changing their business models. Altman actively does more work to erode people's rights.

> It's still bad, don't get be wrong, it's just something I can distinguish.

Can you? Plausible deniability is one of the first things in any malicious actor's playbook. "I meant well…" If there's no way to know, then you can only assess the pattern of behavior.

But realistically, nobody sapient accidentally spends multiple years building elaborate systems for laundering other people's IP, privacy, and likeness, and accidentally continues when they are made aware of the harms and explicitly asked multiple times to stop…

◧◩◪◨⬒⬓
255. Intral+Sz2[view] [source] [discussion] 2024-05-21 17:57:59
>>wrapti+mR
Or "success" itself acts as a filter selecting for those who are ruthless enough to do amoral and immoral things.
◧◩◪◨⬒
256. Intral+lA2[view] [source] [discussion] 2024-05-21 18:00:06
>>ncalla+Vn
Plus, describing this as "speed the rate of life saving ai technology in the hands of everyone" is… A Reach.
◧◩◪◨⬒
257. Intral+tC2[view] [source] [discussion] 2024-05-21 18:11:44
>>ncalla+Vn
The central contention of Effective Altruism, at least in practice if not in principle, seems to be that the value of thinking, feeling persons can be and should be reduced to numbers and objects that you can do calculations on.

Maybe there's a way to do that right. I suppose like any other philosophy, it ends up reflecting the personalities and intentions of the individuals which are attracted to and end up adopting it. Are they actually motivated by identifying with and wanting to help other people most effectively? Or are they just incentivized to try to get rid of pesky deontological and virtue-based constraints like empathy and universal rights?

◧◩◪◨
258. Intral+cD2[view] [source] [discussion] 2024-05-21 18:15:09
>>lawn+OE
I think people tend to assume our own values and experiences have some degree of being universal.

So scammers see other scammers, and they just think there's nothing wrong with it.

While normal people who act in good faith see scammers, and instinctively think that there must be a good reason for it, even (or especially!) if it looks sketchy.

I think this happens a lot. Not just with Altman, though that is a prominent currently ongoing example.

Protecting yourself from dark triad type personalities means you need to be able to understand a worldview and system of values and axioms that is completely different from yours, which is… difficult. …There's always that impulse to assume good faith and rationalize the behavior based on your own values.

◧◩◪◨⬒⬓
259. bonton+HE2[view] [source] [discussion] 2024-05-21 18:21:29
>>justin+Vs
Copilot still tells me I've committed a content policy violation if I ask it to generate an image "in Tim Burton's style". Tim Burton has been openly critical of generative AI.
◧◩
260. Intral+eF2[view] [source] [discussion] 2024-05-21 18:23:46
>>windex+zs
"Not consistently candid", the last board said.

Like many people who try to oppose psychopaths though, they don't seem to be around much anymore.

◧◩◪◨⬒⬓
261. cutemo+7J2[view] [source] [discussion] 2024-05-21 18:43:01
>>jester+xM
Not the physically violent ones with impulse control problems.
◧◩◪◨⬒⬓⬔⧯▣▦▧▨
262. freeja+AL2[view] [source] [discussion] 2024-05-21 18:54:23
>>bavell+CU1
Convincing to who? It's not like crypto is widely used as currency anywhere
◧◩◪
263. Lia_th+9V2[view] [source] [discussion] 2024-05-21 19:45:35
>>creato+XT
Is it really better to take something without asking than to ask and then take it anyway if denied?

In my mind these are close to being equally shitty, but not asking is shittier, because the victim won't necessarily know they've been exploited, which limits the actions they will be able to take to rectify matters.

◧◩◪◨
264. stonog+s13[view] [source] [discussion] 2024-05-21 20:17:54
>>dyno12+dp
When those imitations are commercialized, there is a disclaimer that a celebrity voice is being impersonated, and parody is a legally protected form of speech. OpenAI is not parodying anything, and failed even the low bar of having a disclaimer.
◧◩◪◨⬒⬓
265. r2_pil+F73[view] [source] [discussion] 2024-05-21 20:50:17
>>nicce+mQ
Your scenario leads to the disenfranchisement of lesser-known voice talent, since they cannot take on work that Top Talent has rejected if they happen to resemble them (has nobody here seen their doppelganger before? How unique is a person? 1 in 5 million? Then there are 36 million of that type in just the United States alone). Perhaps it's time to re-evaluate some of the rules in society. It's a healthy thing to do periodically.
◧◩
266. Quantu+tc3[view] [source] [discussion] 2024-05-21 21:12:42
>>barbar+DI
Lots of posts about that. This is new ("news") and celebrity, so it's getting traction at the moment.
◧◩◪◨⬒⬓⬔⧯▣▦
267. petter+Dn3[view] [source] [discussion] 2024-05-21 22:14:32
>>bavell+rR1
When the economy grows, the amount of currency needs to grow as well. Otherwise prices will fall (deflation). That hurts the economy as a whole because, e.g., real wages might increase too much.
◧◩◪◨⬒
268. avarun+ox3[view] [source] [discussion] 2024-05-21 23:14:02
>>chroma+sc
They would create a voice in partnership with her? Not sure why you’re assuming it’s some insanely laborious process. It would obviously have been incredible marketing for them to actually get her on board.
◧◩◪◨⬒⬓
269. EasyMa+Jx3[view] [source] [discussion] 2024-05-21 23:15:57
>>ml-ano+La
They announced a whole new model; that's a positive announcement.
◧◩◪
270. FireBe+IQ3[view] [source] [discussion] 2024-05-22 01:45:38
>>vasili+wv
> if this account is true, Sam Altman is a deeply unethical human being

I thought this when he didn't launch Worldcoin in the US but in Africa, and consistently upped the ante to the point where he was offering people in the poorer parts of the continent amounts that equalled two months' wages or more to scan their retinas.

Why was that necessary? It wasn't to share the VC windfall.

◧◩◪
271. Andrex+DY3[view] [source] [discussion] 2024-05-22 03:20:55
>>trustn+yM
I'm glad I'm not the only one drawing SBF personality comparisons here. I'd throw Martin Shkreli into the mix too for good measure. Awful.
◧◩◪
272. graves+t24[view] [source] [discussion] 2024-05-22 04:09:19
>>trustn+yM
Less poacher turned gamekeeper and more poachers infiltrating the gamekeeper council.
◧◩◪◨
273. renega+gm5[view] [source] [discussion] 2024-05-22 15:38:57
>>mlindn+1V
Altman's biggest accomplishment is being out of the way. Great work is done despite management, not because of it. It's the ability to hire the right people and get out of their hair. Altman himself has no talents; he is not technical. He is just well-connected in the Valley. But at least Altman is not a wrecking ball like Elon Musk, and that's really his only job - to not micromanage.
◧◩◪◨
274. frank_+EG5[view] [source] [discussion] 2024-05-22 17:23:53
>>insane+6D
And almost every thread on HN had its top-voted comments defending and praising Altman, while shrugging off Ilya et al. It was bizarre and disheartening to see that from this community, of all places.
◧◩◪◨⬒⬓⬔⧯▣
275. zdp7+7O5[view] [source] [discussion] 2024-05-22 18:05:09
>>XorNot+uP
A bit late here, but... You are ignoring a lot of evidence. SJ stated that they asked for permission twice, one of those requests coming just days before they released it. The Her tweet would seem to corroborate that it's meant to sound like her. They then took it down (presumably since they aren't confident they'd win and don't want to be subjected to discovery). Because of their tweet, even if the voice actor's normal voice was identical to SJ's, it's pretty clear they were trying to profit off her voice.
replies(1): >>XorNot+pX6
◧◩◪◨
276. tmaly+Yr6[view] [source] [discussion] 2024-05-22 21:03:11
>>dqft+cQ1
You had me thinking we were in some type of simulation for a second.

Bernie Madoff is another funny name we should throw in there.

◧◩◪◨
277. detour+cH6[view] [source] [discussion] 2024-05-22 22:24:40
>>gds44+qP
This happened once the US broadcasters were no longer required to treat news as a civic responsibility.
◧◩◪◨
278. themad+iP6[view] [source] [discussion] 2024-05-22 23:10:06
>>meat_m+O11
Also, I understand that sama considers "Her" his favorite movie. Perhaps, for him, it just had to be ScarJo's voice.
◧◩◪◨⬒⬓⬔⧯▣▦
279. XorNot+pX6[view] [source] [discussion] 2024-05-22 23:54:09
>>zdp7+7O5
I repeatedly referred to the fact that they tried to contact SJ, and Sam Altman's tweet, as being the biggest problems.
◧◩◪◨⬒⬓⬔
280. jester+jI7[view] [source] [discussion] 2024-05-23 06:56:45
>>johnny+tN
All positions of power attract psychopaths, so naturally we find more when we look at those.
◧◩◪◨⬒⬓⬔⧯▣▦▧▨
281. CRConr+6Rk[view] [source] [discussion] 2024-05-28 15:43:57
>>bavell+CU1
There's a difference between the answer being unconvincing to people who understand it and not understanding the answer.
◧◩◪◨⬒⬓⬔
282. CRConr+iSk[view] [source] [discussion] 2024-05-28 15:49:40
>>ben_w+C01
I think those websites actually have only one partner, one of the tiny oligopoly of advertisement brokers. That partner (*cough*Google*cough*), in turn, bows to the fig leaf of user consent via those interminable dialogs. So the site owners' question should probably be "Why do we need to partner with this behemoth that shackles us to 1200 'partners'?"