> Though the most successful founders are usually good people, they tend to have a piratical gleam in their eye. They're not Goody Two-Shoes type good. Morally, they care about getting the big questions right, but not about observing proprieties. That's why I'd use the word naughty rather than evil. They delight in breaking rules, but not rules that matter. This quality may be redundant though; it may be implied by imagination.
> Sam Altman of Loopt is one of the most successful alumni, so we asked him what question we could put on the Y Combinator application that would help us discover more people like him. He said to ask about a time when they'd hacked something to their advantage—hacked in the sense of beating the system, not breaking into computers. It has become one of the questions we pay most attention to when judging applications.
"What We Look for in Founders", PG
https://paulgraham.com/founders.html
I think the more powerful you become, the less endearing this trait is.
[0]: https://www.mercurynews.com/2021/07/20/how-the-doodle-god-un...
[1] https://variety.com/2024/digital/news/openai-pulls-scarlett-...
Stuff from her comes via press agents and is generally sent directly to reporters.
[1] https://www.lesswrong.com/posts/QDczBduZorG4dxZiW/sam-altman...
[2] https://www.hackingbutlegal.com/p/statement-by-annie-altman-...
https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
Bette Midler successfully sued Ford for imitating her voice in a commercial.
Then also:
https://casetext.com/case/waits-v-frito-lay-inc
Tom Waits successfully sued Frito Lay for using an imitator without approval in a radio commercial.
The key seems to be that if someone is famous and their voice is distinctly attributable to them, there is a case. In both of these cases, the artists in question were also solicited first and refused.
> In a novel case of voice theft, a Los Angeles federal court jury Tuesday awarded gravel-throated recording artist Tom Waits $2.475 million in damages from Frito-Lay Inc. and its advertising agency.
> The U.S. District Court jury found that the corn chip giant unlawfully appropriated Waits’ distinctive voice, tarring his reputation by employing an impersonator to record a radio ad for a new brand of spicy Doritos corn chips.
https://www.latimes.com/archives/la-xpm-1990-05-09-me-238-st...
And not see how over the top it is... c'mon.
OpenAI pulls Johansson soundalike Sky’s voice from ChatGPT - >>40414249 - May 2024 (96 comments)
If so, I suspect they’ll be okay in a court of law — having a voice similar to a celebrity isn’t illegal.
It’ll likely cheese off actors and performers though.
[1] https://www.forbes.com/sites/roberthart/2024/05/20/openai-sa...
It doesn't come down to similarity. There was nothing fair-use about this voice, and that's exactly why OpenAI yanked it and indirectly admitted to cloning her voice.
[0] >>38154733
https://www.hollywoodreporter.com/business/business-news/bac...
I'll ask the devil's advocate / contrarian question: How big a slice of the human voice space does Scarlett lay a claim to?
The evidence would be in her favor in a civil court case. OTOH, a less famous woman's claim that any given synthesized voice sounds like hers would probably fail.
Contrast this with copyrighted fiction. That space is dimensionally much bigger. If you're not deliberately trying to copy some work, it's very unlikely that you'll get in trouble accidentally.
The closest comparison is the Marvin Gaye estate's case. Arguably, the estate laid claim to a large fraction of what is otherwise a dimensionally large space. https://en.wikipedia.org/wiki/Pharrell_Williams_v._Bridgepor...
"when voice is sufficient indicia of a celebrity's identity, the right of publicity protects against its imitation for commercial purposes without the celebrity's consent."
it doesn't seem like principles should matter. but then the bill of rights doesn't seem like it should matter either if you were to cold read the constitution (you might be like - hmm, kinda seems important maybe...).
it compounds culturally over time though. principles ^ time = culture.
"Audacious, Thoughtful, Unpretentious, Impact-driven, Collaborative, and Growth-oriented."
https://archive.is/wLOfC#selection-1095.112-1095.200
maybe "thoughtful" was the closest (and sam is apologetic and regretful and transparent - kudos to him for that). but it's not that clear without a core principle around responsibility. you need that imho to avoid losing trust.
Why would people not want laws? The answer is so they can do the things that the laws prevent.
This is POSIWID territory [0]. "The purpose of a system is what it does". Not what it repeatedly fails to live up to.
What was the primary investment purpose of Uber? Not any of the things it will forever fail to turn a profit at. It was to destroy the regulations preventing companies like Uber from doing what they do. That is what it succeeded at.
The purpose of OpenAI is to minimise and denigrate the idea of individual human contributions.
[0] https://en.wikipedia.org/wiki/The_purpose_of_a_system_is_wha...
That said, Krazam covered this topic well already https://www.youtube.com/watch?v=KiPQdVC5RHU
For example, a car company approached the band Sigur Rós about including some of their music in a car commercial. Sigur Rós declined. A few months later the commercial aired with a song that sounds like an unreleased Sigur Rós song, but really they had just paid a composer to make something that sounds like Sigur Rós but isn't. So maybe OpenAI just had a random woman with a voice similar to Scarlett's do the recording.
Taking down the voice could just be concern for bad press, or trying to avoid lawsuits regardless of whether you think you are in the right or not. Per this* CNN article:
> Johansson said she hired legal counsel, and said OpenAI “reluctantly agreed” to take down the “Sky” voice after her counsel sent Altman two letters.
So, Johansson's lawyers probably said something like "I'll sue your pants off if you don't take it down". And then they took it down. You can't use that as evidence that they are guilty. It could just as easily be the case that they didn't want to go to court over this even if they thought they were legally above board.
* https://www.cnn.com/2024/05/20/tech/openai-pausing-flirty-ch...
Saw some other posters expressing this view who deleted their posts after getting downvoted, lol.
> But OpenAI's chief technology officer, Mira Murati, has said that GPT-4o's voice modes were less inspired by Her than by studying the "really natural, rich, and interactive" aspects of human conversation, The Wall Street Journal reported.
People made fun of Murati when she froze after being asked what Sora was trained on. But behavior like that indicates understanding that you could get the company sued if you said something incriminating. Altman just tweets through it.
[1] https://arstechnica.com/tech-policy/2024/05/openai-pauses-ch...
The fact that they reached out to her multiple times, plus Sam's "her" tweet insinuating it was supposed to sound like her, makes a pretty clear connection to her. Without that they'd probably be fine.
Bette Midler sued Ford under very similar circumstances and won.
There's not a lot of precedent around voice impersonation, but there is a very, very similar case against Ford
An issue with voice actors having their voices stolen by AI models/voice cloning tech is that they often have no legal standing: their performance is owned by their client, so they have no ownership of it. ScarJo may not have standing, depending on the contract (I suspect hers is much different than a typical VA's). It might have to be Annapurna Pictures that sues OpenAI instead.
Forbes had a good story about performer rights of voices: https://www.forbes.com/sites/rashishrivastava/2023/10/09/kee...
IANAL of course.
There is no doubt that the hired actor was an impersonator; this was explicitly stated by scama himself.
I mean, why not actually compare the voices before forming an opinion?
https://www.youtube.com/watch?v=SamGnUqaOfU
https://www.youtube.com/watch?v=vgYi3Wr7v_g
-----
"Sky’s voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using her own natural speaking voice."
https://openai.com/index/how-the-voices-for-chatgpt-were-cho...
https://nwn.blogs.com/nwn/2024/04/adam-schiff-ai-video-games...
You can cheer on "forces" like Uber all you like but I would prefer it if progress happened without criminal deception:
https://www.theguardian.com/news/2022/jul/10/uber-files-leak...
I don't see how anyone can read this and think the uber app is a net positive.
It’s not like Tom Waits ever wanted to hawk chips
https://www.latimes.com/archives/la-xpm-1990-05-09-me-238-st...
Bette Midler was able to sue Ford Motor Co. for damages after they hired a sound-alike voice: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co. Ford had acquired the rights to the song (which Midler didn't write).
They changed the voice to intone like Scarlett Johansson's character. It's like they changed the song the voice was singing to one that lots of people recognise.
http://law2.umkc.edu/faculty/projects/ftrials/communications...
1. https://apnews.com/article/hollywood-ai-strike-wga-artificia...
The really concerning part here is that Altman is, and wants to be, a large part of AI regulation [0]. Quite the public contradiction.
[0] https://www.businessinsider.com/sam-altman-openai-artificial...
If in fact, that was the case, then OpenAI is not aligned with the statement they just put out about having utmost focus on rigor and careful considerations, in particular this line: "We know we can't imagine every possible future scenario. So we need to have a very tight feedback loop, rigorous testing, careful consideration at every step, world-class security, and harmony of safety and capabilities." [0]
The replies to Altman's message showed readers did connect it to the film. And people noticed the voice sounded like Scarlett Johansson and connected it to the film when OpenAI introduced it in September.[1]
How do you believe Altman intended people to interpret his message?
[1] https://www.reddit.com/r/ChatGPT/comments/177v8wz/i_have_a_r...
OpenAI didn't just use a voice like Scarlett Johansson's. They used it in an AI system they wanted people to associate with AI from movies and the movie where Johansson played an AI particularly.[1][2]
They literally hired an impersonator, and it cost them $2.5 million (~$6 million today).
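(Rough sanity check on that figure, using approximate CPI numbers rather than anything from the article: the $2.475M awarded in 1990 times the roughly 2.4x CPI change from 1990 to 2024 is about $5.9M, so the "~$6 million today" estimate holds up.)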
https://www.latimes.com/archives/la-xpm-1990-05-09-me-238-st...
"…two separate people from my label told me that they had personally talked to Coolio… and that he told them that he was okay with the whole parody idea…Halfway into production, my record label told me that Coolio’s management had a problem with the parody, even though Coolio personally was okay with it. My label told me… they would iron things out — so I proceeded with the recording and finished the album."
https://www.vulture.com/2011/12/gangstas-parodist-revisiting...
Look kind of similar, right? Lots of familiar styling cues? What would take it from "similar" to actual infringement? Well, if you slapped an Apple logo on there, that would do it. Did OpenAI make an actual claim? Did they actually use Scarlett Johansson's public image and voice as sampling for the system?
[1] https://images.prismic.io/frameworkmarketplace/25c9a15f-4374...
[2] https://i.dell.com/is/image/DellContent/content/dam/ss2/prod...
[3] https://cdn.arstechnica.net/wp-content/uploads/2023/06/IMG_1...
https://podcasts.apple.com/us/podcast/an-ex-cia-officer-expl...
There's a fair amount of EA discussion of utilitarianism's problems. Here's EA founder Toby Ord on utilitarianism and why he ultimately doesn't endorse it:
https://forum.effectivealtruism.org/posts/YrXZ3pRvFuH8SJaay/...
>If Effective Altruists want to speed the adoption of AI with the general public, they’d do well to avoid talking about it, lest the general public make a connection between EA and AI
Very few in the EA community want to speed AI adoption. It's far more common to think that current AI companies are being reckless, and we need some sort of AI pause so we can do more research and ensure that AI systems are reliably beneficial.
>When they start talking about the utility of committing crimes or other moral wrongs because the ends justify the means, I tend to start assuming they’re bad at morality and ethics.
The all-time most upvoted post on the EA Forum condemns SBF: https://forum.effectivealtruism.org/allPosts?sortedBy=top&ti...
And here's some caselaw where another major corporation got smacked down for doing the exact same thing: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
But given how unscrupulous Sam Altman appears to be, I wouldn't be surprised if OpenAI hired an impersonator as some kind of half-assed legal cover, and went about using Johansson's voice anyway. Tech people do stupid shit sometimes because they assume they're so much cleverer than everyone else.
Yes, it will be interesting in June 1988 when we will find out "where this lands": https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
E.g. flying Congress to Lake Como for an off-the-record “discussion” https://freebeacon.com/politics/how-the-aspen-institute-help...
https://www.opensecrets.org/federal-lobbying/clients/summary...
It will come down to what makes the complaining celebrity's voice iconic, which for ScarJo is the 'gravelly' bit, and smooth Sky had none of that.
[1] actress reading poem: https://www.youtube.com/watch?v=eWEEAjRFJKc
Tech Company: At long last, we have created the Torment Nexus from classic sci-fi novel Don't Create The Torment Nexus
https://x.com/AlexBlechman/status/1457842724128833538?lang=e...
Assuming Sam Altman is not stupid, this could be part of some elaborate plan and a calculated strategy. The end goals could range from immediate practical outcomes like increased publicity (see ChatGPT's mobile app revenue doubled overnight: https://finance.yahoo.com/news/chatgpts-mobile-app-revenue-s...) and market impact, to more complex objectives like influencing future legal frameworks and societal norms around AI.
https://twitter.com/OpenAI/status/1790089521985466587
Giggly, flirty AI voice demos were already weird, but now it's even creepier knowing the backstory of how they try to get their voices.
ChatGPT using Sky voice (not 4o - original release): https://youtu.be/JmxjluHaePw?t=129
Samantha from "Her" (voiced by ScarJo): https://youtu.be/GV01B5kVsC0?t=134
Rashida Jones Talking about herself https://youtu.be/iP-sK9uAKkM
I challenge anyone to leave prejudice at the door by describing each voice in totality first and seeing if your descriptions overlap entirely with others. They each have an obvious, unique whispery and husky quality to them.
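If anyone wants to go beyond an ears-only comparison, here's a minimal sketch of how you could run that blind test numerically with speaker embeddings. This is my own illustration, not something from the thread: the filenames are hypothetical clips you'd save yourself, and it assumes the open-source resemblyzer package. A higher cosine similarity only means the embeddings are closer; it says nothing about what a court would consider an imitation.

    # Compare short clips of each voice via speaker embeddings (hypothetical
    # filenames; assumes `pip install resemblyzer numpy`).
    from pathlib import Path
    from itertools import combinations
    import numpy as np
    from resemblyzer import VoiceEncoder, preprocess_wav

    clips = {
        "sky": Path("sky.wav"),                # ChatGPT Sky clip
        "scarjo_her": Path("scarjo_her.wav"),  # Samantha from "Her"
        "rashida": Path("rashida.wav"),        # Rashida Jones interview
    }

    encoder = VoiceEncoder()
    # embed_utterance returns an L2-normalized 256-dim vector, so a plain dot
    # product between two embeddings is their cosine similarity.
    embeds = {name: encoder.embed_utterance(preprocess_wav(path))
              for name, path in clips.items()}

    for a, b in combinations(clips, 2):
        print(f"{a} vs {b}: cosine similarity = {np.dot(embeds[a], embeds[b]):.3f}")

Same caveat as with human listeners: the embedding model has its own biases, so treat the numbers as one more data point, not a verdict.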
"Each actor receives compensation above top-of-market rates, and this will continue for as long as their voices are used in our products."
https://openai.com/index/how-the-voices-for-chatgpt-were-cho...
Not sure if this is royalties, but it seems like there's some form of long-term compensation. It's a little vague, though.
This voice: https://x.com/OpenAI/status/1790072174117613963
To me it's about as close to her voice as saying "It's a woman's voice". Not to say all women sound alike, but the sound I heard from that video above could maybe best be described as "generic peppy female American spokesperson voice".
Even listening to it now with the suggestion that it might sound like her I don't personally hear Scarlett Johansson's voice from the demo.
There may be some damning proof where they find they sampled her specifically, but saying they negotiated and didn't come to an agreement is not proof that it's supposed to be her voice. Again, to me it just sounds like a generic voice. I've used the version before GPT-4o and I never got the vibe it was Scarlett Johansson.
I did get the "Her" vibe, but only because I was talking to a computer with a female voice and it was easy to imagine that something like "Her" was in the near future. I also imagined or wished that it was Majel Barrett from ST:TNG, if only because the computer on ST:TNG gave short and useful answers, whereas ChatGPT always gives long-winded, repetitive, annoying answers.
https://www.lesswrong.com/posts/QDczBduZorG4dxZiW/sam-altman...
So he's here to help regulate it all with an "international agency" (see the reference[2] by windexh8er in this thread)! Don't forget that Altman is the same hack who came up with "Worldcoin" and the so-called "Orb" that'll scan your eyeballs for "proof of personhood".
Is this sleazy marketer the one to be trusted to lead an effort that has a lasting impact on humanity? Hell no.
[1] >>38312294
[2] >>40423483
>The Ninth Circuit reversed the District Court, finding that White had a cause of action based on the value of her image, and that Samsung had appropriated this image. Samsung's assertion that this was a parody was found to be unavailing, as the intent of the ad was not to make fun of White's characteristics, but to sell VCRs.
https://en.wikipedia.org/wiki/White_v._Samsung_Electronics_A....
Maybe it depends on which court will handle the case, but OpenAI's core intent isn't parody, but rather to use someone's likeness as a way to make money.
(I am not a lawyer)
The court determined that Midler should be compensated for the misappropriation of her voice, holding that, when "a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California."
I hope there's going to be no further hypotheticals after this.
-----
>They're doing something with a voice that some claim sounds like hers.
Yes, that's what a likeness is.
If you start using your own paintings of Taylor Swift in a product without her permission, you'll run afoul of the law, even though your painting is obviously not the actual Taylor Swift, and you painted it from memory.
>But if it isn't, then it is more like selling a figurine called Sally that happens to look a lot like Taylor Swift. Sally has a right to exist even if she happens to look like Taylor Swift.
Sally has a right to exist, not the right to be distributed, sold, and otherwise used for commercial gain without Taylor Swift's permission.
California Civil Code Section 3344(a) states:
Any person who knowingly uses another’s name, voice, signature, photograph, or likeness, in any manner, on or in products, merchandise, or goods, or for purposes of advertising or selling, or soliciting purchases of, products, merchandise, goods or services, without such person’s prior consent, or, in the case of a minor, the prior consent of his parent or legal guardian, shall be liable for any damages sustained by the person or persons injured as a result thereof.
Note the word "likeness".
Read more at [1] on Common Law protections of identity.
>Has there ever been an up and coming artist who was not allowed to sell their own songs, because they happened to sound a lot like an already famous artist? I doubt it.
Wrong question.
Can you give me an example of an artist who was allowed to do a close-enough impersonation without explicit approval?
No? Well, now you know a good reason for that.
Tribute bands are legally in the grey area[2], for that matter.
[1] https://www.dmlp.org/legal-guide/california-right-publicity-...
[2] https://lawyerdrummer.com/2020/01/are-tribute-acts-actually-...
[3] https://repository.law.miami.edu/cgi/viewcontent.cgi?article...
https://www.quimbee.com/cases/waits-v-frito-lay-inc
https://en.wikipedia.org/wiki/White_v._Samsung_Electronics_A....
Hurray, OpenAI has found a new lucrative market. Horny incels.
[1] https://forums.theregister.com/forum/all/2024/05/21/scarlett...
It's funny that just seven days ago I was speculating that they deliberately picked someone whose voice is very close to Scarlett's and was told right here on HN, by someone who works in AI, that the Sky voice doesn't sound anything like Scarlett and it is just a generic female voice:
https://news.ycombinator.com/item?id=40343950#40345807
Apparently .... not.
Yes, I've listened to Altman. The most recent one is him waffling with a straight face about "Platonic ideals"[1] while sitting on a royal chair in Cambridge. As I noted here[2] six months ago, if he had truly read and digested Plato's works, he simply would not be the ruthless conman he is. Plato would be turning in his grave.
[1] https://www.youtube.com/watch?v=NjpNG0CJRMM&t=3632s
[2] >>38312875
For example, I would love to see all of the Bourne books adapted into live-action films, but I know that will be impossible. In the future, I believe it would be great to see AI actors who are not based on any famous actors/actresses perform the same screenplay, provided, of course, that the book is licensed for that AI movie.
He lies and steals much more than that. He’s the scammer behind Worldcoin.
https://www.technologyreview.com/2022/04/06/1048981/worldcoi...
https://www.buzzfeednews.com/article/richardnieva/worldcoin-...
> Altman is, and wants to be, a large part of AI regulation. Quite the public contradiction.
That’s as much of a contradiction as a thief wanting to be a large part of lock regulation. What better way to ensure your sleazy plans benefit you, and preferably only you but not the competition, than being an active participant in the inevitable regulation while it’s being written?
It’s both.
This isn’t even close to the most unethical thing he has done. This is peanuts compared to the Worldcoin scam.
https://nytco-assets.nytimes.com/2023/12/Lawsuit-Document-dk...
Other people have commented to further explain the point in other words. I recommend you read those, perhaps it’ll make you understand.
You can see another comment here, where I acknowledge I communicate badly, since I’ve had to clarify multiple times what I was intending: >>40424566
This is the paragraph that was intended to narrow what I was talking about:
> I will say, when EA are talking about where they want to donate their money with the most efficacy, I have no problem with it. When they start talking about the utility of committing crimes or other moral wrongs because the ends justify the means, I tend to start assuming they’re bad at morality and ethics.
That said, I definitely should’ve said “those Effective Altruists” in the first paragraph to more clearly communicate my intent.
Sam Pullara @sampullara
If you have the ChatGPT app, set it to the Sky voice and talk to it. It is definitely a clone of ScarJo from Her.
6:21 PM · Dec 13, 2023
https://x.com/sampullara/status/1735122897663094853
[0]: https://www.reddit.com/r/ChatGPT/comments/1cwy6wz/vocal_comp...
There is value in OpenAI; there are (a steadily shrinking number of) ethical pros there guilty of nothing worse than wanting to make a good living. Groups including but not limited to the voice group have done excellent, socially positive work.
But the leadership team is a menace (as I’ve been saying for years) and it’s just time for a clean sweep at the senior leadership level.
I’ve been such a vocal critic for so long that I’m always looking for an opportunity to present a balanced view whenever there’s a reasonable doubt.
Wrong again, me.
https://www.theatlantic.com/technology/archive/2024/05/opena...
Additionally, I think you're understating the similarity to the Midler v. Ford case. It has a similar pattern: they first contacted the "original", then hired an impersonator. The song wasn't at issue; they had a license to that part.
I think this part of some of the court documents[0] is especially relevant. Hedwig was the sound alike hired.
> Hedwig was told by Young & Rubicam that "they wanted someone who could sound like Bette Midler's recording of [Do You Want To Dance]." She was asked to make a "demo" tape of the song if she was interested. She made an a capella demo and got the job.
> At the direction of Young & Rubicam, Hedwig then made a record for the commercial. The Midler record of "Do You Want To Dance" was first played to her. She was told to "sound as much as possible like the Bette Midler record," leaving out only a few "aahs" unsuitable for the commercial. Hedwig imitated Midler to the best of her ability.
> After the commercial was aired Midler was told by "a number of people" that it "sounded exactly" like her record of "Do You Want To Dance." Hedwig was told by "many personal friends" that they thought it was Midler singing the commercial. Ken Fritz, a personal manager in the entertainment business not associated with Midler, declares by affidavit that he heard the commercial on more than one occasion and thought Midler was doing the singing.
> Neither the name nor the picture of Midler was used in the commercial; Young & Rubicam had a license from the copyright holder to use the song. At issue in this case is only the protection of Midler's voice.
So the fact that random internet commenters like us did not recognize her voice doesn't seem to matter in cases like these. It's enough that the sound-alike was told to mimic the original voice, and that people familiar with the voice were fooled.
[0] https://law.justia.com/cases/federal/appellate-courts/F2/849...