zlacker

[parent] [thread] 20 comments
1. Increa+(OP)[view] [source] 2024-05-20 23:59:08
I wouldn't necessarily call that damning. "Soundalikes" are very common in the ad industry.

For example, a car company approached the band Sigur Rós to include some of their music in a car commercial. Sigur Rós declined. A few months later the commercial aired with a song that sounded like an unreleased Sigur Rós song, but really they had just paid a composer to make something that sounds like Sigur Rós but isn't. So maybe OpenAI just had a random lady with a voice similar to Scarlett's do the recording.

Taking down the voice could just reflect concern about bad press, or a desire to avoid lawsuits regardless of whether you think you're in the right or not. Per this* CNN article:

> Johansson said she hired legal counsel, and said OpenAI “reluctantly agreed” to take down the “Sky” voice after her counsel sent Altman two letters.

So, Johansson's lawyers probably said something like "I'll sue your pants off if you don't take it down". And then they took it down. You can't use that as evidence that they are guilty. It could just as easily be the case that they didn't want to go to court over this even if they thought they were legally above board.

* https://www.cnn.com/2024/05/20/tech/openai-pausing-flirty-ch...

replies(4): >>dragon+G1 >>romwel+W1 >>spuz+h3 >>Projec+7g
2. dragon+G1[view] [source] 2024-05-21 00:11:24
>>Increa+(OP)
> I wouldn't necessarily call that damning. "Soundalikes" are very common in the ad industry.

As are disclaimers that celebrity voices are impersonated, when there is additional context that makes the voice likely to be taken as something more than a mere soundalike, such as a direct reference, in the same publicity campaign, to a work the impersonated celebrity was involved in.

And liability for commercial voice appropriation, even by impersonation, is established law in some jurisdictions, including California.

replies(1): >>Increa+28
3. romwel+W1[view] [source] 2024-05-21 00:12:46
>>Increa+(OP)
Sorry, that's an apples-to-pizzas comparison. You're conflating work and identity.

There's an ocean of difference between mimicking the style of someone's art in an original work, and literally cloning someone's likeness for marketing/business reasons.

You can hire someone to make art in the style of Taylor Swift; that's OK.

You can't start selling Taylor Swift figurines by the same principle.

What Sam Altman did, figuratively, was give out free T-shirts featuring a face that anyone who knows Taylor Swift would recognize as her.

replies(1): >>Increa+j7
4. spuz+h3[view] [source] 2024-05-21 00:21:35
>>Increa+(OP)
The damning part is that they tried to contact her and get her to reconsider their offer only two days before the model was demoed. That tells you that, at the very least, they felt either a moral or a legal obligation to get her to agree to their release of the model.
replies(1): >>Increa+S3
5. Increa+S3[view] [source] [discussion] 2024-05-21 00:25:01
>>spuz+h3
Or, they wanted to be able to say, yes, that is "her" talking to you.

I have no idea if they really used her voice, or if it is a voice that just sounds like her to some. I'm just saying OpenAI's behavior isn't a smoking gun.

replies(1): >>johnny+GH
6. Increa+j7[view] [source] [discussion] 2024-05-21 00:45:37
>>romwel+W1
But they aren't doing anything with her voice (allegedly?). They're doing something with a voice that some claim sounds like hers.

But if it isn't, then it is more like selling a figurine called Sally that happens to look a lot like Taylor Swift. Sally has a right to exist even if she happens to look like Taylor Swift.

Has there ever been an up-and-coming artist who was not allowed to sell their own songs because they happened to sound a lot like an already famous artist? I doubt it.

replies(2): >>airstr+wd >>romwel+jW
7. Increa+28[view] [source] [discussion] 2024-05-21 00:50:44
>>dragon+G1
The most famous case of voice appropriation was Midler vs Ford, which involved Ford paying a Midler impersonator to perform a well-known Midler song, creating the impression that it was actually Bette.

Where are the signs or symbols tying Scarlett to the OpenAI voice? I don't think a single-word, contextless message on a separate platform, which 99% of OpenAI users will never see, is significant enough to form that connection in users' heads.

replies(1): >>pseuda+tm
8. airstr+wd[view] [source] [discussion] 2024-05-21 01:35:01
>>Increa+j7
The detail you're missing is that those who claim it sounds like "her" include the CEO of the company.
9. Projec+7g[view] [source] 2024-05-21 01:57:19
>>Increa+(OP)
Case law says no.

There have been several legal cases where bands have sued advertisers for copying their distinct sound. Here are a few examples:

The Beatles vs. Nike (1987): The Beatles' company, Apple Corps, sued Nike and Capitol Records for using the song "Revolution" in a commercial without their permission. The case was settled out of court.

Tom Waits vs. Frito-Lay (1988): Tom Waits sued Frito-Lay for using a sound-alike in a commercial for their Doritos chips. Waits won the case, with the court emphasizing the protection of his distinctive voice and style.

Bette Midler vs. Ford Motor Company (1988): Although not a band, Bette Midler successfully sued Ford for using a sound-alike to imitate her voice in a commercial. The court ruled in her favor, recognizing the uniqueness of her voice.

The Black Keys vs. Pizza Hut and Home Depot (2012): The Black Keys sued both companies for using music in their advertisements that sounded remarkably similar to their songs. The cases were settled out of court.

Beastie Boys vs. Monster Energy (2014): The Beastie Boys sued Monster Energy for using their music in a promotional video without permission. The court awarded the band $1.7 million in damages.

replies(2): >>mycolo+Nm >>Increa+mn
10. pseuda+tm[view] [source] [discussion] 2024-05-21 03:05:34
>>Increa+28
The Midler v. Ford decision said her voice was distinctive. Not the song.

The replies to Altman's message showed that readers connected it to the film. And people noticed that the voice sounded like Scarlett Johansson, and connected it to the film, when OpenAI introduced it in September.[1]

How do you believe Altman intended people to interpret his message?

[1] https://www.reddit.com/r/ChatGPT/comments/177v8wz/i_have_a_r...

11. mycolo+Nm[view] [source] [discussion] 2024-05-21 03:09:12
>>Projec+7g
Sucks that he had to do it, but the notion of Tom Waits making Rain Dogs and then pivoting to spending a bunch of time thinking about Doritos must be one of the funnier quirks of music history.
12. Increa+mn[view] [source] [discussion] 2024-05-21 03:13:37
>>Projec+7g
Disregarding the cases settled out of court (which have nothing to do with case law):

1) Tom Waits vs Frito-Lay: Frito-Lay not only used a Tom Waits soundalike, but the song they created was extremely reminiscent of Waits's "Step Right Up".

2) Bette Midler vs. Ford Motor Company: Same thing - this time Ford literally had a very Midler-esque singer sing an exact Midler song.

3) Beastie Boys vs. Monster Energy: Monster literally used the Beastie Boys' music, because someone said "Dope!" when watching the ad and someone at Monster took that to mean "Yes you can use our music in the ad".

Does Scarlett Johansson have a distinct enough voice that she is instantly recognizable? Maybe, but, well, not to me. I had no clue the voice was supposed to be Scarlett's, and I think a lot of people who heard it didn't either.

replies(4): >>Cheer2+jo >>pseuda+Lw >>parpfi+Bz >>eaglef+I84
13. Cheer2+jo[view] [source] [discussion] 2024-05-21 03:21:19
>>Increa+mn
> Does Scarlett Johansson have a distinct enough voice that she is instantly recognizable?

It absolutely is recognizable if you've seen /Her/. It even nails her character's borderline-flirty cadence and tone in the film.

replies(2): >>underl+9s >>Y_Y+w91
14. underl+9s[view] [source] [discussion] 2024-05-21 04:01:53
>>Cheer2+jo
Actually, until recently I thought the voice actor for "her" was Rashida Jones.
replies(1): >>locuso+qy
15. pseuda+Lw[view] [source] [discussion] 2024-05-21 04:49:07
>>Increa+mn
The Midler v. Ford decision said her voice was distinctive. Not the song.
16. locuso+qy[view] [source] [discussion] 2024-05-21 05:06:10
>>underl+9s
I definitely thought "Sky" was Rashida Jones. I still do.
17. parpfi+Bz[view] [source] [discussion] 2024-05-21 05:17:46
>>Increa+mn
And in an interesting coincidence: ScarJo recorded a Tom Waits cover album in 2008.
18. johnny+GH[view] [source] [discussion] 2024-05-21 06:45:24
>>Increa+S3
> a reference to an object or fact that serves as conclusive evidence of a crime or similar act, just short of being caught in flagrante delicto.

If this isn't a smoking gun, I don't know what is.

I think people forget the last part of the definition, though. A smoking gun is about as close as you get without having objective, non-doctored footage of the act. There's a small chance the gun is a red herring, but it's still suspicious.

19. romwel+jW[view] [source] [discussion] 2024-05-21 09:04:13
>>Increa+j7
TL;DR: This question had already been settled in 2001 [3]:

The court determined that Midler should be compensated for the misappropriation of her voice, holding that, when "a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California."

I hope there will be no further hypotheticals after this.

-----

>They're doing something with a voice that some claim sounds like hers.

Yes, that's what a likeness is.

If you start using your own paintings of Taylor Swift in a product without her permission, you'll run afoul of the law, even though your painting is obviously not the actual Taylor Swift, and you painted it from memory.

>But if it isn't, then it is more like selling a figurine called Sally that happens to look a lot like Taylor Swift. Sally has a right to exist even if she happens to look like Taylor Swift.

Sally has a right to exist, not the right to be distributed, sold, and otherwise used for commercial gain without Taylor Swift's permission.

California Civil Code Section 3344(a) states:

Any person who knowingly uses another’s name, voice, signature, photograph, or likeness, in any manner, on or in products, merchandise, or goods, or for purposes of advertising or selling, or soliciting purchases of, products, merchandise, goods or services, without such person’s prior consent, or, in the case of a minor, the prior consent of his parent or legal guardian, shall be liable for any damages sustained by the person or persons injured as a result thereof.

Note the word "likeness".

Read more at [1] on Common Law protections of identity.

>Has there ever been an up and coming artist who was not allowed to sell their own songs, because they happened to sound a lot like an already famous artist? I doubt it.

Wrong question.

Can you give me an example of an artist who was allowed to do a close-enough impersonation without explicit approval?

No? Well, now you know a good reason for that.

Tribute bands are legally in a grey area [2], for that matter.

[1] https://www.dmlp.org/legal-guide/california-right-publicity-...

[2] https://lawyerdrummer.com/2020/01/are-tribute-acts-actually-...

[3] https://repository.law.miami.edu/cgi/viewcontent.cgi?article...

20. Y_Y+w91[view] [source] [discussion] 2024-05-21 10:47:38
>>Cheer2+jo
I've seen Her and the similarity of the voice didn't occur to me until I read about it. I guess it wasn't super distinct in the movie. Maybe if they'd had Christopher Walken or Shakira or someone with a really distinctive sound, it would have been more memorable and noticeable to me.
21. eaglef+I84[view] [source] [discussion] 2024-05-22 07:07:42
>>Increa+mn
Obviously it'll be up to some court to decide, but I don't think OpenAI did themselves any favours in a possible voice-protection lawsuit when Sam Altman tweeted out "her". That makes it seem very much like the imitation was intentional.

Additionally, I think you're understating the similarity to the Midler v Ford case. It follows a similar pattern: they first contacted the "original", then hired an impersonator. The song wasn't at issue; they had a license for that part.

I think this part of the court documents[0] is especially relevant. Hedwig was the soundalike they hired.

> Hedwig was told by Young & Rubicam that "they wanted someone who could sound like Bette Midler's recording of [Do You Want To Dance]." She was asked to make a "demo" tape of the song if she was interested. She made an a capella demo and got the job.

> At the direction of Young & Rubicam, Hedwig then made a record for the commercial. The Midler record of "Do You Want To Dance" was first played to her. She was told to "sound as much as possible like the Bette Midler record," leaving out only a few "aahs" unsuitable for the commercial. Hedwig imitated Midler to the best of her ability.

> After the commercial was aired Midler was told by "a number of people" that it "sounded exactly" like her record of "Do You Want To Dance." Hedwig was told by "many personal friends" that they thought it was Midler singing the commercial. Ken Fritz, a personal manager in the entertainment business not associated with Midler, declares by affidavit that he heard the commercial on more than one occasion and thought Midler was doing the singing.

> Neither the name nor the picture of Midler was used in the commercial; Young & Rubicam had a license from the copyright holder to use the song. At issue in this case is only the protection of Midler's voice.

So the fact that we random internet commenters did not recognize her voice doesn't seem to matter in cases like these. It's enough that the soundalike was told to mimic the original voice, and that people familiar with the voice were fooled.

[0] https://law.justia.com/cases/federal/appellate-courts/F2/849...
