Altman appears to be a habitual liar. Note his recent claim not to be aware of the non-disparagement and claw-back terms he had departing employees agree to. Are we supposed to believe that the company lawyer or head of HR did this without consulting (or more likely being instructed by) the co-founder and CEO?!
Some of it can also be attributed to ideological reasons, the d/acc crowd for example. Please note I am not attacking any individual poster, but speculating on the reasons why someone might refuse to acknowledge the truth, even when presented with evidence to the contrary.
The objective parts of this are disproved in several ways by the very article under which we're commenting. The subjective parts are... subjective, but arguably demonstrated as false in this very thread, through side-by-side listening examples of SJ vs. Sky.
> such that the movie Her is referenced by Altman
You're creating a causal connection without proof of one. We don't know why Altman referenced "Her", but I feel it's more likely because the product works in a way eerily similar to the movie's AI, not because it sounds like her.
> and why is that actress upset?
Who knows? Celebrities sue individuals and companies all the time. Sometimes for a reason, sometimes to just generate drama (and capitalize on it).
Likewise, if someone’s attitude is - “OK, maybe there’s no paper trail, but I’m sure this is what the people were thinking”, then you’ve made an accusation that simply can’t be refuted, no matter how much evidence gets presented.
Since they withdrew the voice this will end, but if OpenAI hadn't backed off and ScarJo sued, there would be discovery, and we'd find out what her instructions were. If those instructions were "try to sound like the AI in the film Her", that would be enough for ScarJo to win.
I know that the Post article claims otherwise. I'm skeptical.
It seems she has every reason to benefit from claiming Sky sounded like her even if it was a coincidence. "Go away" payments are very common, even for celebrities - and OpenAI has deep pockets...
Even so, if they got a voice actor to impersonate or sound similar to Johansson, is that something that's not allowed?
A lot of the argument here comes down to whether the article does refute that. I don't believe it does.
What it refutes is the accusation that they hired someone who sounds like Johansson after she told them she would not do it herself. That was certainly a more damning accusation, but it's not an identical one.
But in my view, it requires a pretty absurd level of benefit of the doubt to think that they didn't set out to make a voice that sounds like the one from the movie.
Maybe good for them that they felt icky about it, and tried to get her for real instead, but she said no, and they didn't feel icky enough about it to change the plan.
Do you believe the article "refutes" that? Does it truly not strike you as a likely scenario, given what is known, both before and after this reporting?
Are you saying that story is false?
If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?
- It's fine to hire a voice actor.
- It's fine to train a system to sound like that voice actor.
- It's fine to hire a voice actor who sounds like someone else.
- It's probably fine to go out of your way to hire a voice actor who sounds like someone else.
- It's probably not fine to hire a voice actor and tell them to imitate someone else.
- It's very likely not fine to market your AI as "sounds like Jane Doe, who sounds like SJ".
- It's definitely not fine to market your AI as "sounds like SJ".
Say I wanted to make my AI voice sound like Patrick Stewart. Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising. If so, would it have been OK for OpenAI to do all this as long as they didn't mention SJ? Or is SJ so clearly identifiable with her role in "Her" that it's never OK to try to make a product like "Her" that sounds like SJ?
It clearly refutes the claims that they got a Johansson impersonator. The article says this is a voice actress, speaking in her normal voice, who wasn’t told to mimic Johansson at all. You can say that you personally think she was chosen because people thought she sounded similar to Johansson, even though there’s no evidence for that at this point. But the claim - which was made several times in discussions on here before - that she is a Johansson impersonator is factually incorrect.
> But in my view, it requires a pretty absurd level of benefit of the doubt to think that they didn't set out to make a voice that sounds like the one from the movie.
I tried it several times in the past and never once thought it sounded like Johansson. When this controversy came out I looked at videos of Her, because I thought Johansson could have been using a different voice in that movie, but no - the voice in Her is immediately recognizable as Johansson’s. Some have said Sky’s was much closer to Rashida Jones, and I agree, though I don’t know how close.
Pretty sure this is fine, otherwise cartoons like The Simpsons or South Park would've gotten in trouble years ago.
This particular area of law or even just type of "fairness" is by necessity very muddy, there isn't a set of well-defined rules you can follow that will guarantee an outcome where everyone is happy no matter what, sometimes you have to step back and evaluate how people feel about things at various steps along the way.
I'd speculate that OAI's attempts to reach out to SJ are probably the result of those evaluations - "this seems like it could make her people upset, so maybe we should pay her to not be mad?"
Thanks to Sam, this OpenAI case is clearer than others, since he created a good deal of clear evidence against himself.
The dispute is instead about statements just like your "We don't know why Altman referenced 'Her'", which, on the one hand, you're right, the mind of another person is technically unknowable, but on the other hand, no, that's total nonsense, we do indeed know exactly why he referenced the movie, because we're social animals and we absolutely are frequently capable of reasoning out other people's motivations and intentions.
This is not a court of law, we don't have a responsibility to suspend disbelief unless and until we see a piece of paper that says "I did this thing for this reason", we are free to look at a pattern of behavior and draw obvious conclusions.
Indeed, if it were a court of law, that's still exactly what we'd be asked to do. Intent matters, and people usually don't spell it out in a memo, so people are asked to look at a pattern of behavior in context and use their judgement to determine what they think it demonstrates.
After "she" said no, then Altman auditions a bunch of voice talent and picks someone who sounds just like SJ. I guess that's just the kind of voice he likes.
So, Altman has forgotten about SJ, has 5 voice talents in the bag, and is good to go, right? But then, 2 days(!) before the release he calls SJ again, asking her to reconsider (getting nervous about what he's about to release, perhaps?).
But still, maybe we should give Altman the benefit of the doubt, and assume he wanted SJ so badly because he had a crush on her or something?
Then on release day, Altman tweets "her", and reveals a demo not of a sober AI assistant with a voice interface, but of a cringe-inducing AI girlfriend trying to be flirty and emotional. He could have picked any of the five voices for the demo, but you know ...
But as you say, he's not admitting anything. When he tweeted "her" maybe it was because he saw the movie for the first time the night before?
I think many people distrust and dislike Altman and Musk in particular because of their own specific behavior.
Some people are hated because people are just jealous, but other people use that as an umbrella excuse to deflect blowback from their behavior that they entirely deserve. I believe this is one of those latter cases.
I'll defer to a judge and jury about the legalities. As you noted, Sam gave them a lot of help.
Let's note that OpenAI didn't release the names of the voice talent since they said they wanted to protect their privacy...
So, how do you think the reporter managed to get not only the identity, but also the audition tape from "Sky"? Detective work?
An interesting twist here is that WashPo is owned by Bezos, who via Amazon are backing Anthropic. I wonder how pleased he is about this piece of "investigative reporting"?
But if it does look very much like her, it doesn't really matter whether you never intended to.
According to the article, the word "After" here is incorrect. It states the voice actor was hired months before the first contact with SJ. They might be lying, but when they hired the voice actor seems like it would be a verifiable fact that has contracts and other documentation.
And like others, not defending OpenAI, but that timeline does tend to break the narrative you put forth in this post.
Whether or not they had any interest in SJ’s voice when they hired the other actor, they clearly developed such an interest before they went to market, and there is at least an evidence-based argument that could be made in court that they did, in fact, commercially leverage similarity.
All I'm saying in the comment you are replying to is that it's incorrect to claim that Altman said "the goal was specifically to copy 'Her'".
Moreover I can't see any reasonable person concluding that they were not trying to imitate her voice given that:
1. It sounds similar to her (it's unbelievable that anyone would argue that they aren't similar, more so given #2).
2. Her voice is famous for the context in which synthetic voice is used
3. They contacted her at some point to get her permission to use her voice
4. The CEO referenced the movie which Johansson's voice is famous for (and again depicts the same context the synthetic voice is being used) shortly before they released the synthetic voice.
Not necessarily, when you're hiring them because they sound like someone else—especially someone else who has said that they don't want to work with you. OpenAI took enough steps to show they wanted someone who sounded like SJ.
> Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising.
See https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co. and also Tom Waits vs. Frito-Lay.
> as long as they didn't mention SJ
Or tried to hire SJ repeatedly, even as late as 2 days before the launch.
There were some claims by some people when the issue first arose that they had specifically done a deepfake clone of SJ’s voice; probably because of the combination of apparent trading on the similarity and the nature of OpenAI’s business. That’s not the case as far as the mechanism by which the voice was produced.
I think the most plausible thing that happened is that they thought "hey it would be so awesome to have an AI voice companion like the one in Her, and we can totally do that with these new models", and then auditioned and hired someone that sounded like that.
Does it not fit the definition of "impersonator", since they didn't explicitly tell the person they hired to impersonate the voice from the movie? Sure, fine, I guess I'll give it to you.
But it doesn't refute "they wanted to use a voice that sounded like the one in Her", and there are a number of indications that this was indeed the case.
You can make a Saturday Night Live sketch making fun of Darth Vader.
You cannot use a Darth Vader imitator to sell light sabers.
But the tiny sliver of disagreement I have with "this is a bad thing to discuss here and we should all feel bad" is that some people who frequent this site are sometimes some of the people involved in making decisions that might lead to threads like this. And it might be nice for those people to have read some comments here that push back on the narrative that it's actually fine to do stuff like this, especially if it's legal (but maybe also if it isn't, sometimes?).
The way I see this particular discussion is: outside the tech bubble, regardless of the new facts in this article, people see yet another big name tech leader doing yet another unrelatable and clearly sleazy thing. Then what I see when I come to the thread is quite a few of the tech people who frequent this site being like "I don't get it, what's the problem?" or "this article totally refutes all of the things people think are a problem with this". And I feel like it's worth saying: no, get out of the bubble!
That in no way implies that people who have worked with these people or know them socially will agree with my assessments of them. People just assess things differently from one another. We all care about and prioritize different things.
(For what it's worth, it's possible that I also know a few people who have worked for or with Musk, and have incorporated the nuances of their views into mine, to some extent...)
The hilarious thing about this line of reasoning is that I am saying "that thing he said is not acceptable to me" and people like you say "that thing you said about that thing he said is not acceptable to me". The pattern here is "saying X is not acceptable to me", just different values of X. If what I'm doing is policing his speech, and that isn't acceptable to you, then what you're doing is policing my speech, which also shouldn't be acceptable to you.
A paradox? No, not at all, because the resolution of this paradox is just: Nobody here is policing anybody's speech. Everyone is just expressing their own opinions, in a completely normal way.
I'm free to care about how people behave, and not just what they accomplish. And you're free both to not care about that, and also to judge people who do care.
All of these streets go two ways!
You can hear them here:
Midler version https://www.youtube.com/watch?v=WFVhL0jbutU&t=22s
Car ad https://youtu.be/hxShNrpdVRs
As for your second question, yes. Otherwise you have a perfect workaround that would mean a person's likeness is a free-for-all to use, but we already decided that is not acceptable.
It protects celebrities who rely on endorsements and “who they are” for income.
It very clearly prohibits copycats with near-likeness as a workaround to getting permission from a celebrity.
OpenAI asked SJ to use her voice. That right there helps her case immensely.
She said no. They went ahead anyway, presumably with someone (or several someones) with a similar voice.
They publicized the product by referencing SJ.
These facts are damning.
They might be just a part of the story. Maybe 100 actresses, all sounding roughly the same, were given the offer over a two year period.
Maybe they all were given the same praise. Maybe one other, who signed an agreement, was praised on social media much more.
But this isn’t a slippery slope or a grey area. SJ was asked and said no.
That prohibits using a similar sounding copycat and publicizing as SJ.
Your opinion may vary, but they don't sound alike to me: >>40435695
This very well could be a contractual obligation.
Her, being the voice SJ did for the movie, not SJ's conversational voice which is somewhat different.
If OpenAI were smart, they did it in a Chinese-wall manner and looked for someone whose voice sounded like the movie without involving SJ's voice in the discussion.
Unless & until some third shoe drops, what we know now strongly --- overwhelmingly, really --- suggests that there was simply no story here. But we are all biased towards there being an interesting story behind everything, especially when it ratifies our casting of good guys and bad guys.
These don't have to be related. Maybe they are, but the confidence that they are is silly, since having a big celebrity name as the voice is a desirable thing, great marketing, especially one that did the voice acting for a movie about AI. My mind was completely changed when I actually listened to the voice comparison for myself [1].
[1] >>40435695
They could have taken auditions from 50 voice actors, come across this one, thought to themselves "Hey, this sounds just like the actress in 'Her', great, let's use them" and that would be fine. Laurence Fishburne does not own his "welcome to the desert of the real" intonation; other people have it too, and they can be hired to read in it.
Again: the Post has this voice actor reading in the normal voice. This wasn't an impersonator.
Then I'd say you have a point. But given all the other info, I'd have to say you're in denial.
Correct, that is not allowed in the US.
No.
The fact that it sounds very much like her and it is for a virtual assistant that clearly draws a parallel to the virtual assistant voiced by SJ in the movie (and it was not a protected use like parody) makes it not OK and not legal.
> Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising.
Nope.
If you made it sound identical to Patrick Stewart that would also likely not be OK or legal since his voice and mannerisms are very distinctive.
If you made it sound kind of like Patrick Stewart that is where things get really grey and it is probably allowed (but if you're doing other things to draw parallels to Patrick Stewart / Star Trek / Picard then that'd make your case worse).
And the law deals with grey areas all the time. You can drag experts in voice and language into the court to testify as to how similar or dissimilar the two voices and mannerisms are. It doesn't nullify the law that there's a grey area that needs to get litigated over.
The things that make this case a slam dunk are that there's the relevant movie, plus there's the previous contact to SJ, plus there's the tweet with "Her" and supporting tweets clearly conflating the two together. You don't even really need the expert witnesses in this case because the behavior was so blatant.
And remember that you're not asking a computer program to analyze two recordings and determine similarity or dissimilarity in isolation. You're asking a judge to determine if someone was ripping off someone else's likeness for commercial purposes, and that judge will absolutely use everything they've learned about human behavior in their lifetime to weigh what they think was actually going on, including all the surrounding human context to the two voices in question.
Except that is simply not true. If their intent was to sound like Her, and then they chose someone who sounds like Her, then they're in trouble.
Then there's no actual confusion.
You can use impersonators for parody, but not for selling products.
You agree that OpenAI is choosing the voice because it sounds like SJ. How exactly is that different from impersonation?
You don't know that's what happened, but it wouldn't matter either way. Regardless: it is misleading to call that person an "impersonator". I'm confident they don't wake up the morning and think to themselves "I'm performing SJ" when they order their latte.
Clearly a lot of people (including her "closest friends") find the ChatGPT demo to have been very similar to SJ/"her", which isn't to deny that the reporter was fed some (performance-wise) flat snippets from the voice actor's audition tape that sounded like flat sections of the ChatGPT demo. It'd be interesting to hear an in-depth comparison from a vocal expert, but it seems we're unlikely to get that.
The key here is intent. If there was no intention for OpenAI to model the voice after the character Samantha, then you're right, there's no foul.
But as I have explained to you elsewhere, that beggars belief.
We will see the truth when the internal emails come out.
There are multiple parts to the voice performance of ChatGPT - the voice (vocal traits including baseline pronunciation) plus the dynamic manipulation of synthesized intonation/prosody for emotion/etc, plus the flirty persona (outside of vocal performance) they gave the assistant.
The fact that the baseline speaking voice of the audition tape matches baseline of ChatGPT-4o only shows that the underlying voice was (at least in part, maybe in whole) from the actress. However, the legal case is that OpenAI deliberately tried to copy SJ's "her" performance, and given her own close friends noting the similarity, they seem to have succeeded, regardless of how much of that is due to having chosen a baseline sound-alike (or not!) voice actress.
But I wish you'd made your actual point, instead of asking this vague question. I don't want to guess what point this question is a setup to, but who knows if I'll ever make it back to this thread to find out what point you're going to make.
Actually it only says they reviewed "brief recordings of her initial voice test", which I assume refers to the voice test she did for OpenAI.
The "impersonating SJ" thing seems a straw man someone made up. The OpenAI talent call was for "warm, engaging, charismatic" voices sounding like 25-45 yr old (I assume SJ would have qualified, given that Altman specifically wanted her). They reviewed 400 applicants meeting this filtering criteria, and it seems threw away 395 of the ones that didn't remind Altman of SJ. It's a bit like natural selection and survival of the fittest. Take 400 giraffes, kill the 395 shortest ones, and the rest will all be tall. Go figure.
She doesn't need "go away" payments, and in any case that is not what we're looking at here. OpenAI offered her money to take the part, and she said no.
According to celebrity net worth website, SJ is worth $165M.
He probably just really wants to actually have SJ's voice from the film. But SJ doesn't really have a right to arbitrary West Coast vocal-fry female voices. Without the "disembodied voice of an AI" element, I don't think most people would note "Oh, she sounds like SJ". In fact, that's what the voice actress herself has said -- she hadn't gotten compared to SJ before this.
If I had to guess the best faith order of events (more than what OpenAi deserves):
- someone liked Her (clearly)
- they got a voice that sounded like Her, subconsciously (this is fine)
- someone high up hears it and thinks "wow this sounds like SJ!" (again, fine)
- they think "hey, we have money. Why not get THE SJ?!"
- they contact SJ, she refuses and they realize money isn't enough (still fine. But there is definitely some schadenfreude here)
- marketing starts semi-independently, and they make references to Her, because famous AI voice (here's where the cracks start to form. Sadly the marketer may not have even realized what talks went on).
- someone at OpenAI makes one last hail-Mary before the release and contacts SJ again (this is where the trouble starts. MAYBE they didn't know about SJ refusing, but someone in the pipeline should have)
- Altman, who definitely should have been aware of these contacts, makes that tweet. Maybe they forgot, maybe they didn't realize the implications. But the lawyer's room is now on fire
So yeah, Hanlon's razor. This could be a good-faith mistake, but OpenAI had done a good job of ruining their goodwill even before this PR disaster. Again, sweet schadenfreude even if we assume none of this was intentional.
Taking the recent screwup out of account... It's tough. A commercial product shouldn't try to associate with another brand. But if we're being realistic: "Her" is nearly uncopyrightable.
Since Her isn't a tech brand, it would be hard for some future company to get in trouble based on that association alone. Kind of like how Skynet in theory could have been taken by a legitimate tech company and the Terminator IP owner would struggle to seek reparations (to satisfy curiosity, Skynet is a U.S. government program, so that's already taken care of).
As long as you don't leave a trail, you can probably get away with copying Stewart. But if you start making Star Trek references (even if you never contacted Stewart), you're stepping in hot water.
Even if Altman was a good person, they are the face of a company that is taking some very suspicious actions. Actions that have gotten the company cooked in litigation. So those consequences will associate with that face: consequences for not following robots.txt, for trying to ask forgiveness over permission against other large companies, and now this whole kerfuffle.
I don’t believe the burden would be to prove that the voice actor was impersonating, but that she was misappropriating. Walking down the street sounding like Bette Midler isn’t a problem but covering her song with an approximation of her voice is.
You are dead right that the order of operations recently uncovered precludes misappropriation. But it’s an interesting situation otherwise, hypothetically, to wonder if using SJ’s voice to “cover” her performance as the AI in the movie would be misappropriation.
I was just reading the comments after reading the article to see if anything new came up, and was pretty appalled at the quality of commentary here. I'm not participating in the thread more because it's not worth it.
> But the tiny sliver of disagreement I have with "this is a bad thing to discuss here and we should all feel bad" is that some people who frequent this site are sometimes some of the people involved in making decisions that might lead to threads like this. And it might be nice for those people to have read some comments here that push back on the narrative that it's actually fine to do stuff like this, especially if it's legal (but maybe also if it isn't, sometimes?).
I've been involved in big decisions in other Big Tech companies. I'm proud of having fought to preserve Tor access to our offerings because I believe in Tor despite the spam and attacks it brings. I don't know about other folks in these positions, but if I were to read discussion like this, I'd roll my eyes and close the thread. If a random incoherent drunk ranter told me something was wrong with my ideas, I'd dismiss them without much hesitation.
> The way I see this particular discussion is: outside the tech bubble, regardless of the new facts in this article, people see yet another big name tech leader doing yet another unrelatable and clearly sleazy thing.
Because journalists know there is anti-tech sentiment among a segment of the population and so they stoke it. I don't know that much about this case, but a different story I've been following for a while now is the California Forever creation of a new city adjacent to the Bay Area. Pretty much every article written about it calls the city a "libertarian city" or "libertarian, billionaire dream". I'm involved in local planning conversations. I've read over their proposals and communications. They never, ever, mention anything about libertarianism. They're not proposing anything libertarian. They're working with existing incorporation laws; they're literally acting as a real-estate developer the same as any other suburban tract developer anywhere else in the US. But the press, desperate to get clicks on the story, bills it as some "libertarian city".
This "bubble" that you speak of is literally just a bubble created by journalists. I'm not saying that tech hasn't created some new, big, real problems nor that we shouldn't discuss these problems, but we need to recognize low-effort clickbait where we see it. This [1,2] article and thread talks about the reasons why, and it's not simple or straightforward, but at this point I consider most (not all) tech journalism to basically be tabloid journalism. It's meant specifically to drive clicks.
The only silly thing is some folks on HN think this site is somehow more high-brow than some general social media conversation on the news. It's the same social media as everywhere else, it's just more likely that the person talking is a software nerd, so the clickbait they fall for is different. My comment is my attempt as a community member to remind us to strive for something better. If we want to be more than just another social media site then we need to act like it. That means reading articles and not reacting to headlines, having good-faith conversations and not bringing strong priors into the conversation, and actually responding to the best interpretations of our peers' comments not just dunking on them.
[1]: https://asteriskmag.com/issues/06/debugging-tech-journalism
[2]: >>40201818
But you more or less drain that good faith when you are caught with your pants down and decide instead to double down. So I was pretty much against OpenAI ever since the whole "paying for training data is expensive" response during the NYT trials.
----
In general, the populace can be pretty unforgiving (sometimes justified, sometimes not). It really only takes one PR blunder to tank that good faith. And much longer to restore it.
I don't think that follows. It's entirely possible that OpenAI wanted to get ScarJo, but believed that simply wasn't possible so went with a second choice. Later they decided they might as well try anyway.
This scenario does not seem implausible in the least.
Remember, Sam Altman has stated that "Her" is his favorite movie. It's inconceivable that he never considered marketing his very similar product using the film's IP.
I'm loath to reply because I know you don't want to engage anymore, but I think it's fair to reply to your reply on this point:
I'm sure you're right that the people involved in this will be defensive and roll their eyes, and I'm sympathetic to that human reaction, but it's also why society at large will continue along this path of thinking we suck.
If we roll our eyes at their legitimate criticism of all this sleazy stuff that is going on, then they're just right to criticize us.
And sure, "we shouldn't care if people at large think we suck because we roll our eyes at their criticism of our sleaziness" is a totally valid response to that. But I'm certainly not going to take up our cause against the inevitable backlash, in that case.
> This "bubble" that you speak of is literally just a bubble created by journalists.
I don't think so. I'm honestly sympathetic to how you've become convinced of that by the California Forever thing, which I agree has gotten a raw deal in the press. But I think this tech / SV / HN bubble is nonetheless a real thing. I work inside that bubble but live outside it. I spend a decent amount of time during my days reading and (often foolishly, like today) commenting on threads here.
But I spend a lot of my evenings and weekends with friends and family (in Colorado) who are very distant from our "scene". And I'm telling you, in my twenty-year career, I have lived the evolution from "the internet is so awesome, google search is amazing, you know how to do software, that's so cool!" to "I don't know how you can stomach working in that industry". Sure, the media has had some impact on this, but we've also been super arrogant, have screwed up tons of stuff that is visible and salient to lots of people, and have seemed completely oblivious to this.
This episode is just one more example of that trend, and I think it's crazy to think "nah, this is all fine, nothing to see here".
Notice how the only asterisk there is "it's technically not her voice, it's just someone who they picked because she sounded just like her"
How's that self driving coming along?
How about the Cybertruck being a few years late?
How about the low cost car being cancelled?
How about the you have all the hardware you need - oh wait oopsie you need to pay us thousands more
How about the taking Tesla private tweet?
How about the repeatedly and flagrantly violating government contracts that are basically his company's only revenue because he's too powerful for consequences?
How about...
Definitely the real deal.
When the reaction is "it doesn't matter, it's still not ok to copy someone's voice and then market it as being that person's voice or related to that person's voice" and your reaction is to cast that as being something else, it demonstrates you are not openly approaching things in good faith.
There's no proof needed. A marketer doesn't market something for no reason.
We are all capable of interpreting his statement and forming an opinion about its intent. Indeed, the entire point of making any statement is for others to form an opinion about it. That doesn't make our opinion invalid - nor does the whining and backpedaling of the person who made the statement.
Your opinion may be different than others, but I doubt that would be the case if you were truly approaching this situation in an unbiased way.
> She doesn't need "go away" payments
> According to celebrity net worth website, SJ is worth $165M.
I have no idea what Johansson's estimated net worth or her acting career has to do with this. Wealthy people sue all the time for all kinds of ridiculous things.
The voice is, in fact, not Johansson's. Yet it appears she will be suing them nonetheless...
It's not illegal to sound like someone else - despite what people might be claiming. If it turns out to be true that Sky's voice actor was recorded prior to the attempted engagement with Johansson, then all of this is extra absurd.
Also, Sky doesn't sound like Johansson anyway... but apparently that isn't going to matter in this situation.
That is not decided. There have been high-profile cases where someone's likeness was explicitly used without permission and they still had no recourse. It was argued the person was notable enough that they could not protect their likeness.
Regardless, it appears to be debated whether Sky even sounded like Johansson, which will make this very difficult for anyone to argue (being subjective and all). If the Sky voice actor was recorded prior to OpenAI engaging with Johansson (which OpenAI has claimed), then it becomes even more difficult to argue.
In the end, this will net Johansson a nice "go away" payday and then everyone will forget about it.
> Well, from the start, Altman wanted SJ to do the voice. Perhaps he'd never seen or heard of the movie "her", and the association is just coincidental? After "she" said no, then Altman auditions a bunch of voice talent and picks someone who sounds just like SJ.
I suppose some might imagine asserting the opposite to be a distinct concept from disputing it, but there you have it. You should be able to find a link quite easily.
This whole idea that someone has to comply with your idea of how one must set goals and reach them is something other people have no obligation to measure up to. Also, what's the deal about his lies? He can say whatever he wants and not get there. He hasn't exactly sworn an oath to you or anyone else, so he's not in error for failing to measure up.
Musk might not get to Mars, he might end up mining asteroids or something. That is ok. That doesn't make him a conman.
tl;dr: Anyone can say, work toward, and fail at anything they want. And they don't owe anybody an explanation for a darn thing.
> Of course, this explanation only goes so far. We don’t know whether anyone involved in choosing Sky’s voice noted the similarity to Johansson’s, for example. And given how close the two voices sound to most ears, it might have seemed strange for the company to offer both the Sky voice and the Johansson voice, should the latter actor have chosen to participate in the project. [...] And I still don’t understand why Altman reportedly reached out to Johansson just two days before the demonstration to ask her to reconsider.
They absolutely have not earned the benefit of the doubt. Just look at their reaction to the NDA / equity clawback fiasco [2], and their focus on lifelong non-disparagement clauses. There's a lot of smoke there...
[1] https://www.platformer.news/openai-scarlett-johansson-chatgp...
[2] https://www.vox.com/future-perfect/351132/openai-vested-equi...
He can set all the goals he wants. Setting a goal is not the same as telling people the company that you are dictator of is going to do something.
He's not setting goals, he is marketing, and he does it very well.
As far as how he's a conman, see >>40462194 - although you already know that full well, so you'll continue thinking he's some sort of hero.
"Whether or not Altman wanted SJ's voice" and "whether or not they got someone else to do SJ's voice before asking SJ to do it" are two completely independent matters.
What does 'after' mean to you? Strange response.
Yes
> Say I wanted to make my AI voice sound like Patrick Stewart
Don't tweet "engage" or "boldly go where no man has gone before" when you release the product and you should be ok.
You know that this is exactly how SpaceX won big? There were many competent, credentialed people telling Musk that reusable rockets were a pipe dream, all the way up to the first Falcon 9 landing and reflight. Some of them even continued giving such "competent and informed" advice for many months afterwards.
> Setting a goal is not the same as telling people the company that you are dictator of is going to do something.
That's literally what it means, though.
Even if this did go down the way you suppose, once they realized the obvious similarities, the ethical thing to do was to not use the voice. It doesn’t matter if the intention was pure. It doesn’t matter if it was an accident.
Isla Fischer / Amy Adams
Margot Robbie / Samara Weaving / Jaime Pressly
Justin Long / Ezra Miller
Not to mention all of the token/stereotype characters where it hardly matters who the actor is at all. Need to fill the funny fat lady trope? If Rebel Wilson wasn't available, maybe they can get Melissa McCarthy.
The voice from Her isn't even the first voice I'd think of for a female computer voice. That trope has been around for decades. I'm sure OpenAI just wanted SJ specifically because she's currently one of the most popular celebrities in the world.
People lose their rational mind when it comes to people they hate (or the opposite I suppose). I don't care for Sam Altman, or OpenAI one way or another, so it was quite amusing to watch the absolute outrage the story generated, with people so certain about their views.
This is where your post breaks down. Many people say they don't think the voice sounds like SJ. Others do. But it appears you've made up your mind that they deliberately emulated her voice?
Yes this is pretty typical. The CEO doesn’t make all decisions. They hire people to make decisions. A company’s head of legal could definitely make decisions about what standard language to use in documents on their own.
> The appellate court ruled that the voice of someone famous as a singer is distinctive to their person and image and therefore, as a part of their identity, it is unlawful to imitate their voice without express consent and approval. The appellate court reversed the district court's decision and ruled in favor of Midler, indicating her voice was protected against unauthorized use.
1. OpenAI wants to make a voice assistant.
2. They hire the voice actor.
3. Someone at OpenAI wonders why they would make a voice assistant that doesn't sound like the boss's favorite movie.
4. They reach out to SJ, who tells them to pound sand.
Accordingly, there is no misappropriation because there is no use.
But you need to understand that it does sound like ScarJo to a lot of people. Maybe 50% of the people who hear it.
Those kinds of coincidences are the things that make you lose in court.
You are suggesting that it is coincidence that they contacted SJ to provide her voice, hired a voice actor who sounds like her, contacted SJ again prior to launch, then chose that specific voice from their library of voices, and tweeted the name of the movie SJ's voice is in as part of the promo?
I haven't suggested what they have done is illegal (the fictional company that created the AI in "Her" is unlikely to be suing them), but it is CLEARLY what their intent was.
Bryce Dallas Howard & Jessica Chastain
Selena Gomez & Lucy Hale
Amy Adams & Isla Fisher
Keira Knightley & Natalie Portman
Not to mention that anytime an actor ages out of the early adulthood age range, a lookalike begins to play the same roles.
If you can't think of any examples, it's probably because you haven't been able to tell them apart yourself.
All the reporting around this I’ve seen uses incredibly short clips. There are hours of recorded audio of SJ speaking and there are lots of examples of the Sky voice out there since it’s been around since September.
The real source of your frustration, though, is that he does not fully subscribe to the neoliberal dogma and lets millions of others say whatever they like. The totalitarian left can't handle that; it's a visceral reaction.
"We hold only that when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California."
Ford having multiple ads would not have changed the determination.
It doesn't even need to sound like the person. It's about the intent. Did OpenAI intend to imitate ScarJo?
Half the people of the world thinking it's ScarJo is strong evidence that it's not an accident.
Given that "Her" is Sam's favorite movie, and that he cryptically tweeted "her" the day it launched, and that he reached out to ScarJo to do the voice, and that the company reached out to her again to reconsider two days before the launch -
I personally think the situation is very obvious. I understand that some people strongly disagree - but then there are some people who think the Earth is flat. So.
And there were also people saying it could be done. Where were those people for self-driving? (Oh right they were in the Facebook comment section with no relevant knowledge.)
> That's literally what it means, though.
No, it's not at all.
A goal is personal, or perhaps organizational. You need not announce something on Twitter in order to set a goal for yourself or for your company.
LOOOOOOOOOOOOOOOOOL
Anyway, that would be cool if it were true, but it doesn't change what a conman he is. He said he was gonna make it true how long ago? Yeah.
> The real source of your frustration though is that he does not fully subscribe to the neoliberal dogma and lets millions of others to say whatever they like. Totalitarian left can’t handle that, it’s a visceral reaction
That's not even remotely true. Seriously. No part of it.
- It's been obvious to me that he was a conman since he started lying at Tesla, a decade or more ago. 'Fraid to say that alone disproves your unfounded personal attack since he didn't own a social media platform at the time.
- He doesn't allow people to say whatever they like unless he agrees with them.
- He (ok, and perhaps you) is the only totalitarian in this conversation.
But hey good job trying to make this out to be about politics instead of about what a terrible human Musk is, I know that's the only way for you Repugnantcans to cope with the cognitive dissonance.
Do you think they only got one person to do it or something?
If they weren't looking at multiple options then ... why did they still ask SJ?
Oh right because they asked SJ because they hadn't picked yet.
For example, if someone anonymously used a celebrity's likeness to promote something, you wouldn't need to identify the person (which would be necessary to prove intent) in order to have the offending material removed or prevented from being distributed.
And yet so many people think it does. What a weird coincidence.
> there’s a fair amount of proof to back up the claim that it wasn’t meant to
Sam has said his favorite movie is "Her". Sam tweeted "her" the day it was released. Sam wrote to ScarJo to try to get her to do the voice. OpenAI wrote to her two days before the release to try to get her to change her mind. A large percentage of the people who hear the voice thinks it sounds like ScarJo.
I think we're just going to have to agree to disagree about what the evidence says. You take care now.
The closing statement of Midler v. Ford is:
"We hold only that when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California."
Deliberate is a synonym for intentional.
It might make sense for intent to be required in order to receive damages, but it would surprise me if you couldn't stop an inadvertent use of someone's likeness. In fact, the Midler opinion cites the Motschenbacher case: 'The defendants were held to have invaded a "proprietary interest" of Motschenbacher in his own identity.' I think you can invade someone's "proprietary interest" inadvertently, just as you can take someone's property inadvertently; and courts can impose a corrective in both cases: in the first by ordering that the invasion of the proprietary interest be stopped, and in the second by returning the taken property.
Wait, which one is the "off-brand" here?
> Bryce Dallas Howard & Jessica Chastain
I must confess I used to confuse these two!
Another one you didn't mention: until relatively recently, I thought actors Elias Koteas (Exotica) and Christopher Meloni (Law & Order: Special Victims Unit) were the same person!
I'm not familiar with all the case law but I assume that no case has been brought that directly speaks to the issue but people can and do discuss cases that don't yet have specific precedent.
Just seems like this area isn't that exotic.
And the legal universe is vast with new precedent case law being made every year so I don't think the corpus of undecided law is confined to well known legal paradoxes.
As for this case, it doesn't seem that odd to me that the question of intent has never been litigated: I would expect that typically the intent is obvious (as it is in the OpenAI case), so no one has ever had to decide whether it mattered.
I don't see much merit in continuing our discussion. You take care now.