They could have taken auditions from 50 voice actors, come across this one, thought to themselves "Hey, this sounds just like the actress in 'Her', great, let's use them" and that would be fine. Laurence Fishburne does not own his "welcome to the desert of the real" intonation; other people have it too, and they can be hired to read in it.
Again: the Post has this voice actor reading in the normal voice. This wasn't an impersonator.
Except that is simply not true. If their intent was to sound like Her, and then they chose someone who sounds like Her, then they're in trouble.
You can use impersonators for parody, but not for selling products.
You agree that OpenAI is choosing the voice because it sounds like SJ. How exactly is that different from impersonation?
You don't know that's what happened, but it wouldn't matter either way. Regardless: it is misleading to call that person an "impersonator". I'm confident they don't wake up in the morning and think to themselves "I'm performing SJ" when they order their latte.
The key here is intent. If there was no intention for OpenAI to model the voice after the character Samantha, then you're right, there's no foul.
But as I have explained to you elsewhere, that beggars belief.
We will see the truth when the internal emails come out.
You are suggesting that it is a coincidence that they contacted SJ to provide her voice, hired a voice actor who sounds like her, contacted SJ again prior to launch, then chose that specific voice from their library of voices, and tweeted the name of the movie SJ's voice is in as part of the promo?
I haven't suggested that what they've done is illegal, given that the fictional company that created the AI in "Her" is unlikely to be suing them, but it is CLEARLY what their intent was.