That feels off. When I watch an actor on screen conveying emotions, there's no actual human being feeling those emotions at the moment I watch the movie. Very dumb machines have been rendering emotions convincingly that way for a while, and their rendering affects our own emotional state.
Emotions expressed through tone of voice are just one means of nonverbal communication. We should expect more of them to develop and become more widely available next.
In a way, we're lucky that all gpt-4o seems hell-bent on communicating so far is how cheerful and happy it is, because that's certainly not the only option.
Humans can be manipulated through nonverbal communication in a way that's harder to consciously spot than through words, and a model able to craft its "emotional output" would not be far from being able to use it to adjust its interlocutor's or audience's frame of mind.
I for one look forward to the arrival of our increasingly charismatic and oddly convincing LLMs.