Some of the reasoning:
>Preliminary assessment also suggests Imagen encodes several social biases and stereotypes, including an overall bias towards generating images of people with lighter skin tones and a tendency for images portraying different professions to align with Western gender stereotypes. Finally, even when we focus generations away from people, our preliminary analysis indicates Imagen encodes a range of social and cultural biases when generating images of activities, events, and objects. We aim to make progress on several of these open challenges and limitations in future work.
Really sad that breakthrough technologies are going to be withheld due to our inability to cope with the results.
Maybe that's a nice thing; I wouldn't say their values are wrong, but let's call a spade a spade.
Translation: we need to hand-tune this to not reflect reality
Is it reflecting reality, though? Seems to me that (as with any ML system) it's reflecting the training corpus.

Furthermore, is it this thing's job to reflect reality?
the world as we (Caucasian/Asian male American woke upper-middle class San Francisco engineers) wish it to be
Snarky answer: Ah, yes, let's make sure that things like "A giant cobra snake on a farm. The snake is made out of corn" reflect reality.

Heartfelt answer: Yes, there is some of that wishful thinking or editorializing. I don't consider it to be erasing or denying reality. This is a tool that synthesizes unreality. I don't think that such a tool should, say, refuse to synthesize an image of a female POTUS because one hasn't existed yet. This is art, not a reporting tool... and keep in mind that art not only imitates life but also influences it.