Some of the reasoning:
>Preliminary assessment also suggests Imagen encodes several social biases and stereotypes, including an overall bias towards generating images of people with lighter skin tones and a tendency for images portraying different professions to align with Western gender stereotypes. Finally, even when we focus generations away from people, our preliminary analysis indicates Imagen encodes a range of social and cultural biases when generating images of activities, events, and objects. We aim to make progress on several of these open challenges and limitations in future work.
Really sad that breakthrough technologies are going to be withheld due to our inability to cope with the results.
The argument you're making, paraphrased, is that the idea that biases are bad is itself situated in particular cultural norms. While that is true to some degree, from a moral realist perspective we can still objectively judge those cultural norms to be better or worse than alternatives.
> from a moral realist perspective we can still objectively judge those cultural norms to be better or worse than alternatives
No, because whether one set of biases counts as better than another depends on the set of values you start from. The entire point is that it should not be Google's role to make that judgement; people should be able to make it for themselves.