zlacker

[return to "Imagen, a text-to-image diffusion model"]
1. daenz+b5[view] [source] 2022-05-23 21:20:13
>>kevema+(OP)
>While we leave an in-depth empirical analysis of social and cultural biases to future work, our small scale internal assessments reveal several limitations that guide our decision not to release our model at this time.

Some of the reasoning:

>Preliminary assessment also suggests Imagen encodes several social biases and stereotypes, including an overall bias towards generating images of people with lighter skin tones and a tendency for images portraying different professions to align with Western gender stereotypes. Finally, even when we focus generations away from people, our preliminary analysis indicates Imagen encodes a range of social and cultural biases when generating images of activities, events, and objects. We aim to make progress on several of these open challenges and limitations in future work.

Really sad that breakthrough technologies are going to be withheld due to our inability to cope with the results.

◧◩
2. ceeplu+Q5[view] [source] 2022-05-23 21:24:11
>>daenz+b5
The ironic part is that these "social and cultural biases" are judged purely through a Western, American lens. The people writing that paragraph seem oblivious to the idea that there could be cultures other than the Western American one. In attempting to prevent "encoding of social and cultural biases," they have encoded such biases into their own research.
◧◩◪
3. not2b+N6[view] [source] 2022-05-23 21:29:22
>>ceeplu+Q5
It seems you've got it backwards: "tendency for images portraying different professions to align with Western gender stereotypes" means that they are calling out their own work precisely because it is skewed in the direction of Western American biases.
◧◩◪◨
4. ceeplu+i7[view] [source] 2022-05-23 21:31:32
>>not2b+N6
Yes, the idea is that just because the model doesn't align with Western ideals of what counts as unbiased doesn't mean the same holds for other cultures. By declining to release the model because it doesn't conform to Western, left-wing cultural expectations, the authors are ignoring the diversity of cultures that exist globally.
◧◩◪◨⬒
5. howint+0c[view] [source] 2022-05-23 21:58:16
>>ceeplu+i7
No, it's coming from a perspective of moral realism. It's an objective moral truth that racial and ethnic biases are bad. Yet most cultures around the world are racist to at least some degree, and to the extent that they are, they are bad.

The argument you're making, paraphrased, is that the idea that biases are bad is itself situated in particular cultural norms. While that is true to some degree, from a moral realist perspective we can still objectively judge those cultural norms to be better or worse than alternatives.

◧◩◪◨⬒⬓
6. tomp+ri[view] [source] 2022-05-23 22:36:41
>>howint+0c
You're confused by the double meaning of the word "bias".

Here we mean mathematical biases.

For example, a good mathematical model will correctly tell you that people in Japan (a geographic category) are more likely to be Japanese (an ethnic/racial category) than people elsewhere. That's not "objectively morally bad"; it's simply "correct".
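To make the point concrete, here is a minimal sketch of what "bias" means in the statistical sense. The numbers and function names are hypothetical, for illustration only, not from any real survey: a predictor that simply follows the real-world base rate is "biased" toward the majority class, and that bias is exactly what makes it accurate.

```python
# Toy illustration (hypothetical numbers): a predictor that reflects
# real-world base rates is "biased" in the statistical sense, and that
# bias is what makes its guesses accurate on average.

# Assumed base rate for illustration: most residents of Japan are Japanese.
BASE_RATE_JAPANESE = 0.97  # hypothetical figure, not an official statistic

def predict_nationality(location: str) -> str:
    """Guess the most likely nationality from location alone."""
    if location == "Japan":
        # The maximum-likelihood guess follows the majority class.
        return "Japanese"
    return "unknown"

# Expected accuracy of this one-feature predictor equals the base rate.
accuracy = (
    BASE_RATE_JAPANESE
    if predict_nationality("Japan") == "Japanese"
    else 1 - BASE_RATE_JAPANESE
)
print(accuracy)  # 0.97
```

Whether such a skew is a problem is a separate, moral question; the model itself is just tracking frequencies in its data.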

◧◩◪◨⬒⬓⬔
7. astran+do[view] [source] 2022-05-23 23:19:07
>>tomp+ri
Although what you stated is true, it's usually offered as shorthand for the commonly repeated but untrue claim that "98% of Japan is ethnically Japanese".

1. That figure comes from a 2006 report.

2. It's a misreading: the figure refers to "Japanese citizens", and the government in fact doesn't track ethnicity at all.

Also, the last time I was in Japan (Jan '20) there were literally ten times more immigrants everywhere than on my previous trip. Japan is full of immigrants from the rest of Asia these days. They all speak perfect Japanese too.
