zlacker

[return to "Imagen, a text-to-image diffusion model"]
1. daenz+b5[view] [source] 2022-05-23 21:20:13
>>kevema+(OP)
>While we leave an in-depth empirical analysis of social and cultural biases to future work, our small scale internal assessments reveal several limitations that guide our decision not to release our model at this time.

Some of the reasoning:

>Preliminary assessment also suggests Imagen encodes several social biases and stereotypes, including an overall bias towards generating images of people with lighter skin tones and a tendency for images portraying different professions to align with Western gender stereotypes. Finally, even when we focus generations away from people, our preliminary analysis indicates Imagen encodes a range of social and cultural biases when generating images of activities, events, and objects. We aim to make progress on several of these open challenges and limitations in future work.

Really sad that breakthrough technologies are going to be withheld due to our inability to cope with the results.

2. user39+C6[view] [source] 2022-05-23 21:28:28
>>daenz+b5
Translation: we need to hand-tune this to not reflect reality but instead the world as we (Caucasian/Asian male American woke upper-middle-class San Francisco engineers) wish it to be.

Maybe that's a nice thing; I wouldn't say their values are wrong, but let's call a spade a spade.

3. ceejay+A7[view] [source] 2022-05-23 21:33:21
>>user39+C6
"Reality" as defined by the available training set isn't necessarily reality.

For example, Google's image search results pre-tweaking had some interesting ideas about what constitutes a professional hairstyle, and suggested that searches for "men" and "women" should only return light-skinned people: https://www.theguardian.com/technology/2016/apr/08/does-goog...

Does that reflect reality? No.

(I suspect there are also mostly unstated but very real concerns about these being used as child pornography, revenge porn, "show my ex brutally murdered" etc. generators.)

4. ChadNa+Hi[view] [source] 2022-05-23 22:38:33
>>ceejay+A7
You know, it wouldn't surprise me if people talking about how black curly hair shouldn't be seen as unprofessional contributed to Google thinking there's an association between the concepts of "unprofessional hair" and "black curly hair".
5. nearbu+MY[view] [source] 2022-05-24 05:27:21
>>ChadNa+Hi
That's exactly what's happening. Doing the search from the article of "unprofessional hair for work" brings up images with headlines like "It's ridiculous to say that black women's hair is unprofessional". (In addition to now bringing up images from that article itself and other similar articles comparing Google Images searches.)
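To see why this happens, here's a toy illustration of the mechanism (purely hypothetical; real image search is vastly more sophisticated): a bag-of-words relevance scorer has no notion of negation or stance, so a headline arguing *against* a label still matches a query for that label.

```python
# Toy bag-of-words image ranker: scores an image by how many query
# terms appear in the text surrounding it (caption/headline).
# Hypothetical sketch only -- not Google's actual ranking algorithm.

def score(query: str, caption: str) -> int:
    caption_words = set(caption.lower().split())
    return sum(1 for w in query.lower().split() if w in caption_words)

headline = "ridiculous to say black women's hair is unprofessional"
query = "unprofessional hair for work"

# The headline argues the *opposite* of the query's implication,
# but a stance-blind matcher still counts "unprofessional" and "hair".
print(score(query, headline))  # → 2 (of 4 query terms)
```

So an article pushing back against the stereotype ends up ranking for the very query it was refuting.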
6. ceejay+7j2[view] [source] 2022-05-24 15:33:00
>>nearbu+MY
You’re getting cause and effect backwards. The coverage of this changed the results, as did Google’s ensuing interventions.
7. nearbu+aE3[view] [source] 2022-05-24 23:05:15
>>ceejay+7j2
I don't think so. You can set the search options to only find images published before the article, and even find some of the original images.

One image links to the 2015 article, "It's Ridiculous To Say Black Women's Natural Hair Is 'Unprofessional'!". The Guardian article on the Google results is from 2016.

Another image has the headline, "5 Reasons Natural Hair Should NOT be Viewed as Unprofessional - BGLH Marketplace" (2012).

Another: "What to Say When Someone Calls Your Hair Unprofessional".

Also, have you noticed how good and professional the black women in the Guardian's image search look? Most of them look like models with photos taken by professional photographers. Their hair is meticulously groomed and styled. This is not the type of photo an article would use to show "unprofessional hair". But it is the type of photo the above articles opted for.
