zlacker

[return to "Imagen, a text-to-image diffusion model"]
1. discmo+gF 2022-05-24 01:46:30
>>kevema+(OP)
For people complaining that they can't play with the model... I work at Google and I also can't play with the model :'(
2. interb+BA1 2022-05-24 11:32:42
>>discmo+gF
I think they address some of the reasoning behind this pretty clearly in the write-up as well?

> The potential risks of misuse raise concerns regarding responsible open-sourcing of code and demos. At this time we have decided not to release code or a public demo. In future work we will explore a framework for responsible externalization that balances the value of external auditing with the risks of unrestricted open-access.

I can see the argument here. It would be super fun to test this model's ability to generate arbitrary images, but "arbitrary" also leaves room for a lot of distasteful stuff. Add to that this point:

> While a subset of our training data was filtered to remove noise and undesirable content, such as pornographic imagery and toxic language, we also utilized LAION-400M dataset which is known to contain a wide range of inappropriate content including pornographic imagery, racist slurs, and harmful social stereotypes. Imagen relies on text encoders trained on uncurated web-scale data, and thus inherits the social biases and limitations of large language models. As such, there is a risk that Imagen has encoded harmful stereotypes and representations, which guides our decision to not release Imagen for public use without further safeguards in place.

That said, I hope they're serious about the "framework for responsible externalization" part, both because it would be really fun to play with this model and because it would be interesting to test it outside of their hand-picked examples.

3. Siira+dS2 2022-05-24 18:15:39
>>interb+BA1
> harmful stereotypes and representations

So we can't have this model because of ... the mere possibility of stereotypes? With this logic, humans should all die, as we certainly encode some nasty stereotypes in our brains.

This kind of dishonest excuse for not giving back to the community is not unexpected at this point, but seeing apologists for it here is.
