zlacker

[parent] [thread] 4 comments
1. visarg+(OP)[view] [source] 2022-05-23 21:28:47
The big labs have become very sensitive about large model releases. It's too easy to make them generate bad PR, to the point that almost none of them are released. Flamingo was also a pretty great vision-language model that wasn't released, not even as a demo. PaLM is supposedly better than GPT-3 but closed off. It will probably take a year for open-source models to appear.
replies(2): >>runner+f2 >>godels+83
2. runner+f2[view] [source] 2022-05-23 21:40:44
>>visarg+(OP)
The largest models, the ones that generate the headline benchmark numbers, never seem to get released, no matter how many years pass.

That makes results very difficult to replicate.

3. godels+83[view] [source] 2022-05-23 21:45:58
>>visarg+(OP)
That's because we're still bad at handling long-tailed data, and people outside the research community don't realize that we prioritize realistic images first, before dealing with long-tailed data (which is the more general form of bias). To be honest, it would be a bit silly to focus on long-tailed data while results still aren't great. That's why we see the constant pattern of getting good results on a dataset and then focusing on the bias in that dataset.

A good example of this is the PULSE[0][1] paper. You may remember it as the "white Obama" one. It became a huge debate, and it was pretty easily shown that the largest factor was dataset bias. The outrage did lead to FFHQ being fixed, but it also sparked a huge argument with LeCun (arguing the bias was data-centric) and Timnit (arguing it was model-centric) at the center. PULSE is still remembered for the bias, though, not for how the authors responded to it. I should also note that there is human bias at play here, since we have a priori knowledge of what the upsampled image should look like (humans are pretty good at this when the small image is already recognizable, but that's a difficult thing to capture in a mathematical metric).
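The dataset-bias mechanism can be shown in a toy sketch. This is not PULSE's actual algorithm (which searches a StyleGAN latent space for a face that downsamples back to the input); the `type_a`/`type_b` categories and the 90/10 split are made up, and the "prior" here is just a frequency table standing in for a learned generative prior:

```python
from collections import Counter

# Hypothetical skewed training set: 90% of faces are "type_a".
# This mimics the demographic imbalance in the original FFHQ.
training_set = ["type_a"] * 90 + ["type_b"] * 10
prior = Counter(training_set)  # frequency table as a stand-in prior

def upsample(consistent_candidates):
    """Many high-res candidates downsample to the same low-res input,
    so the only tie-breaker left is the prior - the dataset mode wins."""
    return max(consistent_candidates, key=lambda c: prior[c])

# A pixelated "type_b" face is still ambiguous after downsampling, so
# the skewed prior resolves it to the majority type every time.
print(upsample(["type_a", "type_b"]))  # type_a
```

The point is that nothing in the search is "wrong"; the model is returning the most likely reconstruction under the distribution it was trained on, which is exactly why fixing the dataset helped more than changing the model.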

It is fairly easy to find adversarial examples where generative models produce biased results. It is FAR harder to fix them. Since this is known by the community but not by the public (and some community members focus on finding these holes rather than fixing them), it creates outrage. It's probably best for the labs to limit their releases.

[0] https://arxiv.org/abs/2003.03808

[1] https://cdn.vox-cdn.com/thumbor/MXX-mZqWLQZW8Fdx1ilcFEHR8Wk=...

replies(2): >>alexb_+d01 >>visarg+Nr2
4. alexb_+d01[view] [source] [discussion] 2022-05-24 06:53:37
>>godels+83
Well, if you showed that pixelated image to someone who had never seen Obama, would they reconstruct him as white? I think so.
5. visarg+Nr2[view] [source] [discussion] 2022-05-24 16:42:33
>>godels+83
> some community members focus on finding these holes but not fixing them

That's what bothered me the most about Timnit's crusade. Throwing the baby out with the bathwater!
