zlacker

[parent] [thread] 6 comments
1. tines+(OP)[view] [source] 2022-05-23 23:09:39
I'd say it doesn't actually matter, as long as the population sampled is made clear to the user.

If I ask for pictures of Japanese people, I'm not shocked when all the results are of Japanese people. If I ask for "criminals in the United States" and all the results are black people, that should concern me, not because the data set is biased but because the real world is biased and we should do something about that. The difference is that I know what set I'm asking for a sample from, and I can react accordingly.

replies(3): >>magica+T3 >>nyolfe+g5 >>jfoste+DT
2. magica+T3[view] [source] 2022-05-23 23:40:56
>>tines+(OP)
> If I ask for "criminals in the United States" and all the results are black people, that should concern me, not because the data set is biased

Well, the results would unquestionably be biased. All results being black people wouldn't reflect reality at all, and hurting feelings to enact change seems like a poor justification for incorrect results.

> I'd say it doesn't actually matter, as long as the population sampled is made clear to the user.

Ok, and let's say I ask for "criminals in Cheyenne, Wyoming" and it doesn't know the answer to that. Should it just do its best to answer? Seems risky if people are going to get fired up about it and act on this to get "real change".

That seems like a good parallel to what we're talking about here, since it's very unlikely that crime statistics were fed into this image-generating model.

3. nyolfe+g5[view] [source] 2022-05-23 23:53:02
>>tines+(OP)
> If I ask for "criminals in the United States" and all the results are black people,

Curiously, this search actually only returns white people for me on GIS.

4. jfoste+DT[view] [source] 2022-05-24 08:23:42
>>tines+(OP)
In a way, if the model brings back an image for "criminals in the United States" that isn't based on the statistical reality, isn't it essentially complicit in sweeping a major social issue under the rug?

We may not like what it shows us, but blindfolding ourselves is not the solution to that problem.

replies(1): >>webmav+nM7
5. webmav+nM7[view] [source] [discussion] 2022-05-26 08:04:34
>>jfoste+DT
At the very least, we should expect the results not to be more biased than reality. Not all criminals are Black. Not all are men. Not all are poor. If the model (which is stochastic) only outputs poor Black men, rather than a distribution closer to reality, it is exhibiting bias, and it is fair to ask why the data it picked that bias up from does not reflect reality.
replies(1): >>jfoste+BN7
6. jfoste+BN7[view] [source] [discussion] 2022-05-26 08:18:15
>>webmav+nM7
Yeah, it makes sense for the results to simply reflect reality as closely as possible. No bias in any direction is desirable.
replies(1): >>webmav+CNa
7. webmav+CNa[view] [source] [discussion] 2022-05-27 09:05:19
>>jfoste+BN7
Sarcasm, eh? At least there's no way THAT could be taken the wrong way.