Well, the results would unquestionably be biased. Results that are all black people wouldn't reflect reality at all, and hurting feelings to enact change seems like a poor justification for returning incorrect results.
> I'd say it doesn't actually matter, as long as the population sampled is made clear to the user.
Ok, and let's say I ask for "criminals in Cheyenne, Wyoming" and it doesn't know the answer to that. Should it just do its best to answer anyway? That seems risky if people are going to get fired up about it and act on it to push for "real change".
That seems like a good parallel to what we're talking about here, since it's very unlikely that crime statistics were fed into this image-generation model.