Probably the clearest example is in the section on stereotypes. Again, the problems are solidly identified, but the understanding of their causes is lacking. The implied surprise that an algorithm trained on real-world data would associate being a doctor with being male and being a nurse with being female is highly suspect. There is plenty of reason to assume a social-bias feedback loop, as discussed at the beginning of the article, but the argument that this is a result of researcher bias is unsound. The correlation is consistent in the statistical data across every society and culture on Earth. Again, there is plenty of reason to argue the results reflect societal gender bias, but not researcher gender bias. Big difference. The race/skin-color correlations argued in the article, by contrast, do have sound arguments for researcher bias.
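For concreteness, this is roughly how such doctor/nurse associations are usually surfaced: project occupation words onto a he/she direction in a learned embedding space. The sketch below uses made-up toy vectors purely for illustration (a real probe would load embeddings trained on a large corpus), so the numbers mean nothing beyond showing the mechanics.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Made-up 3-dimensional vectors standing in for learned word embeddings.
# In embeddings trained on real text, "doctor" and "nurse" pick up the
# gendered usage statistics of the training corpus.
emb = {
    "he":     np.array([ 1.0, 0.1, 0.0]),
    "she":    np.array([-1.0, 0.1, 0.0]),
    "doctor": np.array([ 0.6, 0.8, 0.1]),
    "nurse":  np.array([-0.6, 0.8, 0.1]),
}

# A crude "gender direction": the difference between the pronoun vectors.
gender_direction = emb["he"] - emb["she"]

for word in ("doctor", "nurse"):
    score = cosine(emb[word], gender_direction)
    leaning = "male" if score > 0 else "female"
    print(f"{word:6s} projection onto he-she axis: {score:+.2f} ({leaning}-leaning)")
```

The point of the probe is that the association comes out of the corpus statistics the model was fit to, not out of anything the researcher hand-coded, which is exactly why it indicates societal rather than researcher bias.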
It literally says: “It’s not that data can be biased. Data is biased.” Know how your data was generated and what biases it may contain; we are encoding, and even amplifying, societal biases in the algorithms we create.
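The amplification part can be made concrete with a small simulation (entirely synthetic numbers, not taken from the article): when one group makes up 70% of a training set and the available feature is weak, an accuracy-optimal decision rule leans on that prior and ends up predicting the majority group noticeably more than 70% of the time.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Synthetic "training data": 70% of examples belong to group A.
is_a = rng.random(n) < 0.70
# A weak, noisy feature: centered at +1 for group A, -1 for group B, sigma = 2.
feature = np.where(is_a, 1.0, -1.0) + rng.normal(0.0, 2.0, n)

# Accuracy-optimal (Bayes) rule for this setup: the log-likelihood ratio of the
# two Gaussians is x * (mu_A - mu_B) / sigma^2 = x / 2; add the log prior odds.
log_odds_a = feature / 2.0 + np.log(0.70 / 0.30)
pred_a = log_odds_a > 0

print(f"share of group A in the data:  {is_a.mean():.1%}")   # ~70%
print(f"share of group-A predictions:  {pred_a.mean():.1%}")  # ~83%, amplified
```

The predicted share exceeds the data's share because the rule falls back on the prior whenever the feature is ambiguous, which is one simple mechanism behind the "amplifying" claim.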