zlacker

[return to "Facial Recognition Leads To False Arrest Of Black Man In Detroit"]
1. w_t_pa+It2[view] [source] 2020-06-25 07:44:24
>>vermon+(OP)
Perhaps we, as technologists, are going about this the wrong way. Maybe, instead of trying to reduce the false alarm rate to an arbitrarily low number, we should develop CFAR (constant false alarm rate) systems, so that users of the system know they will get some false alarms and develop procedures for responding appropriately. That way, we could get the benefit of the technology whilst also ensuring that the system as a whole (man and machine together) is designed to be robust and has appropriate checks and balances.
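For the curious: a minimal sketch of the idea, borrowed from radar signal processing (cell-averaging CFAR). The function name and parameters here are illustrative, not from any particular library; the point is that the detection threshold adapts to local noise so the false alarm rate stays fixed rather than being driven "arbitrarily low".

```python
def ca_cfar(samples, num_train=8, num_guard=2, pfa=1e-3):
    """Cell-averaging CFAR: flag cells whose value exceeds an adaptive
    threshold chosen so the false alarm probability stays at pfa,
    whatever the local noise level is."""
    n = len(samples)
    # Scaling factor that yields P(false alarm) = pfa for exponential noise.
    alpha = num_train * (pfa ** (-1.0 / num_train) - 1.0)
    detections = []
    for i in range(n):
        lo = i - num_guard - num_train
        hi = i + num_guard + num_train
        if lo < 0 or hi >= n:
            continue  # skip edges where the training window is incomplete
        # Estimate local noise from training cells on both sides,
        # excluding the guard cells around the cell under test.
        train = samples[lo:i - num_guard] + samples[i + num_guard + 1:hi + 1]
        noise = sum(train) / len(train)
        if samples[i] > alpha * noise:
            detections.append(i)
    return detections
```

A single strong spike in flat noise gets flagged, and the operator knows in advance roughly how many spurious flags per N cells to expect and can build procedure around that.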
2. gwd+Du2[view] [source] 2020-06-25 07:53:03
>>w_t_pa+It2
If you follow the link, you'll see that the computer report had this message right at the top in massive letters:

THIS DOCUMENT IS NOT A POSITIVE IDENTIFICATION. IT IS AN INVESTIGATIVE LEAD ONLY AND IS _NOT_ PROBABLE CAUSE TO ARREST. FURTHER INVESTIGATION IS NEEDED TO DEVELOP PROBABLE CAUSE TO ARREST.

I mean, what else could the technologists have done?

3. Mangal+GQ2[view] [source] 2020-06-25 11:24:10
>>gwd+Du2
I wonder if the problem here is just the way traditional policing works, not even the technology.
4. noisy_+FU2[view] [source] 2020-06-25 11:57:00
>>Mangal+GQ2
It is indeed the way traditional policing works. The average police officer thinks facial recognition is the visual equivalent of Google and assumes they can rely on its results the same way - they either have no idea about false-positive biases based on race or, worse, are probably too lazy to dig deeper.

Though the suffering of the victims of such wrong matches is real, one consolation is that more such cases will hopefully bring about the much-needed scepticism towards the results, so that some old-fashioned validation and investigation gets done.
