zlacker

[return to "Facial Recognition Leads To False Arrest Of Black Man In Detroit"]
1. hpoe+V1[view] [source] 2020-06-24 14:55:21
>>vermon+(OP)
I don't think using facial recognition to help identify probable suspects is necessarily wrong, but arresting someone based on a facial-match algorithm alone is definitely going too far.

Of course, I partly blame the AI/ML hucksters for this mess: they've sold us the idea of machines replacing, rather than augmenting, human decision making.

2. dafoex+d5[view] [source] 2020-06-24 15:09:22
>>hpoe+V1
In a world where some police forces don't use polygraph lie detectors because they are deemed too inaccurate, it baffles me that people would make an arrest based on a facial recognition hit from poor quality data.

But no, it's AI, it's magical, and it must be right.

3. treis+4x[view] [source] 2020-06-24 16:49:03
>>dafoex+d5
This seems similar to self-driving cars, where people hold the computer to much higher standards than they hold humans. I don't have solid proof, but I suspect that facial recognition with a reasonable confidence threshold and reasonable source images is more accurate than eyewitness identification, if for no other reason than that the bar for a positive eyewitness ID is laughably low.

The current best practice is to have a witness pick the suspect out of six photos. Right off the bat, that means a witness who is simply guessing has roughly a 1-in-6 (about 17%) chance of picking the "right" person. It's a terrible way to do things, and it's no surprise that people are wrongly convicted again and again on eyewitness testimony.
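For what it's worth, here's a quick simulation of that baseline (Python; treating the photo at index 0 as the "right" person is just my stand-in, not anything about real lineup procedure):

    import random

    # Monte Carlo check of the point above: a witness who is effectively
    # guessing still picks the "right" person in about 1 of every 6 lineups.
    TRIALS = 100_000
    hits = sum(random.randrange(6) == 0 for _ in range(TRIALS))  # index 0 = the "right" photo
    print(f"Random-guess hit rate: {hits / TRIALS:.1%}")  # ~16.7%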

4. TheCoe+Ap3[view] [source] 2020-06-25 15:00:20
>>treis+4x
You need a much higher standard of accuracy for facial recognition because it is applied indiscriminately to a large population. Even at 99.9% accuracy, running it against 10,000 people yields about 10 false positives on average.
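Rough arithmetic, assuming a single genuine match in the searched set and that the true match is always flagged (both assumptions are mine, for illustration):

    # Back-of-the-envelope base-rate arithmetic. The population size and
    # accuracy are the numbers from the comment above; the single-true-match
    # assumption is mine.
    population = 10_000
    false_positive_rate = 0.001   # i.e. 99.9% accuracy on non-matches
    true_matches = 1

    expected_false_positives = (population - true_matches) * false_positive_rate
    print(f"Expected false positives: {expected_false_positives:.1f}")  # ~10

    # Even if the real suspect is always flagged, any individual hit is the
    # real suspect only about 1 time in 11.
    p_hit_is_real = true_matches / (true_matches + expected_false_positives)
    print(f"Chance a given hit is the real match: {p_hit_is_real:.0%}")  # ~9%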