zlacker

[parent] [thread] 2 comments
1. dafoex+(OP)[view] [source] 2020-06-24 15:09:22
In a world where some police forces don't use polygraph lie detectors because they are deemed too inaccurate, it baffles me that people would make an arrest based on a facial recognition hit from poor quality data.

But no, it's AI, it's magical, and it must be right.

replies(1): >>treis+Rr
2. treis+Rr[view] [source] 2020-06-24 16:49:03
>>dafoex+(OP)
This seems similar to self-driving cars where people hold the computer to much higher standards than humans. I don't have solid proof, but I suspect that using facial recognition with a reasonable confidence threshold and reasonable source images is more accurate than eyewitness ID. If for no other reason than the threshold for a positive eyewitness ID is laughably bad.

The current best practice is to have a witness pick out the suspect from 6 photos. Right off the bat, that gives a witness who is simply guessing about a 17% chance of picking the "right" person. It's a terrible way to do things, and it's no surprise that people are wrongly convicted again and again on eyewitness testimony.
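
As a minimal sketch of that random-guess arithmetic (in Python; the 6-photo lineup size comes from the comment, the variable names are just illustrative):

    # Chance that a witness who is purely guessing picks the "right"
    # person out of a 6-photo lineup.
    lineup_size = 6
    p_random_hit = 1 / lineup_size
    print(f"{p_random_hit:.1%}")  # 16.7%, i.e. roughly the 17% quoted above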

replies(1): >>TheCoe+nk3
3. TheCoe+nk3[view] [source] [discussion] 2020-06-25 15:00:20
>>treis+Rr
You need a much higher standard of accuracy for facial recognition because it is applied indiscriminately to a large population. If it has 99.9% accuracy and you apply it to a population of 10,000 people, you will get on average 10 false positives.
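
As a minimal sketch of that false-positive arithmetic (in Python; the 99.9% accuracy and 10,000-person population are the figures above, and it assumes "99.9% accuracy" means a 0.1% false-positive rate applied independently to each person):

    # Expected false positives when a matcher is run indiscriminately
    # over a whole population.
    accuracy = 0.999                     # 99.9% accurate, per the comment
    population = 10_000
    false_positive_rate = 1 - accuracy   # assumed 0.1% per person
    expected_false_positives = false_positive_rate * population
    print(f"{expected_false_positives:.1f}")  # ~10 false positives on average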