zlacker

[return to "Facial Recognition Leads To False Arrest Of Black Man In Detroit"]
1. danso+02[view] [source] 2020-06-24 14:55:32
>>vermon+(OP)
This story is really alarming because as described, the police ran a face recognition tool based on a frame of grainy security footage and got a positive hit. Does this tool give any indication of a confidence value? Does it return a list (sorted by confidence) of possible suspects, or any other kind of feedback that would indicate even to a layperson how much uncertainty there is?

The issue of face recognition algorithms performing worse on dark faces is a major problem. But the other side of it is: would police be more hesitant to act on such fuzzy evidence if the top match appeared to be a middle-class Caucasian (i.e. someone more likely to pursue legal recourse)?

2. strgcm+w7[view] [source] 2020-06-24 15:18:43
>>danso+02
I think the NYT article has a little more detail: https://www.nytimes.com/2020/06/24/technology/facial-recogni...

Essentially, an employee of the facial recognition provider forwarded an "investigative lead" for the match the system generated (the match does have a score associated with it on the provider's side, but it's not clear whether that score is communicated to the detectives as well), and the detectives then put the photo of this man into a "6 pack" photo line-up, from which a store employee identified him as the suspect.
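For a rough picture of where that provider-side score comes from: these systems typically embed each face as a vector and rank a gallery of photos by similarity to the probe image, so some number exists in the pipeline even if it never reaches the detective's desk. A made-up Python sketch of that ranking step (nothing here reflects the actual vendor's API; the names, vectors, and the 0.6 threshold are invented):

    # Hypothetical sketch: rank a mugshot gallery against a probe face by
    # cosine similarity of embeddings. All data here is randomly generated,
    # so the "matches" are meaningless -- the point is the ranked, scored list.
    import numpy as np

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    rng = np.random.default_rng(0)
    gallery = {f"subject_{i}": rng.normal(size=128) for i in range(1000)}  # stand-in mugshot embeddings
    probe = rng.normal(size=128)                                           # stand-in CCTV frame embedding

    ranked = sorted(((cosine(probe, v), name) for name, v in gallery.items()), reverse=True)
    for score, name in ranked[:5]:
        label = "low confidence" if score < 0.6 else "candidate"
        print(f"{name}: similarity={score:.3f} ({label})")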

Everyone involved will probably point fingers at each other: the provider, for example, put a large heading on their communication saying "this is not probable cause for an arrest, this is only an investigative lead, etc."; the detectives will say they got a hit from a line-up and blame the witness; and the witness would probably say the detectives showed them a line-up and he seemed like the right guy (or, as is often the case with line-ups, the detectives may have exerted a huge amount of bias/influence over the witness).

EDIT: Just to be clear, none of this is to say that the process worked well or that I condone this. I think the data, the technology, the processes, and the level of understanding on the side of the police are all insufficient, and I do not support how this played out, but I think it is easy enough to provide at least some pseudo-justification at each step along the way.

3. bryanr+le2[view] [source] 2020-06-25 04:40:53
>>strgcm+w7
>into a "6 pack" photo line-up

How did the people in the 6 pack photo line-up match up against the facial recognition? Were they likely matches?

4. MertsA+ms2[view] [source] 2020-06-25 07:30:48
>>bryanr+le2
No clue whether the police used other facial recognition matches for the rest of the line-up, but normally the alternates need to be around the same height, build, and complexion as the subject. I would think including multiple potential matches would be a huge no-no, simply because your alternates need to be people you know are not a match. If you just grab the six most similar faces and ask the victim to choose, what do you do when they pick the third-closest match?
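A toy sketch of the difference (everything in it is made up for illustration):

    # Contrast two ways to build a "6 pack": the naive way (the six closest
    # algorithmic matches) vs. the conventional way (one suspect plus fillers
    # already known to be innocent). All names here are placeholders.
    import random

    random.seed(1)
    ranked_matches = [f"match_{i:02d}" for i in range(1, 21)]   # ordered by similarity score
    known_innocent = [f"filler_{i:02d}" for i in range(1, 51)]  # e.g. verifiably elsewhere at the time

    # Naive: whoever the witness picks looks "confirmed", even if it's only
    # the third- or fourth-closest look-alike.
    naive_lineup = ranked_matches[:6]

    # Conventional: picking anyone other than the suspect is a known miss,
    # which is what makes the line-up a meaningful test.
    suspect = ranked_matches[0]
    lineup = [suspect] + random.sample(known_innocent, 5)
    random.shuffle(lineup)

    print("naive:       ", naive_lineup)
    print("conventional:", lineup)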
5. bryanr+1u2[view] [source] 2020-06-25 07:48:23
>>MertsA+ms2
Well, you may know some people are not a match because you know where they were; for example, the pictures could be of people who were incarcerated at the time of the crime.