Facial recognition produces potential matches. It's still up to humans to review the footage and use their judgment as to whether it's actually the same person, and whether the other elements of the case fit the suspect.
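To be concrete about what "potential matches" means: these systems return ranked candidates with similarity scores, never a definitive ID. A minimal sketch, assuming unit-normalized face embeddings from some encoder (the function name and threshold here are made-up placeholders):

    import numpy as np

    def rank_candidates(probe, gallery, threshold=0.6):
        # probe: unit-normalized embedding of the face to identify
        # gallery: dict of person_id -> unit-normalized embedding
        # threshold: arbitrary cutoff; real systems tune this, and the
        #   false-match rate it yields varies across demographic groups
        scores = {pid: float(np.dot(probe, emb))
                  for pid, emb in gallery.items()}
        hits = [(pid, s) for pid, s in scores.items() if s >= threshold]
        # The output is a list of investigative *leads* for a human
        # to verify, not an identification.
        return sorted(hits, key=lambda kv: -kv[1])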
The problem here is 100% on the cop(s) who made that call for themselves, or who intentionally ignored obvious differences. (Of course, without seeing the actual images in question, it's hard to judge.)
There are plenty of dangers with facial recognition (like using it at scale, or tracking people without accountability), but this doesn't seem to be one of them.
I disagree. There is plenty of blame on the cops who made that call for themselves, true.
But there doesn't have to be a single party at fault. The facial recognition software is badly flawed in this dimension: it's well established that current technologies are racially biased. So some of the fault also lies with the developer of that technology, with the purchasing officer at the police department, and with a criminal justice system that allows it to be used this way.
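For context, that bias is established by measuring error rates separately per demographic group (as NIST's FRVT evaluations do), and the false-match rate on different-person pairs can differ dramatically between groups. A toy sketch of that measurement, with made-up input fields:

    from collections import defaultdict

    def false_match_rate_by_group(trials):
        # trials: iterable of (group, is_same_person, matcher_said_match)
        # A false match = the matcher says "match" on a
        # different-person pair. Comparing this rate across groups
        # is how demographic bias gets quantified.
        impostors = defaultdict(int)
        false_matches = defaultdict(int)
        for group, same, matched in trials:
            if not same:  # impostor (different-person) pair
                impostors[group] += 1
                if matched:
                    false_matches[group] += 1
        return {g: false_matches[g] / impostors[g] for g in impostors}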
Reducing a complex problem to a single at-fault person produces an analysis that often lets other issues continue to fester. Consider if the FAA always stopped its analysis of air crashes at "the pilot made an error, so we won't take any corrective action other than punishing the pilot." Air travel wouldn't be nearly as safe as it is today.
While we should hold these officers responsible for their mistake (abolish qualified immunity so that these officers can be sued civilly for the wrongful arrest!), we should also fix the other parts of the system that are obviously broken.
Who decided to use this software for this purpose, despite its known flaws and well-established bias? The buck stops with the cops.
Plenty of blame to go around.