zlacker

[return to "Facial Recognition Leads To False Arrest Of Black Man In Detroit"]
1. jandre+Wi1 2020-06-24 20:24:42
>>vermon+(OP)
It isn't just facial recognition; license plate readers can have the same indefensibly Kafkaesque outcomes, where no one is held accountable for verifying computer-generated "evidence". Systems like the one in the article make mistakes so cheap for the government, and carry so few consequences, that mistakes are simply accepted as a cost of doing business.

Someone I know received vehicular fines from San Francisco on an almost weekly basis, solely from license plate reader hits. The documentary evidence sent with the fines clearly showed her car had been misidentified, but no one ever bothered to check. She was forced to fight each and every fine because they came with a presumption of guilt, and as soon as she cleared one they would send her a new one. The experience became extremely upsetting for her; the entire bureaucracy simply didn't care.

It took threats of legal action against the city for them to set a flag that apparently causes violations attributed to her car to be manually reviewed. The city itself claimed the system was only 80-90% accurate, but they didn't believe that to be a problem.
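To put that 80-90% figure in perspective, here is a rough back-of-the-envelope sketch in Python; the weekly scan volume is an invented number purely for illustration, not anything the city disclosed:

    # Rough sketch of how many bad reads an "80-90% accurate" plate reader
    # produces at scale. scans_per_week is an assumed, illustrative figure.
    scans_per_week = 100_000
    for accuracy in (0.80, 0.90):
        misreads = scans_per_week * (1 - accuracy)
        print(f"{accuracy:.0%} accurate -> ~{misreads:,.0f} bad reads per week")

Even at the optimistic end of the city's own range, a meaningful fraction of every week's reads are wrong, which is why treating each hit as presumptively correct falls apart at scale.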

2. black_+Vl1 2020-06-24 20:41:28
>>jandre+Wi1
I agree that's bad, and license plate readers come with their own set of problems.

But being biased by the skin color of the driver is (AFAIK) not one of them, which is exactly the problem with vision systems applied to humans, at least the ones we've seen deployed so far.

If a system discriminates against a specific population, that's very different from being (indiscriminately) unreliable.
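A toy example of the distinction, with numbers invented purely to illustrate it: two systems can have the same overall error rate while one spreads its errors evenly and the other concentrates them on a single group.

    # Invented, illustrative numbers: same overall error rate, very
    # different distribution of who bears the errors.
    scanned = {"A": 900, "B": 100}        # people scanned per group

    uniform = {"A": 0.10, "B": 0.10}      # unreliable, but indiscriminately so
    biased  = {"A": 0.05, "B": 0.55}      # errors concentrated on group B

    for name, rates in (("uniform", uniform), ("biased", biased)):
        errors = {g: scanned[g] * rates[g] for g in scanned}
        overall = sum(errors.values()) / sum(scanned.values())
        print(name, {g: round(e) for g, e in errors.items()},
              f"overall: {overall:.0%}")

Both print an overall error rate of 10%, but in the second case more than half of group B's scans are wrong, which is the kind of failure an aggregate accuracy number hides.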
