zlacker

[parent] [thread] 5 comments
1. mytail+(OP)[view] [source] 2025-08-13 14:39:12
What happens when a police constable thinks they recognise you from evidence they have in an investigation or a wanted person notice?

This is nothing new. It is all about what is reasonable in the circumstances.

replies(2): >>cmcale+a5 >>southe+na
2. cmcale+a5[view] [source] 2025-08-13 15:01:20
>>mytail+(OP)
A constable is not going to be scanning the faces of everyone going to Wembley in one night. Even 100 constables watching people enter Wembley are not going to scan everyone and recognise someone they know from a wanted poster (of maybe a couple of hundred faces in their head).

The Met have already lied about the scale of false positives[0] by nearly 1000x, and it's not obvious how much better it will get. With the current tech, the rate will only get worse as more faces are looked for: if it's searching for (I'm guessing) a thousand high-risk targets now at a rate of 1/40, then as more and more faces get added the problem compounds, because the risk of feature collisions rises.
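To make the collision point concrete, here's a back-of-the-envelope sketch (my own, not from the article): assuming each comparison against a watchlist face has some small independent false-match probability p (the figure below is made up for illustration), the chance that a given passer-by falsely matches *someone* grows with the watchlist size N.

```python
# Illustrative only: how the chance of a false alert grows with
# watchlist size n, assuming a hypothetical per-comparison false
# match rate p and independent comparisons.
def false_alert_prob(p: float, n: int) -> float:
    """P(at least one of n watchlist faces falsely matches)."""
    return 1 - (1 - p) ** n

p = 1e-5  # made-up per-comparison false match rate
for n in (1_000, 10_000, 100_000):
    print(f"watchlist of {n}: false-alert probability {false_alert_prob(p, n):.4f}")
```

Under these toy numbers, going from a thousand targets to a hundred thousand takes the per-person false-alert chance from well under 1% to well over half.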

Of course, it'll also disproportionately affect ethnic groups who are more represented in this database too, making life for honest members of those groups more difficult than it already is.

The scale is what makes it different. The lack of accountability for the tech and the false confidence it gives police is what makes it different.

[0]: Met's claim was 1/33,000 false positives, actual 1/40 according to this article from last year https://www.bbc.com/news/technology-69055945

replies(1): >>mytail+Kb
3. southe+na[view] [source] 2025-08-13 15:25:09
>>mytail+(OP)
Again worth mentioning something I've said in other comments, because it's enormously obvious: there's a massive difference between unluckily being misidentified by some random copper who needs his memory or eyesight checked, and the rate of false positives that's nearly guaranteed from a mass digital facial-recognition surveillance system working around the clock, categorizing millions of faces all over the country. The first is a bit of bad luck; the second will likely become pervasive and systemic, and lead to assorted other shit consequences for the many people being cross-checked and categorized in all kinds of insidious ways.
replies(1): >>mytail+ic
4. mytail+Kb[view] [source] [discussion] 2025-08-13 15:31:27
>>cmcale+a5
> [0]: Met's claim was 1/33,000 false positives, actual 1/40 according to this article from last year https://www.bbc.com/news/technology-69055945

The article does not claim this:

"The Metropolitan Police say that around one in every 33,000 people who walk by its cameras is misidentified.

But the error count is much higher once someone is actually flagged. One in 40 alerts so far this year has been a false positive"

These are two different metrics measuring two different things, so both can be correct at the same time. But I must say I'm not clear what each of them exactly means.
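One way the two figures can both hold (my own back-of-the-envelope arithmetic, not anything the article states): 1/33,000 counts false positives per person *scanned*, while 1/40 counts false positives per *alert*, so they're consistent if only a small fraction of passers-by ever trigger an alert.

```python
# Reconciling the two reported rates (rough arithmetic, my reading).
per_person_fp = 1 / 33_000   # false positives per person scanned (Met's figure)
per_alert_fp = 1 / 40        # false positives per alert (article's figure)

# Implied alert rate: alerts raised per person scanned.
alert_rate = per_person_fp / per_alert_fp
print(f"implied: about 1 alert per {round(1 / alert_rate)} people scanned")
```

So both figures hold together if roughly 1 in 825 passers-by triggers an alert of any kind.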

5. mytail+ic[view] [source] [discussion] 2025-08-13 15:33:31
>>southe+na
You raise a good point: if the system wrongly IDs you once, you're probably liable to be flagged every time you walk past one of those vans...
replies(1): >>southe+ns
6. southe+ns[view] [source] [discussion] 2025-08-13 16:49:59
>>mytail+ic
I think it's almost inevitable. The very nature of the bureaucratic procedures that grow up around these sorts of flag lists is that effort tends to accumulate at those points, right or wrong, and your being listed on them becomes almost self-reinforcing: bureaucratic inertia and over-caution, mixed with laziness about checking whether their own systems are wrong and fixing the problem.