zlacker

[parent] [thread] 63 comments
1. danso+(OP)[view] [source] 2020-06-24 14:55:32
This story is really alarming because as described, the police ran a face recognition tool based on a frame of grainy security footage and got a positive hit. Does this tool give any indication of a confidence value? Does it return a list (sorted by confidence) of possible suspects, or any other kind of feedback that would indicate even to a layperson how much uncertainty there is?

The issue of face recognition algorithms performing worse on dark faces is a major problem. But the other side of it is: would police be more hesitant to act on such fuzzy evidence if the top match appeared to be a middle-class Caucasian (i.e. someone who is more likely to take legal recourse)?

replies(8): >>Pxtl+21 >>cacony+82 >>adim86+a4 >>zaroth+65 >>strgcm+w5 >>throwa+eb >>bsenft+tu >>peropo+rL2
2. Pxtl+21[view] [source] 2020-06-24 15:00:13
>>danso+(OP)
Interesting and related, a team made a neat "face depixelizer" that takes a pixelated image and uses machine learning to generate a face that should match the pixelated image.

What's hilarious is that it makes faces that look nothing like the original high-resolution images.

https://twitter.com/Chicken3gg/status/1274314622447820801

replies(5): >>mywitt+s1 >>jacque+52 >>danso+v4 >>nfrmat+K5 >>2038AD+0r2
◧◩
3. mywitt+s1[view] [source] [discussion] 2020-06-24 15:02:17
>>Pxtl+21
I wonder if this is trained on the same, or similar, datasets.
replies(1): >>jcims+sy1
◧◩
4. jacque+52[view] [source] [discussion] 2020-06-24 15:05:08
>>Pxtl+21
That should be called a face generator, not a depixelizer.
replies(1): >>Polyla+3H1
5. cacony+82[view] [source] 2020-06-24 15:05:18
>>danso+(OP)
People are not good at understanding uncertainty and its implications, even if you put it front and center. I used to work in renewable energy consulting and I was shocked by how aggressively uncertainty estimates are ignored by those whose goals they threaten.

In this case, it's incumbent on the software vendors to ensure that less-than-certain results aren't even shown to the user. American police can't generally be trusted to understand nuance and/or do the right thing.

6. adim86+a4[view] [source] 2020-06-24 15:12:53
>>danso+(OP)
I blame TV shows like CSI and all the other crap out there that make pixelated images look like something you could "Zoom" into, or something the computer can still understand even if the eye does not. Because of this, non-tech people do not really understand that pixelated images have LOST information. Add that to the racial situation in the U.S. and the inaccuracy of the tool for black people. Wow, this can lead to some really troublesome results.
replies(1): >>fickle+Vw1
◧◩
7. danso+v4[view] [source] [discussion] 2020-06-24 15:14:38
>>Pxtl+21
What's sad is that a tech entrepreneur will definitely add that feature and sell it to law enforcement agencies that believe in CSI magic: https://www.youtube.com/watch?v=Vxq9yj2pVWk
replies(1): >>barrke+df
8. zaroth+65[view] [source] 2020-06-24 15:17:03
>>danso+(OP)
> Does this tool give any indication of a confidence value?

Yes.

> Does it return a list (sorted by confidence) of possible suspects,

Yes.

> ... or any other kind of feedback that would indicate even to a layperson how much uncertainty there is?

Yes, it does. It also states, in a large-print heading, “THIS DOCUMENT IS NOT A POSITIVE IDENTIFICATION IT IS AN INVESTIGATIVE LEAD AND IS NOT PROBABLE CAUSE TO ARREST”.

You can see a picture of this in the ACLU article.

The police bungled this badly by setting up a fake photo lineup with the loss prevention clerk who submitted the report (who had only ever seen the same footage they had).

However, tools that are ripe for misuse do not get a pass because they include a bold disclaimer. If the tool/process cannot prevent misuse, the tool/process is broken and possibly dangerous.

That said, we have little data on how often the tool results in catching dangerous criminals versus how often it misidentifies innocent people. We have little data on whether those innocent people tend to skew toward a particular demographic.

But I have a fair suspicion that dragnet techniques like this can, unfortunately, be both effective and problematic.

replies(1): >>danso+xh
9. strgcm+w5[view] [source] 2020-06-24 15:18:43
>>danso+(OP)
I think the NYT article has a little more detail: https://www.nytimes.com/2020/06/24/technology/facial-recogni...

Essentially, an employee of the facial recognition provider forwarded an "investigative lead" for the match they generated (which does have a score associated with it on the provider's side, but it's not clear if the score is clearly communicated to detectives as well), and the detectives then put the photo of this man into a "6 pack" photo line-up, from which a store employee then identified that man as being the suspect.

Everyone involved will probably point fingers at each other: the provider, for example, put a large heading on their communication saying "this is not probable cause for an arrest, this is only an investigative lead, etc."; the detectives will say "well, we got a hit from a line-up" and blame the witness; and the witness would probably say "well, the detectives showed me a line-up and he seemed like the right guy" (or, as is often the case with line-ups, the detectives may have exerted a huge amount of bias/influence over the witness).

EDIT: Just to be clear, none of this is to say that the process worked well or that I condone this. I think the data, the technology, the processes, and the level of understanding on the side of the police are all insufficient, and I do not support how this played out, but I think it is easy enough to provide at least some pseudo-justification at each step along the way.

replies(10): >>danso+p9 >>treis+ma >>ed2551+9b >>Burnin+3w >>gridlo+FF >>jhaywa+Lp1 >>aussie+OY1 >>classi+l02 >>bryanr+lc2 >>cavana+rf2
◧◩
10. nfrmat+K5[view] [source] [discussion] 2020-06-24 15:19:40
>>Pxtl+21
Interesting... Neat... Hilarious... In light of the submission and the comment you're responding to, these are not the words I would choose.

I think there's genuine cause for concern here, especially if technologies like these are candidates for inclusion in any real law enforcement decision-making.

◧◩
11. danso+p9[view] [source] [discussion] 2020-06-24 15:34:35
>>strgcm+w5
That's interesting. In many ways, it's similar to the "traditional" process I went through when reporting a robbery to the NYPD 5+ years ago: they had software where they could search for mugshots of all previously convicted felons living in an x-mile radius of the crime scene, filtered by the physical characteristics I described. Whether or not the actual suspect's face was found by the software, it was ultimately too slow and clunky to paginate through hundreds of results.

Presumably, the facial recognition software would provide an additional filter/sort. But at least in my situation, I could actually see how big the total pool of potential matches was, and thus have a sense of the uncertainty around false positives, even if I were completely ignorant about the impact of false negatives (i.e. what if my suspect didn't live within x miles of the scene, or wasn't a known/convicted felon?)

So the caution re: face recognition software is how it may non-transparently add confidence to this already very imperfect filtering process.

(in my case, the suspect was eventually found because he had committed a number of robberies, including being clearly caught on camera, in an area and pattern that made it easy to narrow down where he operated)

◧◩
12. treis+ma[view] [source] [discussion] 2020-06-24 15:37:40
>>strgcm+w5
I'm becoming increasingly frustrated with the difficulty of accessing primary source material. Why don't any of these outlets post the surveillance video and let us decide for ourselves how much of a resemblance there is?
replies(3): >>teduna+Oh >>njharm+NO >>BEEdwa+mT1
◧◩
13. ed2551+9b[view] [source] [discussion] 2020-06-24 15:39:40
>>strgcm+w5
> and the detectives then put the photo of this man into a "6 pack" photo line-up, from which a store employee then identified that man as being the suspect.

This is absurdly dangerous. The AI will find people who look like the suspect; that's how the technology works. A lineup as evidence will almost guarantee a bad outcome, because of course the man looks like the suspect!

replies(2): >>barkin+fd >>kevin_+qv
14. throwa+eb[view] [source] 2020-06-24 15:40:18
>>danso+(OP)
> But the other side of it is: would police be more hesitant to act on such fuzzy evidence if the top match appeared to be a middle-class Caucasian (i.e. someone who is more likely to take legal recourse)?

Honest question: does race predict legal recourse when decoupled from socioeconomic status, or is this an assumption?

replies(3): >>advise+Ic >>SkyBel+bh >>danans+Pc2
◧◩
15. advise+Ic[view] [source] [discussion] 2020-06-24 15:46:36
>>throwa+eb
Race and socioeconomic status are deeply intertwined. Or to be more blunt - US society has kept black people poorer. To treat them as independent variables is to ignore the whole history of race in the US.
replies(1): >>throwa+do
◧◩◪
16. barkin+fd[view] [source] [discussion] 2020-06-24 15:48:58
>>ed2551+9b
I'm also half guessing that the "lineup" was 5 White people and a photo of the victim.
◧◩◪
17. barrke+df[view] [source] [discussion] 2020-06-24 15:56:35
>>danso+v4
And another entrepreneur can add a feature to generate 10 different faces which match the same pixelation, and sell it to the defence.
replies(2): >>emilio+5I >>heavys+9I
◧◩
18. SkyBel+bh[view] [source] [discussion] 2020-06-24 16:03:36
>>throwa+eb
>Honest question: does race predict legal recourse when decoupled from socioeconomic status, or is this an assumption?

I think the issue is that regardless of the answer, it isn't decoupled in real world scenarios.

I think the solution isn't dependent upon race either. It is to ensure everyone has access to legal recourse regardless of socioeconomic status. This would have the side effect of benefiting races correlated with lower socioeconomic status more.

replies(1): >>throwa+vp
◧◩
19. danso+xh[view] [source] [discussion] 2020-06-24 16:05:04
>>zaroth+65
I think the software would be potentially less problematic if the victim/witness were given access, and (ostensibly) could see the pool of matches and how much/little the top likely match differed from the less confident matches.

> The police bungled this badly by setting up a fake photo lineup...

FWIW, this process is similar to traditional police lineups. The witness is shown 4-6 people – one who is the actual suspect, and several who vaguely match a description of the suspect. When I was asked to identify a suspect in my robbery, the lineup included an assistant attorney who would later end up prosecuting the case. The police had to go out and find tall light-skinned men to round out the lineup.

> ... with the loss prevention clerk who submitted the report (who had only ever seen the same footage they had).

Yeah, I would hope that this is not standard process. The lineup process is already flawed as it is, even with a witness who at least saw the crime first-hand.

◧◩◪
20. teduna+Oh[view] [source] [discussion] 2020-06-24 16:06:00
>>treis+ma
Do they have it? Police haven't always been forthcoming in publishing their evidence.
replies(1): >>treis+vq
◧◩◪
21. throwa+do[view] [source] [discussion] 2020-06-24 16:28:25
>>advise+Ic
> To treat them as independent variables is to ignore the whole history of race in the US.

Presumably the coupling of the variables is not binary (dependent or independent) but variable (degrees of coupling). Presumably these variables were more tightly coupled in the past than in the present. Presumably it's useful to understand precisely how coupled these variables are today because it would drive our approach to addressing these disparities. E.g., if the variables are loosely coupled then bias-reducing programs would have a marginal impact on the disparities and the better investment would be social welfare programs (and the inverse is true if the variables are tightly coupled).

◧◩◪
22. throwa+vp[view] [source] [discussion] 2020-06-24 16:32:24
>>SkyBel+bh
> I think the issue is that regardless of the answer, it isn't decoupled in real world scenarios.

Did you think I was asking about non-real-world scenarios? And how do we know that it's coupled (or rather, the degree to which it's coupled) in real world scenarios?

> I think the solution isn't dependent upon race either. It is to ensure everyone has access to legal recourse regardless of socioeconomic status. This would have the side effect of benefiting races correlated with lower socioeconomic status more.

This makes sense to me, although I don't know what this looks like in practice.

replies(1): >>jacobu+kb6
◧◩◪◨
23. treis+vq[view] [source] [discussion] 2020-06-24 16:35:03
>>teduna+Oh
If they don't, how are they describing the quality of the video and the clear lack of resemblance?
replies(1): >>danso+Rw
24. bsenft+tu[view] [source] 2020-06-24 16:46:18
>>danso+(OP)
> The issue of face recognition algorithms performing worse on dark faces is a major problem.

This needs to be coupled with the truth that people (police) without diverse racial exposure are terrible at identifying people outside of their ethnicity. In the photo/text article they show the top of the "Investigative Lead Report" as an image. You mean to say that every cop who saw the two images side by side did not stop and say "hey, these are not the same person!"? They did not, and that's because their own brains could not see the difference.

This is a major reason police forces need to be ethnically diverse. That alone lets members of the force who never grew up or spent time outside their own ethnicity learn to tell apart a diverse range of similar but different people outside their ethnicity.

◧◩◪
25. kevin_+qv[view] [source] [discussion] 2020-06-24 16:50:12
>>ed2551+9b
The worst part is that the employee wasn't a witness to anything. He was making the "ID" from the same video the police had.
◧◩
26. Burnin+3w[view] [source] [discussion] 2020-06-24 16:52:32
>>strgcm+w5
I can see why you'd only get 6 guys together for a physical "6 pack" line-up.

But for a photo lineup, I can't imagine why you don't have at least 25 photos to pick from.

replies(1): >>wtvanh+gC1
◧◩◪◨⬒
27. danso+Rw[view] [source] [discussion] 2020-06-24 16:54:58
>>treis+vq
I don't know what passage you're describing, but this one is implied to be part of a narrative that is told from the perspective of Mr. Williams, i.e. he's the one who remembers "The photo was blurry, but it was clearly not Mr. Williams"

> The detective turned over the first piece of paper. It was a still image from a surveillance video, showing a heavyset man, dressed in black and wearing a red St. Louis Cardinals cap, standing in front of a watch display. Five timepieces, worth $3,800, were shoplifted.

> “Is this you?” asked the detective.

> The second piece of paper was a close-up. The photo was blurry, but it was clearly not Mr. Williams. He picked up the image and held it next to his face.

All the preceding grafs are told in the context of "this what Mr. Williams said happened", most explicitly this one:

> “When’s the last time you went to a Shinola store?” one of the detectives asked, in Mr. Williams’s recollection.

According to the ACLU complaint, the DPD and prosecutor have refused FOIA requests regarding the case:

https://www.aclu.org/letter/aclu-michigan-complaint-re-use-f...

> Yet DPD has failed entirely to respond to Mr. Williams’ FOIA request. The Wayne County Prosecutor also has not provided documents.

replies(2): >>treis+Qz >>mgleas+KK1
◧◩◪◨⬒⬓
28. treis+Qz[view] [source] [discussion] 2020-06-24 17:07:31
>>danso+Rw
Maybe it's just me, but "we just took his word for it" doesn't strike me as particularly good journalism if that's what happened. If they really wrote these articles without that level of basic corroboration then that's pretty bad.
replies(1): >>danso+4F
◧◩◪◨⬒⬓⬔
29. danso+4F[view] [source] [discussion] 2020-06-24 17:30:53
>>treis+Qz
It's a common technique in journalism to describe and attribute someone's recollection of events in a series of narrative paragraphs. It does not imply "we just took his word for it", though it does imply that the reporter finds his account to be credible enough to be given some prominent space.

This arrest happened 6 months ago. Who else besides the suspect and the police do you believe reporters should ask for "basic corroboration" of events that took place inside a police station? Or do you think this story shouldn't be reported on at all until the police agree to give additional info?

replies(1): >>phendr+Ec1
◧◩
30. gridlo+FF[view] [source] [discussion] 2020-06-24 17:33:18
>>strgcm+w5
> Essentially, an employee of the facial recognition provider forwarded an "investigative lead" for the match they generated (which does have a score associated with it on the provider's side, but it's not clear if the score is clearly communicated to detectives as well)

This is the lead provided:

https://wfdd-live.s3.amazonaws.com/styles/story-full/s3/imag...

Note that it says in red and bold emphasis:

THIS DOCUMENT IS NOT A POSITIVE IDENTIFICATION. IT IS AN INVESTIGATIVE LEAD ONLY AND IS NOT PROBABLE CAUSE TO ARREST. FURTHER INVESTIGATION IS NEEDED TO DEVELOP PROBABLE CAUSE TO ARREST.

replies(1): >>throwa+eI
◧◩◪◨
31. emilio+5I[view] [source] [discussion] 2020-06-24 17:42:42
>>barrke+df
A better strategy might be to pixelate a photo of each member of the jury, then de-pixelate it through the same service, and distribute the before and after. Maybe include the judge and prosecutor.
◧◩◪◨
32. heavys+9I[view] [source] [discussion] 2020-06-24 17:42:49
>>barrke+df
Doubt that many people can afford to hire an expert witness, or hire someone to develop bespoke software for their trial.
◧◩◪
33. throwa+eI[view] [source] [discussion] 2020-06-24 17:43:00
>>gridlo+FF
Dear god the input image they used to generate that is TERRIBLE! It could be damn near any black male.

The real negligence here is whoever tuned the software to spit out a result for that quality of image rather than a "not enough data, too many matches, please submit a better image" error.

replies(2): >>mindsl+RN >>treis+Jj1
◧◩◪◨
34. mindsl+RN[view] [source] [discussion] 2020-06-24 18:07:01
>>throwa+eI
I'm not even sure that's definitely a black man, rather than just any person with some kind of visor or mask. There does seem to be a face in the noise, but human brains are primed to see face shapes.

The deeper reform that needs to happen here is that every person falsely arrested and/or prosecuted needs to be automatically compensated for their time wasted and other harm suffered. Only then will police departments have some incentive for restraint. Currently we have a perverse reverse lottery where if you're unlucky you just lose a day/month/year of your life. With the state of what we're actually protesting I'm not holding my breath (e.g. the privileged criminals who committed the first-degree murder of Breonna Taylor still have yet to be charged), but it's still worth calling out the smaller injustices that the criminal "justice" system inflicts.

replies(2): >>alasda+Xn1 >>seekup+t82
◧◩◪
35. njharm+NO[view] [source] [discussion] 2020-06-24 18:10:52
>>treis+ma
Because they're not in the business of providing information, transparency or journalism.

They are in the business of exposing you to as many paid ads as possible. And they believe providing outgoing links reduces their ability to do that.

replies(1): >>alasda+hn1
◧◩◪◨⬒⬓⬔⧯
36. phendr+Ec1[view] [source] [discussion] 2020-06-24 20:00:32
>>danso+4F
It should at least be very clear at the paragraph level what is established fact and what is speculation/opinion.
replies(1): >>lefsta+WF1
◧◩◪◨
37. treis+Jj1[view] [source] [discussion] 2020-06-24 20:40:20
>>throwa+eI
You're also looking at a scan of a small printout with poor contrast and brightness. What the computer is seeing is probably the full-resolution image, brightened to show the face, with enhanced contrast and a lot more detail.
◧◩◪◨
38. alasda+hn1[view] [source] [discussion] 2020-06-24 20:58:51
>>njharm+NO
>They are in the business of exposing you to as many paid ads as possible.

NPR is a non-profit that is mostly funded by donations. They only have minimal paid ads on their website to pay for running costs - they could easily optimize the news pages to increase ad revenue but they don't because it would get in the way of their goals.

◧◩◪◨⬒
39. alasda+Xn1[view] [source] [discussion] 2020-06-24 21:03:04
>>mindsl+RN
>The deeper reform that needs to happen here is that every person falsely arrested and/or prosecuted needs to be automatically compensated for their time wasted and other harm suffered.

I agree here, but doing that may lead to the prosecutors trying extra hard to find something to charge a person with after they are arrested, even if it was something trivial that would often go un-prosecuted.

Getting the details right seems tough, but doable.

◧◩
40. jhaywa+Lp1[view] [source] [discussion] 2020-06-24 21:14:31
>>strgcm+w5
> the detectives then put the photo of this man into a "6 pack" photo line-up, from which a store employee then identified that man

This is not correct. The "6-pack" was shown to a security firm's employee, who had viewed the store camera's tape.

"In this case, however, according to the Detroit police report, investigators simply included Mr. Williams’s picture in a “6-pack photo lineup” they created and showed to Ms. Johnston, Shinola’s loss-prevention contractor, and she identified him." [1]

[1] ibid.

◧◩
41. fickle+Vw1[view] [source] [discussion] 2020-06-24 22:05:44
>>adim86+a4
I lose hours every day just yelling "enhance" at my computer screen. Hasn't worked yet, but any day now...
replies(1): >>auggie+Kb2
◧◩◪
42. jcims+sy1[view] [source] [discussion] 2020-06-24 22:18:01
>>mywitt+s1
One of the underlying models, PULSE, was trained on CelebAHQ, which is likely why the results are mostly white-looking. StyleGAN, which was trained on the much more diverse (but sparse) FFHQ dataset, does come up with a much more diverse set of faces[1]...but PULSE couldn't get them to converge very closely on the pixelated subjects...so they went with CelebA [2].

[1] https://github.com/NVlabs/stylegan [2] https://arxiv.org/pdf/2003.03808.pdf (ctrl+f ffhq)
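
For reference, the core trick behind these depixelizers (PULSE) is a latent-space search: optimize a latent vector until the generator's output, once downscaled, matches the pixelated input. Below is a minimal, purely illustrative sketch of that idea; the toy network stands in for the pretrained GAN, the helper names (generate, depixelize) are made up, and the real method adds further constraints on the latent. The key point is that the loss only ties the output to the low-res image, which is why the "recovered" face can look nothing like the real person.

  import torch
  import torch.nn.functional as F

  # Toy stand-in for a pretrained face generator (the real thing is a GAN such
  # as StyleGAN mapping a 512-d latent to a high-resolution face). Only here so
  # the sketch runs end to end.
  generator = torch.nn.Sequential(torch.nn.Linear(512, 3 * 64 * 64), torch.nn.Tanh())

  def generate(z):
      return generator(z).view(1, 3, 64, 64)

  def depixelize(low_res, scale=4, steps=200, lr=0.1):
      # Search latent space for a face whose *downscaled* version matches the
      # pixelated input. Nothing ties the full-resolution output to the real
      # person -- it is whatever the generator's training data makes plausible.
      z = torch.randn(1, 512, requires_grad=True)
      opt = torch.optim.Adam([z], lr=lr)
      for _ in range(steps):
          opt.zero_grad()
          face = generate(z)
          downscaled = F.avg_pool2d(face, scale)   # simulate pixelation
          loss = F.mse_loss(downscaled, low_res)   # match the low-res input only
          loss.backward()
          opt.step()
      return generate(z).detach()

  low_res = torch.rand(1, 3, 16, 16) * 2 - 1       # fake 16x16 "pixelated" input
  print(depixelize(low_res).shape)                 # torch.Size([1, 3, 64, 64])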

◧◩◪
43. wtvanh+gC1[view] [source] [discussion] 2020-06-24 22:48:49
>>Burnin+3w
Excellent point. In fact, the entire process of showing the witness the photos should be recorded, and double-blind, i.e. the officer showing the photos should not know anything about the lineup.
◧◩◪◨⬒⬓⬔⧯▣
44. lefsta+WF1[view] [source] [discussion] 2020-06-24 23:18:01
>>phendr+Ec1
Well, it was “according to someone familiar with the matter”
◧◩◪
45. Polyla+3H1[view] [source] [discussion] 2020-06-24 23:27:00
>>jacque+52
Basically. The faces look plausible but are less useful than the original blurred image.
◧◩◪◨⬒⬓
46. mgleas+KK1[view] [source] [discussion] 2020-06-24 23:56:42
>>danso+Rw
>> I don't know what passage you're describing,

The 4th sentence says: "Detectives zoomed in on the grainy footage..."

◧◩◪
47. BEEdwa+mT1[view] [source] [discussion] 2020-06-25 01:14:41
>>treis+ma
Even if the guy was an exact facial match, that doesn't justify the complete lack of basic police work to establish it was him.
replies(1): >>czbond+F82
◧◩
48. aussie+OY1[view] [source] [discussion] 2020-06-25 02:08:09
>>strgcm+w5
Just a tip in case it happens to anyone - Never, ever agree to be in a lineup.
◧◩
49. classi+l02[view] [source] [discussion] 2020-06-25 02:21:59
>>strgcm+w5
This is why you should be scared of this tech. It's a computer-assisted patsy finder. No need to find the right guy when the AI will happily cough up 20 people nearby who kinda sorta look like the perp, enough to stuff them into a lineup in front of a confused and highly fallible witness.
replies(1): >>ntspln+T22
◧◩◪
50. ntspln+T22[view] [source] [discussion] 2020-06-25 02:51:08
>>classi+l02
Yep, the potential for abuse here is insane.
◧◩◪◨⬒
51. seekup+t82[view] [source] [discussion] 2020-06-25 03:52:38
>>mindsl+RN
>Currently we have a perverse reverse lottery where if you're unlucky you just lose a day/month/year of your life

that's what happens if you're lucky

◧◩◪◨
52. czbond+F82[view] [source] [discussion] 2020-06-25 03:53:59
>>BEEdwa+mT1
Absolutely agree - and the consequences to a private citizen of that lack of basic police work can be negative and long-lasting.
◧◩◪
53. auggie+Kb2[view] [source] [discussion] 2020-06-25 04:32:19
>>fickle+Vw1
Bladerunner
◧◩
54. bryanr+lc2[view] [source] [discussion] 2020-06-25 04:40:53
>>strgcm+w5
>into a "6 pack" photo line-up

How did the people in the 6 pack photo line-up match up against the facial recognition? Were they likely matches?

replies(2): >>cavana+vf2 >>MertsA+mq2
◧◩
55. danans+Pc2[view] [source] [discussion] 2020-06-25 04:45:46
>>throwa+eb
Middle-class black people often get harassed by police, and there is a long history of far steeper sentences for convictions involving the drug used more by the black population (crack) than for the one used more by the white population (cocaine).

So unequal treatment based on race has quite literally been a feature of the US justice system, independent of socioeconomic status.

replies(1): >>throwa+Ih3
◧◩
56. cavana+rf2[view] [source] [discussion] 2020-06-25 05:18:19
>>strgcm+w5
It wasn't just that the employee picked the man out of a 6-pack; the employee they interviewed wasn't even a witness to the crime in the first place.
◧◩◪
57. cavana+vf2[view] [source] [discussion] 2020-06-25 05:19:19
>>bryanr+lc2
Even worse, the employee who was asked to pick him out of a line up hadn't even witnessed the crime in the first place.
◧◩◪
58. MertsA+mq2[view] [source] [discussion] 2020-06-25 07:30:48
>>bryanr+lc2
No clue about the likelihood of police using similar facial recognition matches for the rest, but normally the alternates need to be around the same height, build, and complexion as the subject. I would think including multiple potential matches would be a huge no-no simply because your alternates need to be people who you know are not a match. If you just grab the 6 most similar faces and ask the victim to choose, what do you do when they pick the third closest match?
replies(1): >>bryanr+1s2
◧◩
59. 2038AD+0r2[view] [source] [discussion] 2020-06-25 07:37:39
>>Pxtl+21
Ironically, if the police had used and followed the face depixelizer, we might not have had the false arrest of a black man - not because of accuracy, but because it doesn't produce many black faces.
◧◩◪◨
60. bryanr+1s2[view] [source] [discussion] 2020-06-25 07:48:23
>>MertsA+mq2
Well, you may know that some people are not a match because you know where they were; for example, the pictures could be of people who were incarcerated at the time of the crime.
61. peropo+rL2[view] [source] 2020-06-25 10:51:52
>>danso+(OP)
It wouldn't make it into the newspapers, so it doesn't matter.
◧◩◪
62. throwa+Ih3[view] [source] [discussion] 2020-06-25 14:29:55
>>danans+Pc2
I’m aware, but that doesn’t answer my question about access to legal recourse.
replies(1): >>danans+E44
◧◩◪◨
63. danans+E44[view] [source] [discussion] 2020-06-25 18:56:55
>>throwa+Ih3
Once you are convicted, and are subject to one of the disproportionate sentences often given to black people, nothing short of a major change to how sentencing law works can provide legal recourse. See: https://www.sentencingproject.org/issues/racial-disparity/

If you survive violence at the hands of law enforcement and are not convicted of a crime, or if you don't and your family wants to hold law enforcement accountable, then the first option is to ask the local public prosecutor to pursue criminal charges against your attackers.

Depending on where you live, that could be a challenge, given the amount of institutional racial bias in the justice system and how closely prosecutors tend to work with police departments. After all, if prosecutors were going after police brutality cases aggressively, there likely wouldn't be as much of a problem as there is.

If that's fruitless, you would need to seek the help of a civil rights attorney to push your case in the legal system and/or the media. This is where a lot of higher-profile cases like this end up - and often only because they were recorded on video.

◧◩◪◨
64. jacobu+kb6[view] [source] [discussion] 2020-06-26 14:24:37
>>throwa+vp
Like social democracy