zlacker

[parent] [thread] 2 comments
1. redorb+(OP)[view] [source] 2020-06-25 01:03:45
If it's statistically proven not to work for black people, then I think the only options are

1) Make it avoid black people, i.e. they aren't stored in the database and aren't processed when scanned.

2) Put a 5 year hiatus on commercial / public use.

Either of these options is more acceptable than too many false positives. #1 is really interesting to me as a thought experiment because it makes everyone think twice.

replies(1): >>fatso7+z
2. fatso7+z[view] [source] 2020-06-25 01:07:38
>>redorb+(OP)
The issue is that it's hard to determine who counts as "black" and who doesn't, since race is a fiction constructed by white supremacists that is now so ingrained in American society that we find it difficult, if not impossible, to reflect on it. Are we using the Fitzpatrick skin color scale? Are we continuing the hypodescent rule (an algorithm, btw)?

Maybe we just outlaw face recognition in criminal justice entirely.

replies(1): >>redorb+fb
3. redorb+fb[view] [source] [discussion] 2020-06-25 02:52:05
>>fatso7+z
It definitely shouldn't be allowed as the main evidence. Perhaps the answer to skin tone is to use the data and see where the algorithm starts producing too many false positives.
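A minimal sketch of what that per-group false-positive audit might look like. The group labels (loosely modeled on the Fitzpatrick bands mentioned upthread) and the sample records are entirely hypothetical; a real audit would run over actual matcher output with ground-truth identities.

```python
from collections import defaultdict

def false_positive_rates(records):
    """records: iterable of (group, predicted_match, actual_match) tuples.
    Returns {group: FPR}, where FPR = FP / (FP + TN) computed over that
    group's genuinely non-matching probes."""
    fp = defaultdict(int)  # matcher said "match" but it wasn't
    tn = defaultdict(int)  # matcher correctly said "no match"
    for group, predicted, actual in records:
        if not actual:  # only non-matching probes can yield false positives
            if predicted:
                fp[group] += 1
            else:
                tn[group] += 1
    return {g: fp[g] / (fp[g] + tn[g]) for g in set(fp) | set(tn)}

# Hypothetical audit data: (skin-tone band, matcher verdict, ground truth)
audit = [
    ("I-II", True, False), ("I-II", False, False),
    ("I-II", False, False), ("I-II", False, False),
    ("V-VI", True, False), ("V-VI", True, False),
    ("V-VI", False, False), ("V-VI", False, False),
]
rates = false_positive_rates(audit)
# rates["I-II"] -> 0.25, rates["V-VI"] -> 0.5: the gap between groups is
# the signal that the matcher should not be trusted for the worse-off group.
```

Comparing the per-group rates against a chosen acceptable threshold is then a policy decision, not a modeling one.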

The larger, overarching idea is to get everyone to think twice by making the majority think twice. If white people think "it's only for us!?" it'll make them really study the effects. (I'm white.)
