zlacker

[parent] [thread] 1 comments
1. dtwest+(OP)[view] [source] 2020-06-25 14:28:13
My point was that this technology should not be used as evidence, and should not be grounds to take any forceful action against someone. If a cop abuses this, it is the cop's fault and we should hold them accountable. If the cop acted ignorantly because they were lied to by marketers, their boss, or a software company, those parties should be held accountable as well.

If your strategy is to get rid of all pretexts for police action, I don't think that is the right one. Instead we need to set a high standard of conduct and make sure it is upheld. If you don't understand a tool, don't use it. If you do something horrible while using a tool you don't understand, it is negligent/irresponsible/maybe even malevolent, because it was your responsibility to understand it before using it.

A weatherman saying there is a 90% chance of rain is not evidence that it rained. I understand the fear that a prediction can be abused, and we need to guard against that. But abolishing the weatherman isn't the way to do it.
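The weatherman point can be made concrete with a bit of Bayes' rule arithmetic (illustrative numbers only, not drawn from any real system): even a tool that sounds accurate produces mostly false alarms when the thing it's looking for is rare, which is exactly why its output is a lead, not evidence.

```python
# A minimal sketch of why a confident-sounding prediction is weak
# evidence on its own. All numbers are hypothetical assumptions.

def posterior(prior, sensitivity, false_positive_rate):
    """P(actually a match | the system flags you), by Bayes' rule."""
    true_flags = sensitivity * prior            # true matches flagged
    false_flags = false_positive_rate * (1 - prior)  # innocents flagged
    return true_flags / (true_flags + false_flags)

# Assume the true suspect is 1 of 1,000 people scanned, and the tool
# catches 90% of true matches with a 10% false-positive rate.
p = posterior(prior=0.001, sensitivity=0.90, false_positive_rate=0.10)
print(f"{p:.1%}")  # → 0.9%: the overwhelming majority of flags are wrong
```

Under those assumptions, a flag means under a 1% chance the person is actually the match — which is the argument for treating it as a pretext to investigate further, never as grounds for forceful action by itself.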

replies(1): >>danans+i8
2. danans+i8[view] [source] 2020-06-25 15:13:47
>>dtwest+(OP)
> If your strategy is to get rid of all pretexts for police action, I don't think that is the right one.

Not at all.

> Instead we need to set a high standard of conduct and make sure it is upheld

Yes, but we should be realistic about what this means. The institution of law enforcement is rotten, which is why it protects bad actors to such a degree. It needs to be cleaved from its racist history and rebuilt nearly from the ground up. Better training in interpreting results from an ML model won't be enough by a long shot.
