zlacker

[parent] [thread] 6 comments
1. toofy+(OP)[view] [source] 2020-06-25 03:21:07
> Technically the software isn't killing anyone, irresponsible users of it are.

Sure, but at this point we know how irresponsible users often are; we know this to be an absolute fact. If users' irresponsibility isn't the centerpiece of our conversations, then we're being incredibly irresponsible ourselves.

The material manifestations of how these tools will be used have to remain at the center if researchers place any value whatsoever on our ethical responsibilities.

replies(2): >>jessta+85 >>dzhiur+Z6
2. jessta+85[view] [source] 2020-06-25 04:21:50
>>toofy+(OP)
Yep. There are so many psychology studies showing groupthink: people using statements from an authority as a way to remove individual responsibility, and people overriding their own perceptions to agree with an authority.

"I guess the computer got it wrong" is a terrifying thing for a police officer to say.

3. dzhiur+Z6[view] [source] 2020-06-25 04:44:20
>>toofy+(OP)
Software flies rockets, planes, ships, cars, factories and just about everything else. Yet somehow LE shouldn't be using it because... they are dumb? Everyone else is smart tho.
replies(3): >>toofy+49 >>zumina+3d >>fivre+tj
4. toofy+49[view] [source] [discussion] 2020-06-25 05:09:19
>>dzhiur+Z6
Did you respond to the wrong comment? I don’t believe I implied anything close to what you just said.
5. zumina+3d[view] [source] [discussion] 2020-06-25 06:04:44
>>dzhiur+Z6
If you fly a plane, drive a car or operate a factory, your livelihood and often your life depend on constantly paying attention to the software's output and making course-correcting adjustments when necessary. And the software itself often has the ability to avoid fatal errors built in. You rely on it in a narrow domain because it is highly reliable within that domain. For example, your vehicle's cruise control will generally not suddenly brake and swerve off the road, so you can relax your level of concentration to some extent. If it were only 52% likely to be maintaining your velocity and heading from moment to moment, you wouldn't trust it for a second.
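
A minimal sketch of the arithmetic behind that "52%" point, with hypothetical numbers (the cycle rate and reliabilities below are illustrative, not from the comment): if each control cycle independently succeeds with probability p, the chance of getting through n consecutive cycles is p**n, so per-moment reliability of 0.52 collapses almost instantly while 0.9999 holds up:

    # Hypothetical illustration: survival probability of a control loop
    # that succeeds independently on each cycle with probability p.
    def survival_probability(p: float, n: int) -> float:
        """Probability that n consecutive control cycles all succeed."""
        return p ** n

    # Assume 10 control cycles per second for one minute = 600 cycles.
    for p in (0.52, 0.9999):
        print(f"p={p}: P(no failure in 600 cycles) = {survival_probability(p, 600):.3g}")
    # p=0.52   -> ~1e-170 (failure is effectively certain)
    # p=0.9999 -> ~0.94   (usable, given an attentive human operator)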

Facial recognition software doesn't have the level of reliability that control software for mechanical systems has. And if a mistake is made, the consequences to the LEO have historically been minimal. Shoot first and ask questions later has been deemed acceptable conduct, so why not implicitly trust the software? If it's right and you kill a terrorist, you're a hero. If it's wrong and you kill a civilian, the US Supreme Court has stated, "Where the officer has probable cause to believe that the suspect poses a threat of serious physical harm, either to the officer or to others, it is not constitutionally unreasonable to prevent escape by using deadly force." If the software provides probable cause, the subject's life is thereby forfeit. From the perspective of the officer, it seems a no-brainer.
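
A minimal sketch of why that unreliability matters so much, with hypothetical numbers (none of these figures come from the thread): by Bayes' rule, even a recognizer that is 99% accurate produces mostly false matches when genuine targets are rare in the scanned population:

    # Hypothetical numbers: the base-rate effect on face-match reliability.
    def p_match_is_real(sensitivity: float, false_positive_rate: float,
                        base_rate: float) -> float:
        """P(flagged person is a true target), via Bayes' rule."""
        true_pos = sensitivity * base_rate
        false_pos = false_positive_rate * (1 - base_rate)
        return true_pos / (true_pos + false_pos)

    # Assume 1 genuine target per 10,000 faces scanned.
    p = p_match_is_real(sensitivity=0.99, false_positive_rate=0.01, base_rate=1e-4)
    print(f"P(a match is a real target) = {p:.1%}")  # ~1.0%

So under these assumptions, roughly 99 out of 100 "matches" would point at innocent people, which is exactly the scenario where implicit trust in the software becomes lethal.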

6. fivre+tj[view] [source] [discussion] 2020-06-25 07:18:58
>>dzhiur+Z6
Were you asleep for all coverage of the 737 MAX MCAS, or the technical failures that contributed to multiple warships casually driving into other ships?

https://features.propublica.org/navy-accidents/uss-fitzgeral...

https://features.propublica.org/navy-uss-mccain-crash/navy-i...

Software lets us work very efficiently because it speeds work up. It speeds us up just as well when we're fucking things up.

replies(1): >>dzhiur+Cy
7. dzhiur+Cy[view] [source] [discussion] 2020-06-25 09:40:29
>>fivre+tj
Airbus has been fly-by-wire for something like 5 decades. They did have some issues, but those were solved. So will the 737's be.