zlacker

[return to "Image Scrubber: tool for anonymizing photographs taken at protests"]
1. Ansil8+vn 2020-05-31 18:06:15
>>dsr12+(OP)
Some tips to maximise user privacy while deploying this tool:

1) The code, for now, runs locally. This is good. To avoid the possibility of the code being tampered with at a later date (for example, it could be modified to send copies of each image to a server), download the webpage and use your saved copy, not the live one.

2) Do not use the blur functionality. For maximum privacy, this should be removed from the app entirely. There are _a lot_ of forensic methods for reversing blur.

3) Be wary of other things in the photograph that might identify someone: reflections, shadows, and so on.

4) Really a subset of 2 and 3, but be aware that blocking out faces is often not sufficient to anonymise the subject of the photo. Identifying marks like tattoos, or even something as basic as the shoes they are wearing, can be used to identify the target.
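To make point 2 concrete: the safe alternative to blurring is overwriting the region with a solid colour, which genuinely destroys the pixel data. A minimal numpy sketch (the function name and coordinate parameters are my own, not the tool's):

```python
import numpy as np

def redact(image, top, left, height, width):
    """Overwrite a rectangular region with solid black.

    Unlike blurring, every pixel in the region is set to a constant,
    so no trace of the original values survives in the output.
    """
    out = image.copy()
    out[top:top + height, left:left + width] = 0
    return out
```

Nothing outside the rectangle is touched, and nothing inside it can be recovered, because the operation is constant-valued rather than a reversible filter.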

2. Nightl+Qn 2020-05-31 18:09:34
>>Ansil8+vn
"There are _a lot_ of forensic methods to reverse blur techniques"

Any examples? You can't reverse it if the data is gone.

3. chriss+2p 2020-05-31 18:19:20
>>Nightl+Qn
> You can't reverse it if the data is gone.

That's the problem - the data you think is gone isn't gone. The high frequencies are gone... but you left all the low frequencies, didn't you? You can read a face from the low frequencies.
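The frequency argument can be checked in a few lines: Gaussian blur is a low-pass filter, so it suppresses high-frequency bins while leaving the low-frequency bins nearly untouched. A toy 1-D sketch (numpy only; the signal, sigma, and bin cutoffs are arbitrary choices of mine):

```python
import numpy as np

# A toy 1-D "image": a slow ramp (low frequencies) plus a few
# sharp spikes (high frequencies).
rng = np.random.default_rng(0)
signal = np.linspace(0, 1, 256) + (rng.random(256) > 0.95).astype(float)

# Gaussian blur = convolution with a normalised Gaussian kernel.
x = np.arange(-8, 9)
kernel = np.exp(-x**2 / (2 * 2.0**2))
kernel /= kernel.sum()
blurred = np.convolve(signal, kernel, mode="same")

# Compare magnitude spectra before and after blurring.
orig_spec = np.abs(np.fft.rfft(signal))
blur_spec = np.abs(np.fft.rfft(blurred))
low = slice(0, 8)       # lowest-frequency bins: survive nearly intact
high = slice(64, 129)   # high-frequency bins: almost entirely gone
print(blur_spec[low].sum() / orig_spec[low].sum())    # close to 1
print(blur_spec[high].sum() / orig_spec[high].sum())  # close to 0
```

The low-frequency ratio stays near 1: that surviving information is exactly what face-recovery techniques exploit.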

4. pbhjpb+TB 2020-05-31 20:01:55
>>chriss+2p
If you blur then mosaic, or vice versa, then presumably you get rid of both the low and high frequencies? Depending on the detail in the original image, either, or both, might remove enough information to render the image anonymised.

How about replacing each face with a "this is not a person" AI-generated face, then blur+mosaic? Or just a non-person face from a deepfake system that matches the facial expression?
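For the curious, the mosaic step mentioned above is just block-averaging. A minimal sketch (the function name, default block size, and the assumption that the image dimensions divide evenly are mine):

```python
import numpy as np

def mosaic(image, block=16):
    """Pixelate an image by replacing each block x block tile
    with that tile's mean value.

    Sketch only: assumes image height and width are multiples
    of `block`. Works for 2-D grayscale or (H, W, C) colour arrays.
    """
    h, w = image.shape[:2]
    out = image.astype(float).copy()
    for y in range(0, h, block):
        for x in range(0, w, block):
            out[y:y + block, x:x + block] = \
                out[y:y + block, x:x + block].mean(axis=(0, 1))
    return out
```

Note that each tile still keeps its mean, so larger `block` values destroy more detail; small blocks leave a recognisably coarse version of the region.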

5. wool_g+OC2 2020-06-01 16:03:18
>>pbhjpb+TB
I would be worried that a generated fake face would be similar enough to the face of a real someone, somewhere, to get that person in trouble. This isn't a crisp portrait photo; a blurry cell-phone video with a lot of activity and noise already leaves an opening for misidentification.