zlacker

[return to "Image Scrubber: tool for anonymizing photographs taken at protests"]
1. Ansil8+vn[view] [source] 2020-05-31 18:06:15
>>dsr12+(OP)
Some tips to maximise user privacy while deploying this tool:

1) The code, for now, runs locally. This is good. To avoid the possibility of the code being tampered with at a later date (for example, it could be modified to send copies of the image to a server), download the webpage and use the saved copy, not the live one.

2) Do not use the blur functionality. For maximum privacy, this should be removed from the app entirely. There are _a lot_ of forensic methods to reverse blur techniques.

3) Be wary of other things in the photograph that might identify someone: reflections, shadows, and so on.

4) Really a subset of 2 and 3, but be aware that blocking out faces is often not sufficient to anonymise the subject of the photo. Identifying marks like tattoos, or even something as basic as the shoes they are wearing, can be used to identify the target.
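To illustrate the point in 2: a solid fill destroys the pixel data outright, while a blur only attenuates it. A minimal numpy sketch (not the tool's actual code, just an assumed grayscale array):

```python
import numpy as np

def redact(image, y0, y1, x0, x1):
    """Overwrite a rectangular region with solid black.

    Unlike a blur, which only attenuates pixel values, this
    destroys them: nothing remains to deconvolve."""
    out = image.copy()
    out[y0:y1, x0:x1] = 0
    return out

# Hypothetical 8x8 grayscale "photo"
img = np.arange(64, dtype=np.uint8).reshape(8, 8)
safe = redact(img, 2, 6, 2, 6)

assert np.all(safe[2:6, 2:6] == 0)        # region is uniformly black
assert np.array_equal(safe[:2], img[:2])  # pixels outside are untouched
```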

2. Nightl+Qn[view] [source] 2020-05-31 18:09:34
>>Ansil8+vn
"There are _a lot_ of forensic methods to reverse blur techniques"

Any examples? You can't reverse it if the data is gone.

3. forgot+lo[view] [source] 2020-05-31 18:14:05
>>Nightl+Qn
The data may still be there, it just looks like it's gone.
4. okamiu+ep[view] [source] 2020-05-31 18:20:49
>>forgot+lo
Blur is in effect a low-pass filter on the image: the high-frequency information is gone. Reconstruction based on domain knowledge (AI methods etc.) is unlikely to recover the distinguishing features between people well enough to avoid false positives when used to search for similar people.

Then again, maybe groups of people can be associated together, and a poor match is good enough given other clues.

So, much better to be safe than sorry.

I'm not sure if I had a particularly good point to make, other than that blurring does remove information that cannot easily be reversed. You can probably make very convincing reconstructions, but they might not look like the original person.
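To make the low-pass point concrete, a rough numpy sketch on a 1-D signal (a stand-in for one image row; the frequencies and sigma are arbitrary): a Gaussian blur passes the low-frequency component nearly untouched and all but erases the high-frequency one.

```python
import numpy as np

n = 256
x = np.arange(n) / n
# One low-frequency and one high-frequency component
signal = np.sin(2 * np.pi * 4 * x) + np.sin(2 * np.pi * 60 * x)

# Normalised Gaussian blur kernel, applied as circular convolution via FFT
sigma = 5.0
k = np.exp(-0.5 * (np.arange(n) - n // 2) ** 2 / sigma**2)
k /= k.sum()
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(np.fft.ifftshift(k))))

spec_orig = np.abs(np.fft.rfft(signal))
spec_blur = np.abs(np.fft.rfft(blurred))
print(spec_blur[4] / spec_orig[4])    # ~0.89: low frequency mostly survives
print(spec_blur[60] / spec_orig[60])  # ~1e-12: high frequency is gone
```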

5. thr0wa+ix[view] [source] 2020-05-31 19:25:39
>>okamiu+ep
Blur deconvolution is not exactly a new method. Examples of reconstruction from blurred images are easy to find. Eg, https://www.instantfundas.com/2012/10/how-to-unblur-out-of-f...
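As a toy illustration, the classic inverse filter in numpy, assuming the blur kernel is exactly known and the signal is noise-free (real photos need regularised methods like Wiener deconvolution):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
orig = rng.standard_normal(n)  # stand-in for one row of an image

# Blur with a known Gaussian kernel (circular convolution via FFT)
sigma = 1.5
k = np.exp(-0.5 * (np.arange(n) - n // 2) ** 2 / sigma**2)
k /= k.sum()
K = np.fft.fft(np.fft.ifftshift(k))
blurred = np.real(np.fft.ifft(np.fft.fft(orig) * K))

# Inverse filter: divide the spectrum by the kernel's response.
# This only works because K is nonzero everywhere and there is
# no noise; with real photos you need regularisation (Wiener etc.).
restored = np.real(np.fft.ifft(np.fft.fft(blurred) / K))
assert np.max(np.abs(restored - orig)) < 1e-6
```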
6. okamiu+uD2[view] [source] 2020-06-01 16:06:58
>>thr0wa+ix
I didn't say de-blurring was a novel idea. I think newer methods that use machine learning can produce very good results, but the maths of it is much older than any computer implementation.

If you remove high-frequency detail, you in effect remove distinguishing features. The fact that it is possible to create an absolutely convincing high-detail image that, when blurred, gives the same blurred "original" doesn't mean you have the correct deblurred image.

With not too fancy methods, I'm pretty sure you can make a blurred image match any number of different people.

I don't think this is a controversial statement either. In any case, this is a tangential discussion, since blurring to hide identities is a flawed method to begin with. With video recording, tracking, grouped individuals, etc., I'm sure reconstruction against good databases of likely subjects can achieve surprising accuracy. So, better to avoid it altogether.

That said, one image, sufficiently blurred with a proper low-pass filter (i.e. not a softer Gaussian type, but one that removes frequency ranges altogether), will absolutely not contain the information to identify someone. The information literally isn't there: a large number of people are an equally good match, which means no one is. But since, combined with the other factors I mentioned, blurring is a bad idea in practice, then yes, it's a bad idea.
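To illustrate that many-to-one property, a small numpy sketch: two signals that differ only above the cutoff produce identical outputs from a brick-wall low-pass (the signals and cutoff are arbitrary stand-ins):

```python
import numpy as np

def brickwall_lowpass(x, cutoff):
    """Ideal low-pass: zero every frequency bin at or above `cutoff`."""
    X = np.fft.rfft(x)
    X[cutoff:] = 0
    return np.fft.irfft(X, len(x))

n, cutoff = 256, 40
rng = np.random.default_rng(1)
a = rng.standard_normal(n)
# b differs from a only at a frequency the filter removes entirely
b = a + 3 * np.cos(2 * np.pi * 100 * np.arange(n) / n)

assert np.max(np.abs(a - b)) > 1                  # clearly different inputs
assert np.allclose(brickwall_lowpass(a, cutoff),
                   brickwall_lowpass(b, cutoff))  # indistinguishable outputs
```

Given the filtered output, `a` and `b` (and infinitely many other inputs) are equally good "originals".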
