zlacker

[parent] [thread] 20 comments
1. _bxg1+(OP)[view] [source] 2020-05-31 21:07:36
A replacement for blur could just be black boxes. Seems easy and safe enough.
replies(3): >>Polyla+li >>jazzyj+UN >>tornat+kp6
2. Polyla+li[view] [source] 2020-05-31 23:14:45
>>_bxg1+(OP)
Make sure they are at 100% opacity. A lot of people mess this up and use 90% opacity or similar, and the original image can be revealed by messing with the color levels.
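
A minimal sketch of the failure (Python/Pillow; the filenames, box coordinates, and the x10 level stretch are all illustrative):

    # Composite black at 90% opacity: the result is 0.1 * original in the
    # boxed region, so the hidden pixels survive at reduced brightness.
    from PIL import Image, ImageDraw

    original = Image.open("secret.png").convert("RGB")  # hypothetical input

    overlay = Image.new("RGB", original.size, "black")
    mask = Image.new("L", original.size, 0)
    ImageDraw.Draw(mask).rectangle([50, 50, 300, 100], fill=int(0.9 * 255))
    censored = Image.composite(overlay, original, mask)

    # "Attack": stretch the levels back up; the boxed text is legible again.
    recovered = censored.point(lambda v: min(255, v * 10))
    recovered.save("recovered.png")
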
replies(1): >>t-writ+Bp
◧◩
3. t-writ+Bp[view] [source] [discussion] 2020-06-01 00:13:25
>>Polyla+li
I've doxxed my own Reddit username on my iPhone doing that exact thing. The black marker tool is not opaque, even after a few strokes over the username; you have to go over it many more times.
replies(1): >>aspenm+Vs
◧◩◪
4. aspenm+Vs[view] [source] [discussion] 2020-06-01 00:47:30
>>t-writ+Bp
It's easier to select an area and delete it from the layer entirely so that a transparent hole is left. Then make sure you clean up EXIF and other metadata, or the original image may still survive in a thumbnail field at reduced fidelity.

Free online metadata viewer: http://exif.regex.info

Powered by FOSS (Perl-based): https://exiftool.org
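
A minimal sketch of that scrub step (Python/Pillow; filenames are placeholders): rebuild the image from its raw pixels so nothing else tags along.

    # Copy only the pixel data into a brand-new image; EXIF, embedded
    # thumbnails, and other metadata stay behind in the old file.
    # Rough CLI equivalent: exiftool -all= photo.jpg
    from PIL import Image

    src = Image.open("photo.jpg")            # hypothetical edited image
    clean = Image.new(src.mode, src.size)
    clean.putdata(list(src.getdata()))       # raw pixels only
    clean.save("photo_clean.jpg")            # no exif= argument, none written
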

replies(1): >>girst+gA
◧◩◪◨
5. girst+gA[view] [source] [discussion] 2020-06-01 02:25:40
>>aspenm+Vs
Do not keep it transparent! GIMP, for example, keeps the underlying colour data and just sets the alpha to 0.

This bug report (closed as Expected Behavior) has a demonstration: https://gitlab.gnome.org/GNOME/gimp/-/issues/4487
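
A minimal sketch of a safer export (Python/Pillow; filenames are hypothetical): zero the colour channels wherever alpha is 0 before saving.

    # Pixels with alpha 0 can still carry colour data, so scrub the RGB
    # channels in every fully transparent region before export.
    from PIL import Image

    img = Image.open("hole.png").convert("RGBA")   # hypothetical GIMP export
    r, g, b, a = img.split()

    # Keep a channel value only where alpha is nonzero; elsewhere force 0.
    opaque = a.point(lambda v: 255 if v > 0 else 0)
    black = Image.new("L", img.size, 0)
    bands = [Image.composite(ch, black, opaque) for ch in (r, g, b)]
    scrubbed = Image.merge("RGBA", bands + [a])
    scrubbed.save("hole_scrubbed.png")
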

replies(3): >>_bxg1+6G >>Persei+VQ >>aspenm+fw1
◧◩◪◨⬒
6. _bxg1+6G[view] [source] [discussion] 2020-06-01 03:55:19
>>girst+gA
It blows my mind that there are so many ways to screw up something this simple.

Edit: to be clear I meant this as a commentary on the technology, not the people making the mistakes

replies(1): >>Polyla+1P
7. jazzyj+UN[view] [source] 2020-06-01 06:11:41
>>_bxg1+(OP)
As soon as deepfakes and "thispersondoesnotexist" started happening, I wanted a tool that would replace everyone's face with an auto-generated face, just so I could do street photography without feeling like I was invading people's right to obscurity.
replies(4): >>kicksc+xO >>shavin+L91 >>livq+VC1 >>ikeyan+cw2
◧◩
8. kicksc+xO[view] [source] [discussion] 2020-06-01 06:25:18
>>jazzyj+UN
This is a cool idea. Oh to be a streamer with an auto-generated face. Reminds me of A Scanner Darkly.
replies(1): >>m_eima+O41
◧◩◪◨⬒⬓
9. Polyla+1P[view] [source] [discussion] 2020-06-01 06:35:24
>>_bxg1+6G
I think there is a need for a dedicated offline image-privacy program. On a technical level it's very easy to preserve privacy; it's just that the tools people are using were built for other purposes (non-destructive editing is highly desirable in normal cases).

All the program has to do is scrub all EXIF data, provide a censor box/brush that is 100% black, and re-encode the image so there is no unneeded data left over.
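
A minimal sketch of that core loop (Python/Pillow; box coordinates and filenames are placeholders):

    # Flatten, draw a fully opaque black box, and re-encode from raw
    # pixels so no metadata survives the trip.
    from PIL import Image, ImageDraw

    def censor(path_in: str, path_out: str, box: tuple[int, int, int, int]) -> None:
        img = Image.open(path_in).convert("RGB")            # flatten: no alpha
        ImageDraw.Draw(img).rectangle(box, fill=(0, 0, 0))  # 100% black, 100% opaque
        clean = Image.new("RGB", img.size)
        clean.putdata(list(img.getdata()))                  # pixels only; EXIF left behind
        clean.save(path_out, format="PNG")                  # lossless re-encode

    censor("screenshot.png", "screenshot_censored.png", (40, 120, 400, 160))
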

replies(1): >>wool_g+iR1
◧◩◪◨⬒
10. Persei+VQ[view] [source] [discussion] 2020-06-01 07:09:05
>>girst+gA
Is PPM a safe round-trip format to remove all metadata and transparency? I'd like to recommend it to a friend, and as far as I know it really only contains RGB as text and has no extensions for EXIF or similar. But after so many gotchas, as listed here in the thread, I'm somewhat paranoid...
replies(1): >>jstanl+A91
◧◩◪
11. m_eima+O41[view] [source] [discussion] 2020-06-01 10:01:41
>>kicksc+xO
Should be slowly morphing between different faces, maybe 10 minutes or so per transition. Should be fairly unsettling to watch! :D
◧◩◪◨⬒⬓
12. jstanl+A91[view] [source] [discussion] 2020-06-01 10:49:43
>>Persei+VQ
ASCII PPM supports comments, so it is possible that EXIF or other identifying information would get written into the comments by some tool.

I have only ever observed PPM comments right at the start of the file, so you could open it in a text editor and remove the comments from the start. Maybe check the very end of the file as well.

Strictly speaking, the '#' comment syntax is allowed in the header of binary PPM too, but binary writers rarely emit comments and the raw raster has nowhere to hide data, so that would still be the better choice. PPM documentation here; you want possibly P3 or, more likely, P6: https://en.wikipedia.org/wiki/Netpbm#File_formats
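
A minimal sketch of a paranoid conversion (Python/Pillow for decoding; the P6 writer is hand-rolled so the output is provably just a three-field header plus raw RGB bytes):

    # Convert any image to a P6 PPM whose entire content is "P6", the
    # dimensions, the maxval, and raw interleaved RGB; comments are
    # simply never written, so nothing identifying can hide in the file.
    from PIL import Image

    img = Image.open("photo.jpg").convert("RGB")   # hypothetical input
    w, h = img.size
    with open("photo.ppm", "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (w, h))      # magic, dimensions, maxval
        f.write(img.tobytes())                     # raw RGB triplets, no comments
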

◧◩
13. shavin+L91[view] [source] [discussion] 2020-06-01 10:52:22
>>jazzyj+UN
That's a really interesting idea. I'm not sure what the commercial value would be, but the artistic value (and the gain in privacy) would be huge. I'm not sure what you'd do about identifying marks like tattoos, but perhaps that isn't the biggest concern compared to faces.

Could you train a model with your own face as a start, and then run your photos through an existing consumer face-swap app? Or perhaps use a celebrity's likeness? I wonder how much the visual 'likeness' of a stranger is worth.

replies(1): >>jb1533+tR1
◧◩◪◨⬒
14. aspenm+fw1[view] [source] [discussion] 2020-06-01 14:07:03
>>girst+gA
I didn't specify a program to use, but I did not know this. A step in my personal workflow I neglected to mention is to flatten all layers, but I'm not sure what the best way to do that is, so I'll just say I am open to ideas for better approaches.

There should be a test suite for image-editing applications that validates the different ways of editing a file, to see which ones work as expected and which do not. I'm thinking of something similar to the web-standards tests for browsers. Does something like this already exist?
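
One such check could be very small. A minimal sketch (Python/Pillow; the filename and region are hypothetical) that asserts a redacted region comes back pure black and fully opaque:

    # After an editor's "redact" operation, every pixel in the region must
    # decode to pure black at full alpha; any surviving colour data fails.
    from PIL import Image

    def assert_redacted(path: str, box: tuple[int, int, int, int]) -> None:
        img = Image.open(path).convert("RGBA")
        for r, g, b, a in img.crop(box).getdata():
            assert (r, g, b) == (0, 0, 0), "colour data survived redaction"
            assert a == 255, "redacted area is not fully opaque"

    assert_redacted("edited_output.png", (50, 50, 300, 100))
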

◧◩
15. livq+VC1[view] [source] [discussion] 2020-06-01 14:46:06
>>jazzyj+UN
I've always been partial to the Laughing Man from Ghost in the Shell. This would be a perfect use case :P
◧◩◪◨⬒⬓⬔
16. wool_g+iR1[view] [source] [discussion] 2020-06-01 15:55:11
>>Polyla+1P
Good thought; it seems like something the EFF (or maybe the ACLU) would be interested in producing or publishing.
◧◩◪
17. jb1533+tR1[view] [source] [discussion] 2020-06-01 15:56:36
>>shavin+L91
Commercial value may be for filmmakers who would no longer have to worry about getting waivers from people in the background of live shots. (Not a lawyer.)
replies(1): >>thr0w_+ao2
◧◩◪◨
18. thr0w_+ao2[view] [source] [discussion] 2020-06-01 18:27:30
>>jb1533+tR1
Also not a lawyer, and US-based in case it varies by country.

Do you know if a waiver is needed in this case? My understanding is that I can walk down a sidewalk, around Disneyland, around a resort, and film anyone / anything in plain sight. (I don't do that, by the way...) In other words, assuming you're not climbing over railings etc., if you can see it with your eyes, you can film it or photograph it.

Wonder if anyone here (plenty of legal eagles, I'm sure) can confirm or correct this. We don't need to get bogged down in corner cases & rare exceptions... for example, I think I heard that in some states, if the police ask (demand?) that you stop recording, you have to, otherwise you're in violation of the law... but even as I type that, as an American, it just sounds wrong... but I don't know.

replies(1): >>mikepu+bB2
◧◩
19. ikeyan+cw2[view] [source] [discussion] 2020-06-01 19:09:23
>>jazzyj+UN
A tool that automatically turns background faces into the faces of random animals would actually do quite well.
◧◩◪◨⬒
20. mikepu+bB2[view] [source] [discussion] 2020-06-01 19:34:15
>>thr0w_+ao2
Also not a lawyer, but I think it mostly has to do with commercial use. Filming people at Disney for your Instagram followers is different from making a feature film and turning everyone standing around on a busy street into uncredited extras.

This particular site is with respect to Canada, but I'm pretty sure the same basic idea applies everywhere:

"When publishing photos for commercial purposes: You need the permission of every identifiable model in the photo, even if the photo was taken in a public space. For example, if a photo has 10 identifiable models in the photo, you would require a model release for each of them."

https://www.lawdepot.ca/law-library/faq/model-and-entertainm...

21. tornat+kp6[view] [source] 2020-06-02 22:43:08
>>_bxg1+(OP)
Even better, replace the face with Kim Jong-un and then blur it. When they apply those forensic techniques, they'll discover that it was him all along!