zlacker

[parent] [thread] 3 comments
1. pbhjpb+(OP)[view] [source] 2023-09-30 07:42:06
Photographs have been untrustworthy almost from the beginning -- untrustworthy as images of a real scene, that is.

Indeed, viewers of photographs have always been open to manipulation by the presentation of something untrue as fact -- you dress up smart, in borrowed clothes, when you're really poor; you stand with a person you don't know to indicate association; you get photographed with a dead person as if they're alive; you use a backdrop or set; et cetera.

replies(1): >>jprete+hj
2. jprete+hj[view] [source] 2023-09-30 12:02:50
>>pbhjpb+(OP)
These aren’t even remotely comparable to AI photo manipulation.
replies(2): >>pbhjpb+V58 >>froggi+3a8
3. pbhjpb+V58[view] [source] [discussion] 2023-10-02 21:21:00
>>jprete+hj
Agreed. My point was that trusting images ('seeing is believing') has always been at issue, whilst we might imagine it is a new thing. The scale of the issue is different -- phenomenally so -- but it's not a category difference. Many people were convinced by the fairy hoaxes based on image manipulation in the early 20th Century (~1917). They fell for it hook, line, and sinker; images made with ML weren't needed.
4. froggi+3a8[view] [source] [discussion] 2023-10-02 21:40:33
>>jprete+hj
This is a bit of a stretch, but the end results from either manipulation technique would be comparable if they were meant to skew the truth the same way. That sounds stupid as shit when I read it back, though, and I'm not entirely sure why.

I think a use case for AI image manipulation would be more like: I need a picture where I'm poor but wearing smart borrowed clothes, standing with an unassociated associate and a dead person posed as alive, in front of a backdrop, with the only source image being a selfie of someone else that incidentally caught half of me way in the background.

The intents or use cases for these two (lacking a better term) manipulators aren't the same here. The purpose of AI image generation is, well, images generated by AI. It could technically generate images that misrepresent info, but that's more of a side effect reached in a totally different way than staging a scene in an actual photo. Using manipulation to stage misleading photos, by contrast, would be done primarily for deceptive activities or subversive fuckery.
