zlacker

[parent] [thread] 2 comments
1. jprete+(OP)[view] [source] 2023-09-30 12:02:50
These aren’t even remotely comparable to AI photo manipulation.
replies(2): >>pbhjpb+EM7 >>froggi+MQ7
2. pbhjpb+EM7[view] [source] 2023-10-02 21:21:00
>>jprete+(OP)
Agreed. My point was that trusting images ('seeing is believing') has always been at issue, even though we might imagine it's a new problem. The scale of the issue is different now, phenomenally so, but it's not a category difference. Many people were convinced by the fairy hoaxes based on image manipulation in the early 20th Century (~1917). They fell for it hook, line, and sinker; images made with ML weren't needed.
3. froggi+MQ7[view] [source] 2023-10-02 21:40:33
>>jprete+(OP)
This is a bit of a stretch, but the end results from either manipulation technique would be comparable if they were meant to skew the truth the same way. That sounds stupid as shit when I read it back, though I'm not entirely sure why.

I think a use case for AI image manipulation could be more like: I need a picture where I'm poor but wearing smart borrowed clothes, standing with an unassociated associate and someone dead pictured as alive, with a backdrop, and the only source image being a selfie of someone else that incidentally caught half of me way in the background.

The intent or use cases for these two (for lack of a better term) manipulators don't really line up here. The purpose of AI image generation is, well, images generated by AI. It could technically generate images that misrepresent info, but that's more of a side effect, reached in a totally different way than staging a scene in an actual photo. Using manipulation to stage misleading photos, on the other hand, seems like it would be done primarily for deceptive activities or subversive fuckery.
