zlacker

[return to "‘Fuck you, make me’ without saying the words"]
1. sixQua+98 2026-01-11 18:40:13
>>chmayn+(OP)
Users showed that Gemini and OpenAI also undress people; it's not just Grok.
2. ctoth+K9 2026-01-11 18:48:09
>>sixQua+98
This is such a weird issue for me, as someone who is blind. Did Grok undress people, or did Grok show extrapolated images of what people might look like undressed? The "undressed people" framing makes it sound like people physically had their clothes removed. Obviously this did not happen.

But, like.

If I have like ... a mole somewhere under my clothes, Grok cannot know about that, right? People know what they themselves look like naked?

Someone looking at Grok's output learns literally nothing about what the actual person looks like naked, right?

Kinda sounds like somebody should just make something that creates this for every picture ever? Then everybody has a defense -- "fake nudes!" and the pictures are meaningless?

3. bryanr+Qb 2026-01-11 18:58:31
>>ctoth+K9
>If I have like ... a mole somewhere under my clothes, Grok cannot know about that, right?

Unless some ex talked about that gross mole of yours on Twitter, or it's in some data that got scraped somewhere, no.

Not sure what the actual odds are of it knowing if you have a mole or not.

4. ctoth+dd 2026-01-11 19:06:02
>>bryanr+Qb
Take the mole as a stand-in for any physical characteristic hidden by clothing that people want to keep hidden. It's an example to demonstrate that the AI is not "undressing" anybody. It is filling in extrapolated pixels that have no clear relationship to the underlying reality. If you have a hidden tattoo, that tattoo is still not visible.

This gets fuzzy because literally everything is correlated -- it may be possible to infer that you are the type of person who might have a tattoo there. But Grok doesn't have access to anything that hasn't already been shared. Grok is not undressing anybody, and the people using it to generate these images aren't undressing anybody; they are generating fake nudes that have no more relationship to reality than someone taking your public blog posts and attempting to write a post in your voice.

5. bryanr+P22 2026-01-12 08:42:42
>>ctoth+dd
Sure, but if I make a fake picture of someone having sex with a horse, and someone else confirms "my gosh, that's really them! I recognize that mole," then I suppose the damage is the same.

At any rate, where some of this stuff is concerned, fake CSAM for example, it doesn't matter that it is "fake": fake material is also against the law, at least in some places.

If the problem is just the use of the word "undressing," then I suppose the usage is purely figurative; nobody expects that Grok is actually going out and undressing anyone, as the robots are not ready for that task yet.
