zlacker

[return to "‘Fuck you, make me’ without saying the words"]
1. sixQua+98 2026-01-11 18:40:13
>>chmayn+(OP)
Users showed that Gemini and OpenAI also undress people; it's not just Grok.
2. ctoth+K9 2026-01-11 18:48:09
>>sixQua+98
This is such a weird issue for me, as someone who is blind. Did Grok undress people, or did Grok show extrapolated images of what people might look like undressed? The "undressed people" framing makes it sound like people physically had their clothes removed. Obviously this did not happen.

But, like.

If I have like ... a mole somewhere under my clothes, Grok cannot know about that, right? People will know what they themselves look like naked?

Someone looking at Grok's output learns literally nothing about what the actual person looks like naked, right?

Kinda sounds like somebody should just make something that creates this for every picture ever? Then everybody has a defense ("fake nudes!") and the pictures are meaningless?

3. wasabi+Se 2026-01-11 19:12:58
>>ctoth+K9
> This is such a weird issue for me, who is blind.

I'm not sure what your mental model is for someone's visual likeness.

I'd propose a blind-inclusive analogy for what is happening on Twitter: anyone can create a realistic sexdoll with the same face and body proportions as any user online.

Doesn't that feel gross, even if the sexdoll's genitalia wouldn't match the real person's?

4. ctoth+Uh 2026-01-11 19:27:40
>>wasabi+Se
What part of my original comment said it wasn't gross?

My point is that nobody is getting undressed and no privacy violation is occurring. Fake nudes are fake.

5. wasabi+VU 2026-01-11 23:02:49
>>ctoth+Uh
I interpreted your last sentence as asserting it was no big deal (i.e., not gross) because it was all fake, but fair enough if you didn't mean it that way.

But to your main point: if you agree it's gross, do you not agree it is a violation of _something_? What is that thing if not privacy?
