This gets fuzzy because literally everything is correlated -- it may be possible to infer that you're the type of person who might have a tattoo there. But Grok doesn't have access to anything that hasn't already been shared. Grok isn't undressing anybody, and the people using it to generate these images aren't undressing anybody either; they're generating fake nudes that have no more relationship to reality than someone taking your public blog posts and attempting to write a post in your voice.
At any rate, for some of this material, fake CSAM for example, it doesn't matter that it's "fake": fakes of such material are themselves illegal in at least some jurisdictions.
If the problem is just the use of the word "undressing," that usage is purely analogical; nobody expects that Grok is actually going out and undressing anyone, as the robots aren't ready for that task yet.