zlacker

[parent] [thread] 2 comments
1. scott_+(OP)[view] [source] 2026-02-04 11:37:37
> That post doesn't contain such an admission, it instead talks about forbidden prompting.

In response to what? If CSAM is not being generated, why isn't X just saying that? Instead they're saying "please don't do it."

> which contradicts your claim that there were no guardrails before.

From the linked post:

> However content is created or whether users are free or paid subscribers, our Safety team are working around the clock to add additional safeguards

Which was posted a full week after the initial story broke, and after Ofcom started investigative action. So no, it does not contradict my point, which was:

> Also the reason you can’t make it generate those images is because they implemented safeguards since that article was written:

As you quoted.

I really can't decide if you're stupid, think I and other readers are stupid, or so dedicated to defending paedophilia that you'll just tell flat lies to everyone reading your comment.

replies(1): >>cubefo+Do
2. cubefo+Do[view] [source] 2026-02-04 14:21:44
>>scott_+(OP)
Keep your accusations to yourself. Grok already refused to generate naked pictures of adults months ago, when I tested it for the first time. Clearly the "additional safeguards" are meant to protect the system against jailbreaks.
replies(1): >>scott_+hu
3. scott_+hu[view] [source] [discussion] 2026-02-04 14:49:48
>>cubefo+Do
Just to be clear, I'm to ignore:

* Internet Watch Foundation

* The BBC

* The Guardian

* X themselves

* Ofcom

And believe the word of an anonymous internet account who claims to have tried to undress women using Grok for "research."
