zlacker

10 comments
1. scott_+(OP) 2026-02-04 06:18:10
Did you miss the numerous news reports? Example: https://www.theguardian.com/technology/2026/jan/08/ai-chatbo...

For obvious reasons, decent people are not about to go out and try to generate child sexual abuse material to prove a point to you, if that’s what you’re asking for.

replies(1): >>cubefo+51
2. cubefo+51 2026-02-04 06:29:04
>>scott_+(OP)
First of all, the Guardian is known to be heavily biased against Musk. They always try hard to make everything about him sound as negative as possible. Second, the last time I tried, Grok even refused to create pictures of naked adults. I just tried again and this is still the case:

https://x.com/i/grok/share/1cd2a181583f473f811c0d58996232ab

The claim that they released a tool with "seemingly no guardrails" is therefore clearly false. I think what actually happened here is that some people found a way to circumvent some of those guardrails via something like a jailbreak.

replies(5): >>scott_+E5 >>emsign+g7 >>jibal+k7 >>Hikiko+uU >>neorom+tv1
3. scott_+E5 2026-02-04 07:10:54
>>cubefo+51
For more evidence:

https://www.bbc.co.uk/news/articles/cvg1mzlryxeo

Also, X seem to disagree with you and admit that CSAM was being generated:

https://arstechnica.com/tech-policy/2026/01/x-blames-users-f...

Also the reason you can’t make it generate those images is because they implemented safeguards since that article was written:

https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...

This is because of government pressure (see Ofcom link).

I’d say you’re making yourself look foolish, but you seem happy to defend nonces, so I’ll not waste my time.

replies(1): >>cubefo+ep
4. emsign+g7 2026-02-04 07:26:46
>>cubefo+51
> First of all, the Guardian is known to be heavily biased against Musk.

Says who? Musk?

5. jibal+k7 2026-02-04 07:27:10
>>cubefo+51
That is only "known" to intellectually dishonest ideologues.
6. cubefo+ep 2026-02-04 09:46:15
>>scott_+E5
> Also, X seem to disagree with you and admit that CSAM was being generated

That post doesn't contain such an admission; it instead talks about forbidden prompting.

> Also the reason you can’t make it generate those images is because they implemented safeguards since that article was written:

That article links to this post: https://x.com/Safety/status/2011573102485127562, which contradicts your claim that there were no guardrails before. And as I said, I already tried it a while ago, and Grok also refused to create images of naked adults then.

replies(1): >>scott_+yD
7. scott_+yD 2026-02-04 11:37:37
>>cubefo+ep
> That post doesn't contain such an admission; it instead talks about forbidden prompting.

In response to what? If CSAM is not being generated, why aren't X just saying that? Instead they're saying "please don't do it."

> which contradicts your claim that there were no guardrails before.

From the linked post:

> However content is created or whether users are free or paid subscribers, our Safety team are working around the clock to add additional safeguards

Which was posted a full week after the initial story broke and after Ofcom started investigative action. So no, it does not contradict my point, which was:

> Also the reason you can’t make it generate those images is because they implemented safeguards since that article was written:

As you quoted.

I really can't decide whether you're stupid, think I and other readers are stupid, or are so dedicated to defending paedophilia that you'll just tell flat lies to everyone reading your comment.

replies(1): >>cubefo+b21
8. Hikiko+uU 2026-02-04 13:36:46
>>cubefo+51
>First of all, the Guardian is known to be heavily biased against Musk.

Biased against the man asking Epstein which day would be best for the "wildest" party.

9. cubefo+b21 2026-02-04 14:21:44
>>scott_+yD
Keep your accusations to yourself. Grok already refused to generate naked pictures of adults months ago, when I tested it for the first time. Clearly the "additional safeguards" are meant to protect the system against any jailbreaks.
replies(1): >>scott_+P71
10. scott_+P71 2026-02-04 14:49:48
>>cubefo+b21
Just to be clear, I'm to ignore:

* Internet Watch Foundation

* The BBC

* The Guardian

* X themselves

* Ofcom

And believe the word of an anonymous internet account who claims to have tried to undress women using Grok for "research."

11. neorom+tv1 2026-02-04 16:35:16
>>cubefo+51
>First of all, the Guardian is known to be heavily biased against Musk.

Which is good; that is the sane position to take these days.
