zlacker

[return to "X offices raided in France as UK opens fresh investigation into Grok"]
1. verdve+Kv1 2026-02-03 18:15:48
>>vikave+(OP)
France24 article on this: https://www.france24.com/en/france/20260203-paris-prosecutor...

lol, they summoned Elon for a hearing on 420

"Summons for voluntary interviews on April 20, 2026, in Paris have been sent to Mr. Elon Musk and Ms. Linda Yaccarino, in their capacity as de facto and de jure managers of the X platform at the time of the events,

2. why_at+HZ1 2026-02-03 20:18:01
>>verdve+Kv1
>The Paris prosecutor's office said it launched the investigation after being contacted by a lawmaker alleging that biased algorithms in X were likely to have distorted the operation of an automated data processing system.

I'm not at all familiar with French law, and I don't have any sympathy for Elon Musk or X. That said, is this a crime?

Distorted the operation how? By making their chatbot more likely to say stupid conspiracies or something? Is that even against the law?

3. int_19+kk2 2026-02-03 22:01:43
>>why_at+HZ1
Holocaust denial is illegal in France, for one, and Grok did exactly that on several occasions.

4. pyrale+mp2 2026-02-03 22:28:26
>>int_19+kk2
Also, CSAM and pornographic content using the likeness of unwilling people. Grok’s recent shit was bound to have consequences.

5. chrisj+8O2 2026-02-04 00:49:33
>>pyrale+mp2
If the French suspected Grok/X of something as serious as CSAM, you can bet they would have mentioned it in their statement. They didn't. Porn, they did.

6. pyrale+bq3 2026-02-04 06:18:08
>>chrisj+8O2
The first two points of the official document [1], which I re-quote below, are about CSAM.

> complicité de détention d’images de mineurs présentant un caractère pédopornographique

> complicité de diffusion, offre ou mise à disposition en bande organisée d'image de mineurs présentant un caractère pédopornographique

(In English: complicity in the possession of images of minors of a child-pornographic nature, and complicity in the distribution, offering, or making available, as an organized group, of images of minors of a child-pornographic nature.)

[1]: https://www.tribunal-de-paris.justice.fr/sites/default/files...

7. chrisj+LJ3 2026-02-04 09:02:36
>>pyrale+bq3
> The first two points of the official document, which I re-quote below, are about CSAM.

Sorry, but that's a major translation error. "pédopornographique" properly translated is child porn, not child sexual abuse material (CSAM). The difference is huge.

8. pyrale+rO3 2026-02-04 09:39:36
>>chrisj+LJ3
Quote from the US DOJ [1]:

> The term “child pornography” is currently used in federal statutes and is defined as any visual depiction of sexually explicit conduct involving a person less than 18 years old. While this phrase still appears in federal law, “child sexual abuse material” is preferred, as it better reflects the abuse that is depicted in the images and videos and the resulting trauma to the child. In fact, in 2016, an international working group, comprising a collection of countries and international organizations working to combat child exploitation, formally recognized “child sexual abuse material” as the preferred term.

Child porn is CSAM.

[1]: https://www.justice.gov/d9/2023-06/child_sexual_abuse_materi...

9. chrisj+s24 2026-02-04 11:28:06
>>pyrale+rO3
> “child sexual abuse material” is preferred, as it better reflects the abuse that is depicted in the images and videos and the resulting trauma to the child.

Yes, CSAM is the preferred term for material that depicts abuse and reflects the resulting trauma to the child.

But not for child porn such as manga of fictional children, which depicts no abuse and traumatises no child.

> Child porn is csam.

"CSAM isn’t pornography—it’s evidence of criminal exploitation of kids."

That's from RAINN, the US's largest anti-sexual violence organisation.

10. pyrale+Sq5 2026-02-04 18:45:19
>>chrisj+s24
> That's from RAINN, the US's largest anti-sexual violence organisation.

So that everyone can form their own opinion about this poster's honesty, here's where his quote is from [1]. Selected quotes:

> CSAM includes both real and synthetic content, such as images created with artificial intelligence tools.

> It doesn’t matter if the child agreed to it. It doesn’t matter if they sent the image themselves. If a minor is involved, it’s CSAM—and it’s illegal.

[1]: https://rainn.org/get-the-facts-about-csam-child-sexual-abus...
