> complicity in the possession of images of minors presenting a pedopornographic character ["complicité de détention d'images de mineurs présentant un caractère pédopornographique"]
> complicity in the distribution, offering, or making available, as part of an organized group, of images of minors presenting a pedopornographic character ["complicité de diffusion, offre ou mise à disposition en bande organisée d'image de mineurs présentant un caractère pédopornographique"]
[1]: https://www.tribunal-de-paris.justice.fr/sites/default/files...
Sorry, but that's a major translation error. "Pédopornographique" properly translates to "child porn", not "child sexual abuse material" (CSAM). The difference is huge.
> The term “child pornography” is currently used in federal statutes and is defined as any visual depiction of sexually explicit conduct involving a person less than 18 years old. While this phrase still appears in federal law, “child sexual abuse material” is preferred, as it better reflects the abuse that is depicted in the images and videos and the resulting trauma to the child. In fact, in 2016, an international working group, comprising a collection of countries and international organizations working to combat child exploitation, formally recognized “child sexual abuse material” as the preferred term.
Child porn is CSAM.
[1]: https://www.justice.gov/d9/2023-06/child_sexual_abuse_materi...
Given the way chatbots actually work, I wonder if we shouldn't treat the things they say more or less as words in a book of fiction. Writing a character in your novel who is a plain parody of David Irving probably isn't a crime even in France, unless the goal of the book as such was to deny the Holocaust.
As I see it, Grok can't be guilty. The people who made it/set its system prompt are guilty if they wanted it to deny the Holocaust; if not, they're at worst guilty of making a particularly unhinged fiction machine (as opposed to the more restrained fiction machines of Google, Anthropic, etc.).
Yes, "CSAM" is the preferred term for material that depicts abuse and reflects the resulting trauma to the child.
But not for child porn such as manga of fictional children, which depicts no abuse and traumatises no child.
> Child porn is CSAM.
"CSAM isn’t pornography—it’s evidence of criminal exploitation of kids."
That's from RAINN, the US's largest anti-sexual violence organisation.
It all depends on the severity of the offence, which itself depends on the category of the material, including whether or not it is CSAM.
The Supreme Court has today delivered its judgment in a case in which the court of appeals and the district court had sentenced a person to 80 day-fines for child pornography offences, on the grounds that he had downloaded Japanese manga drawings onto his computer. The Supreme Court dismissed the indictment.
The judgment concluded that the cartoons may in and of themselves be considered pornographic, and that they represent children. But they are fantasy figures that cannot be mistaken for real children.
https://bleedingcool.com/comics/swedish-supreme-court-exoner...
For future readers: the [Swedish] Supreme Court.
You provided a terminology preference notice from the (non-lawmaking) DOJ containing a suggestion which the (lawmaking) Congress did not take up.
Thanks for that.
And if/when the French in question decide to take it up, I am sure we'll hear the news! :)
For everyone to form their own opinion about this poster's honesty, here's where his quote is from [1]. Selected quotes:
> CSAM includes both real and synthetic content, such as images created with artificial intelligence tools.
> It doesn’t matter if the child agreed to it. It doesn’t matter if they sent the image themselves. If a minor is involved, it’s CSAM—and it’s illegal.
[1]: https://rainn.org/get-the-facts-about-csam-child-sexual-abus...