zlacker

[parent] [thread] 11 comments
1. amarch+(OP)[view] [source] 2025-12-11 16:44:03
Just a few days ago I was doing a low-paid (well, not so low) AI classification task - akin to Mechanical Turk ones - for a very big company, and the platform showed me - involuntarily, since I guess they don't review the images before showing them - an AI image depicting a naked man and a naked kid, though it was more Barbie-like than anything else. I didn't really enjoy the view tbh. I contacted them but got no answer back.
replies(2): >>ipytho+B8 >>kennyl+jm1
2. ipytho+B8[view] [source] 2025-12-11 17:22:12
>>amarch+(OP)
If the picture truly was of a child, the company is _required_ to report CSAM to NCMEC. It's taken very seriously. If they're not being responsive, escalate and report it yourself so you don't have legal problems.

See https://report.cybertip.org/.

replies(3): >>amarch+fb >>kotaKa+xl >>moi238+z42
◧◩
3. amarch+fb[view] [source] [discussion] 2025-12-11 17:33:18
>>ipytho+B8
Even if it's an AI image? I'll follow up by contacting them directly rather than through the platform messaging system, then see what to do if they don't answer.

Edit: I read the information given in the briefing before the task, and it says there might be offensive content displayed. It says to tell them if it happens, but well, I did and got no answer, so I'm not too inclined to believe they care about it.

replies(2): >>jfindp+tb >>ipytho+qB1
◧◩◪
4. jfindp+tb[view] [source] [discussion] 2025-12-11 17:34:15
>>amarch+fb
>Even if it's an AI image?

This varies by country, but in many countries it doesn't matter whether it's a drawing, an AI-generated image, or a real photograph -- they're treated equally for the purposes of CSAM.

replies(1): >>amarch+bc
◧◩◪◨
5. amarch+bc[view] [source] [discussion] 2025-12-11 17:37:23
>>jfindp+tb
That's understandable
◧◩
6. kotaKa+xl[view] [source] [discussion] 2025-12-11 18:25:16
>>ipytho+B8
> It's taken very seriously

Can confirm. The number of people I see in my local news getting arrested for possession that "... came from a cybertip escalated to NCMEC from <BIGCOMPANY>" is... staggering. (And locally it's almost always Google Drive or Gmail, but sometimes there's a curveball.)

replies(1): >>lostms+XC1
7. kennyl+jm1[view] [source] 2025-12-11 23:36:58
>>amarch+(OP)
How can I find work like this?
replies(1): >>amarch+Na3
◧◩◪
8. ipytho+qB1[view] [source] [discussion] 2025-12-12 01:33:14
>>amarch+fb
The company may not care, but the government definitely does. And if you don't report it, you could be in serious legal jeopardy. If any fragments of that image are still present on your machine, whether it came from the company or not, you could be held accountable for possessing CSAM.

So screw the company: report it yourself and make sure to cite the company and their lack of response. There's a Grand Canyon-sized chasm between "offensive content" and CSAM.

◧◩◪
9. lostms+XC1[view] [source] [discussion] 2025-12-12 01:47:20
>>kotaKa+xl
Where does this happen?
◧◩
10. moi238+z42[view] [source] [discussion] 2025-12-12 07:14:40
>>ipytho+B8
A nude picture of a child is not automatically CSAM.

The image has to depict sexual abuse or exploitation of the child for it to be CSAM.

replies(1): >>amarch+KA2
◧◩◪
11. amarch+KA2[view] [source] [discussion] 2025-12-12 12:50:31
>>moi238+z42
That's understandable; I still felt uneasy.
◧◩
12. amarch+Na3[view] [source] [discussion] 2025-12-12 16:47:38
>>kennyl+jm1
Sorry, I'm not comfortable sharing the name of the platform given the situation. However, for similar jobs I find that searching the web for "serious tasks beermoney reddit" gives results similar to what I'm talking about.