A scanning system will never be perfect. But there is a better approach: what the FTC now requires Pornhub to do. The platform scans each image at upload; if it’s flagged as CSAM, it simply never enters the system. Platforms can set the confidence threshold low and block anything that scores above it, and if that produces too many false positives, add an appeals process to handle them.
The key difference here is that upload-scanning stops distribution before it starts.
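To make the mechanics concrete, here is a minimal sketch of such an upload gate, assuming a hypothetical classifier that returns a confidence score. The function names, threshold value, and appeals queue are illustrative only, not any platform’s actual API; a real system would call a perceptual-hash match against known-CSAM hash lists or an ML model where the stub sits.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical threshold: anything at or above this score is blocked
# before it ever reaches storage. A low threshold blocks aggressively
# and relies on the appeals queue to catch false positives.
BLOCK_THRESHOLD = 0.3


class UploadDecision(Enum):
    ACCEPTED = "accepted"
    BLOCKED = "blocked"


@dataclass
class UploadResult:
    decision: UploadDecision
    score: float
    appeal_id: str | None = None


def classify_image(image_bytes: bytes) -> float:
    """Stub for a CSAM classifier or hash-match service.

    Returns a confidence score in [0, 1]. This stub exists only so the
    sketch runs; it is not a real detector.
    """
    return 0.0


def handle_upload(image_bytes: bytes, appeals_queue: list[dict]) -> UploadResult:
    """Gate an upload before it enters the system.

    Blocked images are never written to storage; the uploader gets an
    appeal ticket instead, so false positives have a path to review.
    """
    score = classify_image(image_bytes)
    if score >= BLOCK_THRESHOLD:
        appeal_id = f"appeal-{len(appeals_queue) + 1}"
        appeals_queue.append({"appeal_id": appeal_id, "score": score})
        return UploadResult(UploadDecision.BLOCKED, score, appeal_id)

    # Only accepted content proceeds to storage and distribution.
    return UploadResult(UploadDecision.ACCEPTED, score)


if __name__ == "__main__":
    appeals: list[dict] = []
    result = handle_upload(b"example image bytes", appeals)
    print(result.decision.value, f"score={result.score:.2f}")
```

The trade-off the sketch makes explicit: a low threshold accepts more false positives in exchange for never distributing flagged content, which is exactly why the appeals path has to exist.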
Google, by contrast, scans private cloud storage after upload and then destroys accounts when its AI misfires. That doesn’t prevent distribution; it just creates collateral damage.
It also floods NCMEC with automated false reports. Millions of photos get flagged, but only a tiny fraction leads to actual prosecutions. The system as it exists today isn’t working for platforms, law enforcement, or the innocent users caught in the blast radius.