You don't think Meta, TikTok, etc. are the cause of the problem?
I appreciate that LFGSS is somewhat collateral damage, but the fact is that if you're going to run a forum, you do have some obligation to moderate it.
LFGSS is heavily moderated, just maybe not in a way you could prove to a regulator without an expensive legal team...
"some"?
> The Act would also require me to scan images uploading for Child Sexual Abuse Material and other harmful content, it requires me to register as the responsible person for this and file compliance. It places technical costs, time costs, risk, and liability, onto myself as the volunteer who runs it all... and even if someone else took it over those costs would pass to them if the users are based in the UK.
There is no CSAM ring hiding on this cycling forum. The notion that every service that transmits data from one user to another has to file compliance paperwork and pay for a CSAM hashing service is absurd.