A little while back there was the story [0] of a Mastodon admin who hated Cloudflare and its centralized protection, but found he had no choice but to sign up for it anyway because a disgruntled user kept launching DDoS attacks and he had no other way to keep his instance online. A bunch of people here and elsewhere kept unhelpfully replying that “you don’t need Cloudflare, you could just do [incredibly convoluted and time-consuming solution] instead”, and all of them were missing the point: Cloudflare is “set it and forget it”, which is a non-negotiable requirement for anything run as a hobby instead of a full-time job.
It’s the same with this UK law: yes, you could spend weeks of your life learning the intricacies of another country’s laws, or you could just block that country and be done with it. Businesses that need revenue from UK users will do the former, but if I’m running a site on my own time and money, I’ll do the latter. And I don’t want hobby sites to disappear: the Internet is commercialized enough as it is, and regulating passion projects out of existence would kill the last remaining independent scraps.
[0]: >>21719793
I’m not entirely sure I agree with this sentiment. It certainly isn’t true from a legal standpoint in America, where Section 230 explicitly absolves you of any such responsibility. I also don’t think most of the objections to the OSA center on a need to remove child pornography, but rather on the fact that you are forced to employ technological measures that don’t currently exist in order to do so. All of this is a little beside the point though, because…
> If you can't deal with those responsibilities you can't run the website.
I absolutely can. If the law is unreasonable, I can block all users from the country in question and keep running my website, which is exactly what Lobsters and a lot of other people are choosing to do.
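And for what it’s worth, the blocking itself is the easy part. Here’s a minimal sketch of app-level geo-blocking, assuming a Python/Flask app and MaxMind’s free GeoLite2 country database (the database path and handler name are illustrative, not anything Lobsters actually runs):

    # Reject requests whose source IP geolocates to the UK.
    # Assumes Flask plus the geoip2 package and a downloaded
    # GeoLite2-Country.mmdb from MaxMind.
    import geoip2.database
    import geoip2.errors
    from flask import Flask, abort, request

    app = Flask(__name__)
    reader = geoip2.database.Reader("GeoLite2-Country.mmdb")

    @app.before_request
    def block_uk():
        try:
            country = reader.country(request.remote_addr).country.iso_code
        except geoip2.errors.AddressNotFoundError:
            return  # IPs the database doesn't know about pass through
        if country == "GB":
            abort(451)  # 451 Unavailable For Legal Reasons

In practice you’d more likely do this at the reverse proxy or CDN layer, and behind a proxy you’d need the real client IP from a header like X-Forwarded-For rather than request.remote_addr, but the principle is the same: one lookup, one rule, done.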
Scanning for CSAM applies only to (1) large sites at medium or high risk of image-based CSAM, and (2) services at high risk of image-based CSAM that either have more than 700k monthly UK users or are file-storage or file-sharing services.
I might have missed something, but the only age-verification requirement I'm seeing is this: if a site is a large service with a medium risk of grooming, or it has a high risk of grooming and already has a means of determining users' ages, then it has to do things like not recommending connections between adults and children and not allowing adults to send unsolicited DMs to children.
A small-to-medium-sized forum that is already doing the kind of monitoring and moderation you have to do to keep a site from being completely overrun with spam and off-topic material shouldn't have to make many changes. Mostly it will just mean writing some documentation.