I’m not entirely sure I agree with this sentiment. It certainly isn’t true from a legal standpoint in America, where Section 230 explicitly absolves you of any such responsibility. I also don’t think most of the objections to the OSA center on a need to remove child pornography, but rather on the fact that you are forced to employ technological measures that don’t currently exist in order to remove it. All of this is a little beside the point, though, because…
> If you can't deal with those responsibilities you can't run the website.
I absolutely can. If the law is unreasonable, I can block all users from the host country, and keep running my website. Which is exactly what Lobsters and a lot of other people are choosing to do.
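The blocking itself is not hard. A minimal sketch of application-layer geo-blocking, assuming you can resolve a client IP to an ISO 3166-1 country code (the lookup here is a hard-coded stand-in; in practice you’d use a GeoIP database such as MaxMind’s GeoLite2):

```python
# Sketch of country-based request blocking.
# country_for_ip is a hypothetical stand-in for a real GeoIP lookup;
# the IPs below are from documentation ranges, chosen for illustration.

BLOCKED_COUNTRIES = {"GB"}  # ISO 3166-1 alpha-2 codes to refuse

def country_for_ip(ip: str) -> str:
    # Stand-in lookup table; a real deployment would query a GeoIP
    # database (e.g. GeoLite2) instead.
    demo = {"203.0.113.7": "GB", "198.51.100.23": "US"}
    return demo.get(ip, "ZZ")  # "ZZ" = unknown/unassigned

def is_blocked(ip: str) -> bool:
    # Unknown countries are allowed through; only listed codes are refused.
    return country_for_ip(ip) in BLOCKED_COUNTRIES
```

In a real deployment this check would typically live at the CDN or reverse-proxy layer rather than in the application, but the decision logic is the same.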
The CSAM-scanning requirement only applies to (1) large services that are at medium or high risk of image-based CSAM, and (2) services that are at high risk of image-based CSAM and either have more than 700k monthly UK users or are file-storage or file-sharing services.
I might have missed something, but the only age-related requirement I'm seeing is this: if a site is a large service at medium risk for grooming, or is at high risk for grooming and already has a means of determining users' ages, then it has to do things like not recommend connections between adults and children and not allow adults to send unsolicited DMs to children.
A small-to-medium-sized forum that is already doing the kind of monitoring and moderation needed to keep it from being completely overrun with spam and off-topic material shouldn't have to make many changes. Mostly it will just be writing some documentation.