zlacker

[parent] [thread] 9 comments
1. basisw+(OP)[view] [source] 2025-02-23 19:52:22
As much as I hate this legislation, this is really just a small forum deciding they don't have the time to understand the legislation, and that it's therefore easier to block IPs from the UK (while it's not even clear whether that will exempt them from liability). Fair enough, but hardly earth-shattering. I remember several US-based websites geo-blocking all of Europe after GDPR came in (I think the LA Times was one of the biggest), and that went on for years.

We need legislation that tackles the issues this law aims at, but in much more targeted ways. It needs to specifically target the biggest social media companies (Meta, X, Reddit, etc.). Smaller forums are irrelevant. If your algorithm recommends self-harm content to kids who then go on to kill themselves, it is right that you should be held responsible. "We're so big we can't be expected to police the content we host" should not be an acceptable argument.

replies(2): >>Analem+f2 >>mattlo+h6
2. Analem+f2[view] [source] 2025-02-23 20:07:04
>>basisw+(OP)
Small forums run as hobby projects need to require little-to-no investment of time and money, or the ROI quickly goes negative and they just shut down instead.

A little while back there was the story [0] of a Mastodon admin who hated CloudFlare and its centralized protection, but found that he had no choice but to sign up for it anyway because a disgruntled user kept launching DDoS attacks and he had no other way to keep his instance online. A bunch of people here and elsewhere kept unhelpfully replying that “you don’t need CloudFlare, you could just do [incredibly convoluted and time-consuming solution] instead”, and all of those people were missing the point: CloudFlare is “set it and forget it”, which is a non-negotiable requirement for anything run as a hobby instead of a full-time job.

It’s the same with this UK law: yes, you could spend weeks of your life learning the intricacies of laws in some other country, or you could just block them and be done with it. Businesses which might need revenue from UK users will do the former, but if I’m running a site out of my own time and money, I’ll do the latter. And I don’t want hobby sites to have to disappear: the Internet is commercialized enough as it is, and regulating passion projects out of existence would kill the last remaining independent scraps.
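For what it’s worth, “just block them” really is a trivial amount of work. Here’s a minimal sketch as WSGI middleware, assuming a MaxMind GeoLite2 country database on disk (the database path and response text are placeholders, and a real setup would also need to handle proxy headers like X-Forwarded-For):

    # pip install geoip2 -- refuse requests from UK addresses with HTTP 451
    import geoip2.database
    import geoip2.errors

    reader = geoip2.database.Reader("GeoLite2-Country.mmdb")  # assumed path

    class BlockUK:
        def __init__(self, app):
            self.app = app

        def __call__(self, environ, start_response):
            ip = environ.get("REMOTE_ADDR", "")
            try:
                country = reader.country(ip).country.iso_code
            except (geoip2.errors.AddressNotFoundError, ValueError):
                country = None  # unknown addresses pass through in this sketch
            if country == "GB":
                start_response("451 Unavailable For Legal Reasons",
                               [("Content-Type", "text/plain")])
                return [b"Unavailable in the UK.\n"]
            return self.app(environ, start_response)

Wrap your WSGI app in BlockUK and you’re done; the GeoIP modules for nginx or Caddy get you the same thing in a couple of config lines.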

[0]: >>21719793

replies(1): >>basisw+v3
◧◩
3. basisw+v3[view] [source] [discussion] 2025-02-23 20:15:58
>>Analem+f2
It would be much easier for me to run a small business if I didn't have to worry about the intricacies of tax law or how the various company structures affect my liability - but that's life. If you want to do certain things in life, you take on certain responsibilities. Again - I don't like this particular legislation, but if your hobby is a website where others can post content, you have a responsibility to ensure that content isn't illegal or harmful to others. If you can't deal with those responsibilities, you can't run the website. It's no different from being required to follow health & safety regulations in an IRL business, even if it's 'just a hobby'.
replies(3): >>Analem+o4 >>ss64+z9 >>jwalto+MN
◧◩◪
4. Analem+o4[view] [source] [discussion] 2025-02-23 20:21:27
>>basisw+v3
Yes, I could do that, or I could block the only country which is imposing these burdens on me, keep my website running as it always has been, and call it a day. It’s a pretty easy choice.
5. mattlo+h6[view] [source] 2025-02-23 20:34:31
>>basisw+(OP)
I think the legislation does not name specific companies (that would be quite shortsighted, since new apps appear all the time and they don't want to be updating legislation every time something gets popular), but simply says that if you have more than 7 million 30-day active UK users of the user-to-user part of your site, then you're in scope. That's quite a big audience.
◧◩◪
6. ss64+z9[view] [source] [discussion] 2025-02-23 21:00:20
>>basisw+v3
You can still be fined for 'failing to comply' with the legislation even if no objectionable content has ever been posted. To be in compliance, there is a whole list of things you need to do, some of which are expensive.
replies(1): >>tzs+Ga
◧◩◪◨
7. tzs+Ga[view] [source] [discussion] 2025-02-23 21:08:37
>>ss64+z9
Which things are expensive?
replies(1): >>ss64+bB1
◧◩◪
8. jwalto+MN[view] [source] [discussion] 2025-02-24 03:49:55
>>basisw+v3
> but if your hobby is a website where others can post content you have a responsibility that the content shouldn't be illegal or harmful to others

I’m not entirely sure I agree with this sentiment. It certainly isn’t true from a legal standpoint in America, where Section 230 explicitly absolves you of any such responsibility. I also don’t think most of the objections to the OSA center on a need to remove child pornography, but rather on the fact that you are forced to employ technological measures that don’t currently exist in order to remove it. All of this is a little beside the point though, because…

> If you can't deal with those responsibilities you can't run the website.

I absolutely can. If the law is unreasonable, I can block all users from the host country, and keep running my website. Which is exactly what Lobsters and a lot of other people are choosing to do.

◧◩◪◨⬒
9. ss64+bB1[view] [source] [discussion] 2025-02-24 12:24:45
>>tzs+Ga
Hiring lawyers, attending compliance training courses, writing software to scan for CSAM, modifying your website so that it can verify the identity and age of every poster.
replies(1): >>tzs+Gr3
◧◩◪◨⬒⬓
10. tzs+Gr3[view] [source] [discussion] 2025-02-24 22:43:35
>>ss64+bB1
The compliance training requirement isn't that you attend training. It is that when you hire people to design or operationally manage your site, you train them in how the site handles compliance. It also only applies to large sites or multi-risk sites.

The scanning for CSAM only applies to (1) large sites that are at medium or high risk for image-based CSAM and (2) services that are at high risk of image-based CSAM and either have more than 700k monthly UK users or are file-storage or file-sharing services.
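(And for the sites that do fall in scope, "scanning" in practice usually means hash-matching uploads against lists of hashes of known images supplied by a body like the IWF, not building a classifier from scratch. A rough sketch of that pattern, using the open-source imagehash library as a stand-in for a real hash-list API - the file format and distance threshold here are made up for illustration:

    # pip install Pillow imagehash
    from PIL import Image
    import imagehash

    def load_known_hashes(path):
        # hypothetical format: one hex-encoded perceptual hash per line
        with open(path) as f:
            return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]

    KNOWN = load_known_hashes("known_hashes.txt")

    def is_flagged(upload_path, max_distance=4):
        # subtracting two ImageHash objects gives the Hamming distance;
        # a small distance means the upload is a near-duplicate of a known image
        h = imagehash.phash(Image.open(upload_path))
        return any(h - known <= max_distance for known in KNOWN)

The expensive part isn't the matching code, it's getting access to the hash lists and handling what happens on a match.)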

I might have missed it, but the only age-related requirement I'm seeing is this: if the site is a large service with a medium risk for grooming, or it has a high risk for grooming and already has a means to determine the age of users, then the site has to do some things like not recommending connections between adults and children when suggesting new contacts, and not allowing adults to send unsolicited DMs to children.

A small to medium sized forum that is already doing the kind of monitoring and moderation you have to do to keep your site from being completely overrun with spam and off-topic material shouldn't really have to make many changes. Mostly it will just be writing some documentation.
