zlacker

[return to "Lobsters blocking UK users because of the Online Safety Act"]
1. basisw+t6[view] [source] 2025-02-23 19:52:22
>>ColinW+(OP)
As much as I hate this legislation, this is really just a small forum deciding they don't have the time to understand the legislation, so it's easier to block IPs from the UK (though it's not even clear that will exempt them from liability). Fair enough, but hardly earth-shattering. I remember several US-based websites geo-blocking all of Europe after GDPR came in (I think the LA Times was one of the biggest), and that went on for years.

We need legislation that tackles the various issues this legislation aims to tackle, but in much more targeted ways. It needs to specifically target the biggest social media companies (Meta, X, Reddit, etc.); smaller forums are irrelevant. If your algorithm recommends self-harm content to kids who then go on to kill themselves, it is right that you should be held responsible. "We're so big we can't be expected to police the content we host" should not be an acceptable argument.

2. Analem+I8[view] [source] 2025-02-23 20:07:04
>>basisw+t6
Small forums run as hobby projects need to require little-to-no investment of time and money, or the ROI quickly goes negative and they just shut down instead.

A little while back there was the story [0] of a Mastodon admin who hated CloudFlare and its centralized protection, but found that he had no choice but to sign up for it anyway because a disgruntled user kept launching DDoS attacks and he had no other way to keep his instance online. A bunch of people here and elsewhere kept unhelpfully replying, “you don’t need CloudFlare, you could just do [incredibly convoluted and time-consuming solution] instead”, and all of those people were missing the point: CloudFlare is “set it and forget it”, which is a non-negotiable requirement for anything run as a hobby instead of a full-time job.

It’s the same with this UK law: yes, you could spend weeks of your life learning the intricacies of another country's laws, or you could just block its users and be done with it. Businesses that need revenue from UK users will do the former, but if I’m running a site on my own time and money, I’ll do the latter. And I don’t want hobby sites to have to disappear: the Internet is commercialized enough as it is, and regulating passion projects out of existence would kill the last remaining independent scraps.

[0]: >>21719793

3. basisw+Y9[view] [source] 2025-02-23 20:15:58
>>Analem+I8
It would be much easier for me to run a small business if I didn't have to worry about the intricacies of tax law or how the various company structures affect my liability - but that's life. If you want to do various things in life, you may have certain responsibilities. Again - I don't like this particular legislation, but if your hobby is a website where others can post content, you have a responsibility to ensure that content isn't illegal or harmful to others. If you can't handle those responsibilities, you can't run the website. It's no different than being required to follow health & safety regulations in an IRL business, even if it's just a 'hobby'.
4. ss64+2g[view] [source] 2025-02-23 21:00:20
>>basisw+Y9
You can still be fined for 'failing to comply' with the legislation even if no objectionable content has been posted. To be in compliance there is a whole list of things you need to do, some of which are expensive.
5. tzs+9h[view] [source] 2025-02-23 21:08:37
>>ss64+2g
Which things are expensive?
6. ss64+EH1[view] [source] 2025-02-24 12:24:45
>>tzs+9h
Hiring lawyers, attending compliance training courses, writing software to scan for CSAM, modifying your website so that it can verify the identity and age of every poster.
7. tzs+9y3[view] [source] 2025-02-24 22:43:35
>>ss64+EH1
The compliance training requirement isn't that you attend training. It is that when you hire people to design or operationally manage your site, you train them in how the site handles compliance. It also only applies to large sites or multi-risk sites.

The scanning for CSAM only applies to (1) large sites that are at medium or high risk for image-based CSAM and (2) services that are at high risk of image-based CSAM and either have more than 700k monthly UK users or are file-storage or file-sharing services.

I might have missed something, but the only age verification requirement I'm seeing is this: if the site is a large service with a medium risk for grooming, or it has a high risk for grooming and already has a means to determine the age of users, then it has to do things like not recommending connections between adults and children when suggesting new contacts, and not allowing adults to send unsolicited DMs to children.

A small to medium sized forum that is already doing the kind of monitoring and moderation you have to do to keep your site from being completely overrun with spam and off-topic material shouldn't really have to make many changes. Mostly it will just be writing some documentation.
