figured this might be interesting... I run just over 300 forums, for a monthly audience of 275k active users. most of this is on Linode instances and Hetzner instances, a couple of the larger fora go via Cloudflare, but the rest just hits the server.
and it's all being shut down.
the UK Online Safety Act creates a massive liability, and whilst at first glance the risk seems low, the reality is that moderating people usually provokes ire from those people... if we had to moderate them because they were a threat to the community, then they are usually the kind of people who get angry.
in 28 years of running forums, as a result of moderation I've had people try to get the domain revoked, send fake copyright notices, make death threats, become stalkers (IRL and online)... as a forum moderator you are known, and you are a target, and the Online Safety Act creates a weapon that can be used against you. the risk is no longer hypothetical, so even if I got lawyers involved to be compliant I'd still have the liability and risk.
in over 28 years I've run close to 500 fora in total, and they've changed so many lives.
I created them to provide a way for those without families to build families, to catch the waifs and strays, and to try to hold back loneliness, depression, and the risk of isolation and suicide... and it worked, it still works.
but on 17th March 2025 it will become too much, no longer tenable, the personal liability and risks too significant.
I guess I'm just the first to name a date, and now we'll watch many small communities slowly shutter.
the Online Safety Act was supposed to hold big tech to account, but in fact they're the only ones who will be able to comply... it consolidates even more onto those platforms.
I want to emphasize just how true this is, in case anyone thinks this is hyperbole.
I managed a pissant VBulletin forum, and moderated a pretty small subreddit. The former got me woken up at 2, 3, 4am with phone calls because someone got banned and was upset about it. The latter got me death threats from someone who lived in my neighborhood, knew approximately where I lived, and knew my full name. (Would they have gone beyond the tough-guy-words-online stage? Who knows. I didn't bother waiting to find out, and resigned as moderator immediately and publicly.)
Sometimes it's explicitly mentioned, but oftentimes it's behind "appropriate and proportionate measures".
I feel like the whole time this was being argued and passed, everyone in power just considered the internet to be the major social media sites, and never considered that a single person or a smaller group might run a site.
I think you're going to get two groups of people emerge from this. One group will just shut down their sites to avoid running afoul of the rules, and the other group will go the "go fuck yourself" route and continue to host anonymously.
We should make the laws for our digital spaces for human person use cases first, not corporate person use cases. Even if it's in the sense of trying to protect humans from corporations.
I home-hosted a minecraft server and was repeatedly DDoS'd. Don't underestimate disgruntled 10yo's.
https://www.ofcom.org.uk/siteassets/resources/documents/onli...
It amounts to your basic terms of service. It means that you'll need to moderate your forums and prove that you have a policy for moderation (basically what all decent forums do anyway). The crucial thing is that you need to record that you've done it and reassessed it, and prove "you understand the 17 priority areas".
It's similar to what a trustee of a small charity is supposed to do each year for due diligence.
> The act creates a new duty of care of online platforms, requiring them to take action against illegal, or legal but "harmful", content from their users. Platforms failing this duty would be liable to fines of up to £18 million or 10% of their annual turnover, whichever is higher.
edit: removed unintentional deadnaming
Thank you to those who have tirelessly run these online communities for decades, I'm sorry we can't collectively elect lawmakers who are more educated about the real challenges online, and thoughtful on real ways to solve them.
Does this shock you? I don't recall a time in memory when a politician discussing technology wasn't, at best, cringe and, at worst, completely incompetent and factually wrong.
The act is intentionally very vague and broad.
Generally, the gist is that it's up to the platforms themselves to assess and identify risks of "harm", implement safety measures, keep records, and run audits. The guidance on what that means is very loose, but some examples might include stringent age verification, proactive and effective moderation, and thorough assessment of all algorithms.
If you were ever to be investigated, it would be up to someone else to decide whether your measures were adequate or whether you'd been found lacking.
This means you might need to spend significant time making sure that your platform can't allow "harm" to happen, and maybe you'll need to spend money on lawyers to review your "audits".
The repercussions of being found wanting can be harsh, and so, one has to ask if it's still worth it to risk it all to run that online community?
I sympathise with the OP because at some point everyone becomes too old to deal with the headaches of running a community. I have no opposition to their choice to shut down the forum. I just don't believe liability as a result of the new bill is the reason.
Argentina has had nearly 100 years of decline, Japan is onto its third lost decade. The only other party in the UK that has a chance of being elected (because of the voting system) is led by someone who thinks sandwiches are not real [1]. It's entirely possible the UK doesn't become a serious country in our lifetimes.
[1] https://www.politico.eu/article/uk-tory-leader-sandwiches-no...
https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A...
2. This Regulation does not apply to the processing of personal data: (c) by a natural person in the course of a purely personal or household activity;
I'm a little confused about this part. Does the Online Safety Act create personal liabilities for site operators (EDIT: to clarify: would a corporation not be sufficient protection)? Or are they referring to harassment they'd receive from disgruntled users?
Also, this is the first I've heard of Microcosm. It looks like some nice forum software and one I maybe would've considered for future projects. Shame to see it go.
Rightly or wrongly, limited companies in the UK provide a high degree of protection for wrongdoing. Defrauding HMRC out of hundreds of thousands of pounds and suffering no consequence is happening day in day out. An Ofcom fine is nothing by comparison.
But they make a good point: if you exclude the smaller providers, that’s where the drugs and CSAM and the freewheeling dialog go. Assuming it’s their policy goal to deter these categories of speech, I’m not sure how you do that without a net fine enough to scoop up the 4chans of the world too.
It’s not the behavior of a confident, open, healthy society, though…
They mention especially in their CSAM discussion that, in practice, a lot of that stuff ends up being distributed by smallish operators, by intention or by negligence—so if your policy goal is to deter it, you have to be able to spank those operators too. [0]
> In response to feedback, we have expanded the scope of our CSAM hash-matching measure to capture smaller file hosting and file storage services, which are at particularly high risk of being used to distribute CSAM.
Surely we can all think of web properties that have gone to seed (and spam) after they outlive their usefulness to their creators.
I wonder how much actual “turnover” something like 4chan turns over, and how they would respond to the threat of a 10% fine vs an £18mm one…
[0] https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...
This basically ensures that the only people allowed to host online services for other people in the UK will be large corporations, as they are the only ones that can afford the automation and moderation requirements imposed by this bill.
You should be able to self-host content, but you can't do something like operate a forum website or other smaller social media platform unless you can afford to hire lawyers and spend thousands of dollars a month hiring moderators and/or implementing a bulletproof moderation system.
Otherwise you risk simply getting shut down by Ofcom. Or you can do everything you are supposed to do and get shut down anyway. Good luck navigating their appeals processes.
But surely no right minded judge would do such a thing, right?
Increase fuel economy -> Introduce fuel economy standards -> Economic cars practically phased out in favour of guzzling "trucks" that are exempt from fuel economy standards -> Worse fuel economy.
or
Protect the children -> Criminalize activites that might in any way cause an increase in risk to children -> Best to just keep them indoors playing with electronic gadgets -> Increased rates of obesity/depression etc -> Children worse off.
As the article itself says: Hold big tech accountable -> Introduce rules so hard to comply with that only big tech will be able to comply -> Big tech goes on, but indie tech forced offline.
Then again, maybe he's just burnt out from running these sites and this was the final straw. I can understand if he wants to pack it in after so long, and this is as good reason as any to call it a day.
Though, has no-one in that community offered to take over? Forums do change hands now and then.
there's never been an instance of any of the proclaimed things that this act protects [...] people from, so he should be safe, right?
but despite this, he is already being attacked, and those attacks will not just continue but they are likely to increase because the attack surface has become larger.
Having said all that, I can’t criticise the decision. It makes me sad to see it, and it feels like the end of an era online.
If we exclude politicians whose tech awareness is curated by lobbyists, Ron Wyden may be the entire list.
Very little legislation does.
Two things my clients have dealt with: VATMOSS and GDPR. The former was fixed with a much higher ceiling for compliance, but not before causing a lot of costs and lost revenue to small businesses. Under GDPR, a small business or non-profit that just keeps simple lists of people (customers, donors, members, parishioners, etc.) has to put effort into complying, even though it holds a relatively small number of people's data and does not use it outside the organisation. The rules are the same as for a huge social network that buys and sells information about hundreds of millions of people.
This seems very plausible to me, given what they and other moderators have said about the lengths some people will go to online when they feel antagonised.
The headline is clickbait. She didn't say that sandwiches are not real. She is saying that she doesn't believe it is a proper lunch/meal.
- Bad actors go everywhere now.
- £18 million fines seem like a fairly unhinged cannon to aim at small websites.
- A baseless accusation is enough to trigger a risk of life-changing fines. Bad actors don't just sell drugs and freewheel; they also falsely accuse.
Orgs are already fleeing LSEG for deeper capital markets in the US.
You just have to make individual value judgements every day on thousands of pieces of content for SEVENTEEN highly specific priority areas.
Then keep detailed records on each value judgement such that it can hold up to legal scrutiny from an activist court official.
> Any competent forum operator is already doing all of this

What is your evidence that the record keeping described by the parent is routine among competent forum operators?
2. Doesn't the fact that simple legal manoeuvring can be used to dodge 99% of this law make the law (and laws like it) farcical on its face? Merely an elaborate set of extra hoops that only serves to punish the naive, while increasing everyone's compliance costs?
- very basic macro economics
- very basic game theory
- very basic statistics
Come to think of it, kids should learn this in high school
As I have read it, no. It's worth a read to see for yourself, though.
> it doesn't put that much of an onerous demand on forum operators.
It doesn't until it does; the issue is the massive amount of work needed to cover the "what if?".
It's not clear that it doesn't apply, and so it will be abused; that's how the internet works: DMCA, YouTube strikes, domain strikes, etc.
> Then again, maybe he's just burnt out from running these sites and this was the final straw. I can understand if he wants to pack it in after so long, and this is as good reason as any to call it a day.
Possibly, worth asking.
> Though, has no-one in that community offered to take over? Forums do change hands now and then.
Someone else taking over doesn't remove the problem, though there might be someone willing to assume the risk.
Beautiful landscape, the best breakfast around, really nice people, tons of sights to see.
Orwell pointed this out in England your England which was written during the Blitz. Many of the problems he described have only got worse in the decades since he wrote about them in my opinion. While the essay is a bit dated now (it predates the post-war era of globalisation for example which created new axes in UK politics) I still think it's essential background reading for people who want to know what's wrong with the UK, and it's an excellent example of political writing in general.
what can we do about this creep up of totalitarian surveillance plutocracy?
sweet were the 1990s with a dream of information access for all.
little did we know we were the information being accessed.
srry
very un-HN-y.. maybe it's just the time of the year but this really pulls me down currently.
If your CCTV system captures images of people outside the boundary of your private domestic property...
Most European countries have laws for recording spaces not your own. They typically predate the GDPR by decades. AFAIK, they are not harmonised, except for a tiny bit by the GDPR. If I understand it well, this is a big difference from the USA, where you can mostly record the public space and create databases of what everyone does in public. In Europe (even outside the EU), there is a basic expectation of privacy even in public spaces. You are allowed to make short-term recordings, do journalism, and have random people accidentally wander in and out of your recording. Explicitly targeting specific people or long-term recording is somewhere between frowned upon and flat-out illegal.
It seems like OP is commenting on this thread; you can accuse them of lying directly, if you'd like.
GDPR, safeguarding, liability for the building you operate in, money laundering... there are lots of laws you are liable under.
Does this in practice mean that the original human person would have to pay that fine? What would the consequences likely be for the original human person?
If those consequences remain severe, then it's not a simple legal manoeuvre after all. This reduces farcicality, but also means there's no way for an individual to safely run this kind of website.
If those consequences round to zero, my next question would be: Can a large company spin up a CIC just to shield itself in the same way? (If so, it seems the farce would be complete.)
It’s why when a law/rule/standard has a carveout for its first edge case, it quickly becomes nothing but edge cases all the way down. And because language is ever-changing, rules lawyering is always possible - and governments must be ever-resistant to attempts to rules lawyer by bad actors.
Modern regulations are sorely needed, but we’ve gone so long without meaningful reform that the powers that be have captured any potential regulation before it’s ever begun. I would think most common-sense reforms would say that these rules should be more specific in intent and targeting only those institutions clearing a specific revenue threshold or user count, but even that could be exploited by companies with vast legal teams creating new LLCs for every thin sliver of services offered to wiggle around such guardrails, or scriptkiddies creating millions of bot accounts with a zero-day to trigger compliance requirements.
Regulation is a never-ending game. The only reason we “lost” is because our opponent convinced us that any regulation is bad. This law is awful and nakedly assaults indietech while protecting big tech, but we shouldn’t give up trying to untangle this mess and regulate it properly.
> Senior accountability for safety. To ensure strict accountability, each provider should name a senior person accountable to their most senior governance body for compliance with their illegal content, reporting and complaints duties.
I have zero legal connection to the UK and their law doesn't mean jack to me. I look forward to thoroughly ignoring it, in the same way that I thoroughly ignore other dumb laws in other distant jurisdictions.
UK, look back on this as the day -- well, another day -- when you destroyed your local tech in favor of the rest of the world.
this is similar to running a cricket club, or scout club
For running a scout association, each lesson could technically require an individual risk assessment for every piece of equipment and every activity. The hall needs to be safe, and you need to prove that it's safe. Also GDPR, safeguarding, background checks, money laundering.
> hold up to legal scrutiny from an activist court official
It's not the USA; activist court officials require a functioning court system. Plus, common law has the concept of "reasonable". A moderated forum will have a much higher standard of moderation than facebook/twitter/tiktok.
I've been wanting to play with remote modern terminals and Ratatui anyway.
Too many cobras > bounty for slain cobras > people start breeding them for the bounty > law is revoked > people release their cobras > even more cobras around
These are not unintended consequences. All media legislation of late has been to eliminate all but the companies that are largest and closest to government. Clegg works at Facebook now, they'd all be happy to keep government offices on the premises to ensure compliance; they'd even pay for them.
Western governments are encouraging monopolies in media (through legal pressure) in order to suppress speech through the voluntary cooperation of the companies who don't want to be destroyed. Those companies are not only threatened with the stick, but are given the carrots of becoming government contractors. There's a revolving door between their c-suites and government agencies. Their kids go to the same schools and sleep with each other.
A cycling site with 275k MAU would be in the very lowest category where compliance is things like 'having a content moderation function to review and assess suspected illegal content'. So having a report button.
Does the UK have a similar concept?
I'm surprised they don't already have some form of report/flag button.
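For what it's worth, at the lowest tier that "content moderation function" can be as little as a report endpoint that queues flagged posts for a human to review. A minimal sketch below, assuming Flask and a made-up reports table; none of the names here are prescribed by Ofcom, it just shows how small the baseline is:

    import sqlite3
    from flask import Flask, request

    app = Flask(__name__)
    DB = "moderation.db"

    # One-off setup: a queue of user reports for a moderator to work through.
    with sqlite3.connect(DB) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS reports ("
            "id INTEGER PRIMARY KEY, post_id INTEGER NOT NULL, "
            "reason TEXT, created_at TEXT DEFAULT (datetime('now')))"
        )

    @app.post("/report")
    def report():
        # Accept {"post_id": ..., "reason": ...} and queue it for review.
        data = request.get_json()
        with sqlite3.connect(DB) as conn:
            conn.execute(
                "INSERT INTO reports (post_id, reason) VALUES (?, ?)",
                (data["post_id"], data.get("reason", "")),
            )
        return {"status": "queued for review"}, 202

The record-keeping duty is then mostly a moderator working through that queue and logging the outcome of each review.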
As the manager of a community where people meet in person, I understand where he is coming from. Acting like law enforcement puts one in a position to confront dangerous individuals without authority or weapons. It is literally life-endangering.
If we can get the voters to understand the things you mention, then maybe we’d have a chance.
The US Supreme Court disagrees. https://www.dentons.com/en/insights/articles/2024/july/3/-/m...
> Read the regs and you can absolutely see how complying with them to allow for banana peeling could become prohibitively costly. But the debate of whether they are pro-fruit or anti-fruit misses the point. If daycares end up serving bags of chips instead of bananas, that’s the impact they’ve had. Maybe you could blame all sorts of folks for misinterpreting the regs, or applying them too strictly, or maybe you couldn’t. It doesn’t matter. This happens all the time in government, where policy makers and policy enforcers insist that the negative effects of the words they write don’t matter because that’s not how they intended them.
> I’m sorry, but they do matter. In fact, the impact – separate from the intent – is all that really matters.
[0] https://www.eatingpolicy.com/p/stop-telling-constituents-the...
I think most of the examples fit this, but a few don't.
Industrialization was somewhat successful; I am eating off an Argentine plate, on an Argentine table, with Argentine utensils (ironically made of stainless steel rather than, as would be appropriate for Argentina, silver) while Argentine-made buses roar by outside. A century ago, when we were rich, all those would have been imported from Europe or the US, except the table. My neighborhood today is full of machine shops and heavy machinery repair shops to support the industrial park across the street. Even the TV showing football news purports to be Argentine, but actually it's almost certainly assembled in the Tierra del Fuego duty-free zone from a Korean or Chinese kit.
There is not much similarity.
Plenty of things in UK law attract "an unlimited fine", but even that doesn't lead to people actually being fined amounts greater than all the money that's ever existed.
Sad that LFGSS will probably become just another channel on one of the big platforms. RIP.
The problem is that the real problems are very hard, and their job is to simplify it to their constituents well enough to keep their jobs, which may or may not line up with doing the right thing.
This is a truly hard problem. CSAM is a real problem, and those who engage in its distribution are experts in subverting the system. So is freedom of expression. So is the onerous imposition of regulations.
And any such issue (whether it be transnational migration, or infrastructure, or EPA regulations in America, or whatever issue you want to bring up) is going to have some very complex tradeoffs and even if you have a set of Ph.Ds in the room with no political pressure, you are going to have uncomfortable tradeoffs.
What if the regulations are bad because the problem is so hard we can't make good ones, even with the best and brightest?
The EU and UK have been making these anti-tech, anti-freedom moves for years. Nothing could be better if you are from the US. Just hoover up talent from their continent.
CSAM is absolutely horrible... but CSAM laws don't stop CSAM (busts primarily happen through group defections).
Instead it's just a form of tarring, in this case unliked speech, by associating it with the most horrible thing anyone can think of.
But for an answer, I've done what folks do - spent decades carefully listening to legislators (and judges!) reveal their expertise in the fields I work and interact with.
Ron Wyden aside, authentic technical competency from legislators is so uncommon that it stands out. Glaringly. What technical acumen we do get pretty much always rhymes with lobbyists' talking points.
I expect my perspective to be boringly familiar here.
And AFAIK, we don't have any other Ron Wydens serving in Congress or coming onboard.
That is, someone with the basic technical understanding to foresee reasonable downstream consequences of the laws they vote on. Not someone with a minimal technical awareness that was crafted to be a lobbyist's tool.
I will be genuinely grateful if someone would correct me here.
the liability is very high, and whilst I would perceive the risk to be low if it were based on how we moderate... the real risk is what happens when one moderates another person.
as I outlined, whether it's attempts to revoke the domain names with ICANN, or fake DMCA reports to hosting companies, or stalkers, or pizzas being ordered to your door, or being signed up to porn sites, or being DOX'd, or being bombarded with emails... all of this stuff has happened, and happens.
but the new risk is that there is nothing about the Online Safety Act or Ofcom's communication that gives me confidence that this cannot be weaponised against myself, as the person who ultimately does the moderation and runs the site.
and that risk changes even more in the current culture war climate, given that I've come out, and that those attacks now take a personal aspect too.
the risk feels too high for me personally. it's... a lot.
Seems a bit megalomaniacal.
"I'm not interested in doing this any more. Therefore I'll shut it down for everyone"
That means you need to do CSAM scanning if you accept images, CSAM URL scanning if you accept links, and there’s a lot more than that to parse here.
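For a sense of what that scanning means mechanically: the usual shape is hash-matching every upload against a provider-supplied list of known-bad hashes. A rough stdlib-only sketch follows; the one-digest-per-line file format and load_hash_list are illustrative assumptions, and real deployments (IWF lists, PhotoDNA) use proprietary perceptual hashes rather than SHA-256, precisely so that re-encoded or resized copies still match:

    import hashlib

    def load_hash_list(path: str) -> set[str]:
        # Hypothetical provider format: one lowercase hex digest per line.
        with open(path) as f:
            return {line.strip().lower() for line in f if line.strip()}

    def is_known_bad(image_bytes: bytes, bad_hashes: set[str]) -> bool:
        # Exact-match check only; perceptual hashing is what makes this
        # robust in practice, but those algorithms aren't public.
        return hashlib.sha256(image_bytes).hexdigest() in bad_hashes

    def handle_upload(image_bytes: bytes, bad_hashes: set[str]) -> None:
        if is_known_bad(image_bytes, bad_hashes):
            # Block the upload, preserve evidence, report per guidance.
            raise PermissionError("upload blocked: matched hash list")
        # ...otherwise store the image as normal...

URL scanning is the same idea applied to links, checked against a list of known-bad URLs.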
Not sure how keeping kids off the internet keeps them indoors? Surely the opposite is true?
In practice, this means the local cycling forum that fostered trust, friendship, and even mental health support is at risk of vanishing, while the megacorps sail on without a scratch. Ironically, a measure allegedly designed to rein in “Big Tech” ends up discouraging small, independent communities and pushing users toward the same large platforms the legislation was supposedly targeting.
It’s discouraging to watch governments double down on complex, top-down solutions that ignore the cultural and social value of these smaller spaces. We need policy that recognises genuine community-led forums as a public good, encourages sustainable moderation practices, and holds bad actors accountable without strangling the grassroots projects that make the internet more human. Instead, this act risks hollowing out our online diversity, leaving behind a more homogenised, corporate-dominated landscape.
Exactly the complaint that everyone on here made about GDPR, saying the sky would fall in. If you read UK law like an American lawyer you will find it very scary.
But we don't have political prosecutors out to make a name for themselves, so it works OK for us.
I can't imagine one person running over 300 forums with 275,000 active users. That gives you an average of eight minutes a week to tend to the needs of each one.
I used to run a single forum with 50,000 active users, and even putting 20 hours a week into it, I still didn't give it everything it needed.
I know someone currently running a forum with about 20,000 active users and it's a full-time job for him.
I don't understand how it's possible for one person to run 300 forums well.
In the same way that you could be sued for anything, I'm sure you could also be dragged to court for things like that under this law... And probably under existing laws, too.
That doesn't mean you'll lose, though. It just means you're out some time and money and stress.
So what do you do to entertain children? Use what you have. Dunk them on the internet via YouTube first and then let them free range because you’re tired and can’t give a fuck anymore.
^1 https://abcnews.go.com/amp/GMA/Family/mom-arrested-after-son... ^2 https://www.aol.com/news/2015-12-03-woman-gets-arrested-for-...
This way, people have been given plenty of advance notice and can start their own forums somewhere else instead. I'm sure each of the 300 subforums already has some people running them, and they could do the above if they actually cared.
I find it hard to believe someone will take over 300 forums out of the goodness of their hearts and not start making it worse eventually, if not immediately.
https://www.enforcementtracker.com/
They're only cataloging the (2500+) publicly known ones, most of which have a link to a news article. As an example: some guy in Croatia emailed a couple websites he thought might be interested in his marketing services, and provided a working opt-out link in his cold emails. One of them reported the email to the Italian Data Protection Authority who then put him through an international investigation and fined him 5000 euro.
"Assuming here that the reasons expressed in the aforementioned document have been fully recalled, [individual] was charged with violating articles 5, par. 1, letter a), 6, par. 1, letter a) of the Regulation and art. 130 of the Code, since the sending of promotional communications via e-mail was found to have been carried out without the consent of the interested parties. Therefore, it is believed that - based on the set of elements indicated above - the administrative sanction of payment of a sum of €5,000.00 (five thousand) equal to 0.025% of the maximum statutory sanction of €20 million should be applied."
Seriously, the problem is not politicians being clueless about all the above, but having too much power which makes them think they need to solve everything.
I've heard it called "law of unintended consequences" and "cobra effect".
For instance, I know of a company that flouted GDPR and got multiple letters off the ICO trying to help them with compliance before finally, months later, they ended up in court and got a very small fine.
Edit: it is not cool to edit your post after I replied to make it look more reasonable
When intentional, this is Regulatory Capture. Per https://www.investopedia.com/terms/r/regulatory-capture.asp :
> Regulation inherently tends to raise the cost of entry into a regulated market because new entrants have to bear not just the costs of entering the market but also of complying with the regulations. Oftentimes regulations explicitly impose barriers to entry, such as licenses, permits, and certificates of need, without which one may not legally operate in a market or industry. Incumbent firms may even receive legacy consideration by regulators, meaning that only new entrants are subject to certain regulations.
A system with no regulation can be equally bad for consumers, though; there's a fine line between too little and too much regulation. The devil, as always, is in the details.
I would argue the honorable thing to do in the event excess monies remain would be to donate it to a charity. Using it for personal ends, whatever the details, is wrong because that's not what the donations were for.
>Every step that law takes down the enormous hierarchy of bureaucracy, the incentives for the public servants who operationalize it is to take a more literal, less flexible interpretation. By the time the daycare worker interacts with it, the effect of the law is often at odds with lawmakers’ intent.
Put another way, everyone in the chain is incentivized to be very risk averse when faced with a vague regulation, and this risk aversion can compound to reach absurd places.
I used to walk and ride my bike to school. I was in 4th grade. 9 years old.
You show me a 9-year-old walking alone to school today, and I'll show you a parent who's getting investigated for child neglect. It's maddening.
So that chain of consequences means today's kids are meant to be watched 24/7, and that usually means they're cooped up inside. They're still facing "Stranger Danger" (except through Snap or whatever games they're playing), and now they're also in poorer health.
No, a large company can't spin up a CIC to run a business website (because it is not community interest), but it doesn't need to: it is already a limited liability company. However, this is not a farce. The limited liability applies to the shareholders, not the company. The company gets fined, and has to pay the fine or risk having its assets seized... then the shareholders have lost their company. The liability of the shareholders is limited to the amount they invested, i.e. the shareholders can't lose any more than they put in. So if the fine is more than the company can afford, the shareholders lose their company, but don't have to pay the rest.
It is not a farce, because losing a profit-earning company is bad for a shareholder.
The Online Safety Bill gives Ofcom the power to levy regulatory fines, not criminal sanctions, so it is very different.
I have no knowledge of your site, but I'm still sad to see it having to shut down.
So? Tons of millennials barely understand technology too. I'd say a politician being one makes the odds they know tech marginally better, but I still interact with people of my generation that barely know what a filesystem is, let alone how to make one, or why it's important.
Filing accounts: £15. An online form will ask you for your balance sheet summary only, unless you are very large.
One-off registration: £65
Annual confirmation statement: £34
So depends on your perspective I suppose.
Honestly, the same could be said for this one; it reads less like an attempt at making the internet better and more like a technical-sounding PR stunt with sneaky power encroachment thrown in.
"We just need you to uses your government ID to sign in because of the children, we have a long track record of competent execution, maintenance and accountability, we are 100% not going to use this for other ...reasons"
It's the same governmental "Trust me bro, think of the children" they always throw out.
Outside of the intelligence agencies, the UK government is absolutely diabolical at anything technical: chronically over budget (because the original budget was decided by someone in an office with no actual experience managing an IT project) on projects outsourced to corrupt friends who siphon the money away. Not just IT, all projects.
They pay atrociously for the level of skill required for the positions advertised, so they get middle of the road staff, which isn't a problem normally, middle of the road is the backbone of IT projects.
The problem arises when you get actively bad project management, either incompetence or outright maliciousness; throw in some glacial, bureaucracy-laden processes that didn't work when they were drafted 40 years ago, let alone now,
and you get an entire industry of corruption and mediocrity.
/rant
anyway, i mean, sure you can take the lacklustre GDPR enforcement and use that to make decisions going forward, i wouldn't personally, because i don't think a single data point is a good basis for risk assessment.
DMCA, youtube copyright strikes, domain strikes, bank transaction complaints/chargebacks, all are mechanisms used to attack internet based businesses.
Do they serve a purpose? Debatable. Are they misused on a regular basis? Absolutely.
This isn't a "the sky is falling"; this is a "they have put into law the ability to drop the sky on me just because they (the government, or disgruntled internet denizens) feel like it".
It's up to you to decide how likely you think that is and plan accordingly.
There was a story very recently about the whole of itch.io going down because of some overzealous rent-seeking bullshit middleman (hired by rent-seeking bullshit artist FunkoPop)
HOWEVER: I'm not sure how you would get access to the CSAM hash database if you were starting a new online image hosting service.
The requirements to sign up for IWF (the defacto UK CSAM database) membership are:
- be legally registered organisations trading for more than 12 months;
- be publicly listed on their country registration database;
- have more than 2 full-time unrelated employees;
- and demonstrate they have appropriate data security systems and processes in place.
Cloudflare have a free[1] one but you have to be a Cloudflare customer.
Am I missing something, or does this make it very difficult to start up a public facing service from scratch?
The usual alternative to import substitution industrialization is export-focused industrialization. Argentina and Brazil exemplify the former; Japan, Taiwan, South Korea, Hong Kong, and now the PRC exemplify the latter. The line between them is whether the country's manufactures are widely exported.
From the linked document above: "You need to keep a record of each illegal content risk assessment you carry out", "service providers may fully review their risk assessment (for example, as a matter of course every year)"
And links to a guidance document on reviewing the risk assessment[1] which says: "As a minimum, we consider that service providers should undertake a compliance review once a year".
[1] https://www.ofcom.org.uk/siteassets/resources/documents/onli...
To begin with, the premise would have to be challenged. Many, many bad regulations are bad because of incompetence or corruption rather than because better regulations are impossible. But let's consider the case where there really are no good regulations.
This often happens in situations where e.g. bad actors have more resources, or are willing to spend more resources, to subvert a system than ordinary people. For example, suppose the proposal is to ban major companies from implementing end-to-end encryption so the police can spy on terrorists. Well, that's not going to work very well because the terrorists will just use a different system that provides E2EE anyway and what you're really doing is compromising the security of all the law-abiding people who are now more vulnerable to criminals and foreign espionage etc.
The answer in these cases, where there are only bad policy proposals, is to do nothing. Accept that you don't have a good solution and a bad solution makes things worse rather than better so the absence of any rule, imperfect as the outcome may be, is the best we know how to do.
The classical example of this is the First Amendment. People say bad stuff, we don't like it, they suck and should shut up. But there is nobody you can actually trust to be the decider of who gets to say what, so the answer is nobody decides for everybody and imposing government punishment for speech is forbidden.
Heh, welcome to the internet where the perpetrator and the beneficiary can be in different jurisdictions that make enforcement on the original bad actors impossible.
For example, have a friend in China upload something terrible to a UK site and then 'drop the dime' to a regulator in the UK. The UK state can easily come after you, and will find it nearly impossible to go after the international actor.
The UK has lots of regulatory bodies and they all work in broadly the same way. Provided you do the bare minimum to comply with the rules as defined in plain English by the regulator, you won't either be fined or personally liable. It's only companies that either repeatedly or maliciously fail to put basic measures in place that end up being prosecuted.
If someone starts maliciously uploading CSAM and reporting you, provided you can demonstrate you're taking whatever measures are recommended by Ofcom for the risk level of your business (e.g. deleting reported threads and reporting to police), you'll be absolutely fine. If anything, the regulators will likely prove to be quite toothless.
It seems far too common that regulations are putting the liability / responsibility for a problem onto some group of people who are not the cause of the problem, and further, have limited power to do anything about the problem.
As they say, this is why we can't have nice things.
I think there’s a pretty decent argument being made here that OP is reading too far into the new rules and letting the worst-case scenario get in the way of something they’re passionate about.
I wonder if they consulted with a lawyer before making this decision? That’s what I would be doing.
The UK tends to be a lot more (IMO) reasonable in its approach than some other European countries. Italy tends to be one of the strictest, and likes to hand out fines, even to private individuals for things like having a doorbell camera. The UK has only fined one person on that basis, and it was more of a harassment case rather than just simply that they had a camera.
ICO and Ofcom aren't generally in the business of dishing out fines unless it's quite obviously warranted.
You have to keep accounts if you are a business, even if not incorporated. A company has to keep accounts if it has any assets (e.g. a domain) or any financial transactions (e.g. paying for hosting).
You will also probably have to file a tax return. You have to keep a register of shareholders.
In fact, if you're definitely not making a profit, a standard ltd might be simpler (or maybe a company limited by guarantee) than a CIC, as all a CIC does is add restrictions and extra regulation https://assets.publishing.service.gov.uk/media/5a7b800640f0b...
Or go further.
Sometimes the answer is to remove regulations. Specifically, those laws that protect wrongdoers and facilitators of problems. Then you just let nature take its course.
For the most part though, this is considered inhumane and unacceptable.
Indeed, so this cost is not relevant to the decision to set up a CIC or not
Too bad this isn't the case here.
If you're talking about legalizing vigilantism, you would then have to argue that this is a better system and less prone to abuse than some variant of the existing law enforcement apparatus. Which, if you could do it, would imply that we actually should do that. But in general vigilantes have serious problems with accurately identifying targets and collateral damage.
> provided you can demonstrate you're taking whatever measures are recommended by Ofcom
That level of moderation might not be remotely feasible for a sole operator. And yes, there's a legitimate social question here: Should we as a society permit sites/forums that cannot be moderated to that extent? But the point I'm trying to make is not whether the answer to that question is yes or no, it's that the consequences of this Act are that no sensible individual person or small group will now undertake the risk of running such a site.
Even if US immigration were more liberal, moving is very costly (financially, emotionally, psychologically). Injustice anywhere is a threat to justice everywhere.
The risk and cost imbalance is much more extreme than that of a lawsuit.
I'm confident that, were I sufficiently motivated, I could upload a swathe of incriminating material to a website and cover my tracks within a couple of hours, doing damage that potentially costs the site operator £18M with no risk to myself -- not even my identity would be revealed. OTOH, starting a lawsuit at the very least requires me to pay for a lawyer's time and to appear in court -- and if the suit is thrown out, I'll need to pay their court costs, too.
Could a company create a non-CIC sub-company (with ~$0 in assets) to own the website, and thereby shield shareholders of the original company? (If so, I think farcicality is conserved.)
It gets messy because, by definition the moment you remove the laws, the parties cease to be criminals... hence my Bushism "wrongdoers" (can't quite bring myself to say evil-doers :)
One hopes that "criminals" without explicit legal protection become disinclined to act, rather than become victims themselves. Hence my allusion to "nature", as in "Natural Law".
"Might is right" is no good situation either. But I feel there's a time and place for tactical selective removal of protectionism (and I am thinking giant corporations here) to re-balance things.
As a tepid example (not really relevant to this thread), keep copyright laws in place but only allow individuals to enforce them.
This is what judges are for. A human judge can understand that the threshold is intended to apply across the parent company when there is shared ownership, and that bot accounts aren't real users. You only have to go back and fix it if they get it wrong.
> The only reason we “lost” is because our opponent convinced us that any regulation is bad. This law is awful and nakedly assaults indietech while protecting big tech, but we shouldn’t give up trying to untangle this mess and regulate it properly.
The people who passed this law didn't do so by arguing that any regulation is bad. The reason you lost is that your regulators are captured by the incumbents, and when that's the case any regulation is bad, because any regulation that passes under that circumstance will be the one that benefits the incumbents.
"children are getting raped and we aren't going to do anything about it because we want to protect indie websites" sounds a lot worse than "this is a significant step in combatting the spread of online child pornography", even if reality is actually far more complicated.
The next UK general election is ~5 years away so this makes no sense.
The more likely reason is that it's simply good policy. We have enough research now that shows that (a) social media use is harmful for children and (b) social media companies like Meta, TikTok etc have done a wilfully poor job at protecting them.
It is bizarre to me how many people here seem willing to defend them.
"The Spectator asked the Tory leader — elected to the head of the U.K. opposition party in November — if she ever took a lunch break."
The Spectator are using their press privileges to ask party leaders about their personal lifestyle rather than asking about anything relevant to policy - and although the Spectator might be forgiven for that, it is indefensible for 'serious' newspapers such as the Guardian and the Telegraph to be giving this story front-page status.
There are lots of politicians for us to be embarrassed about, but perhaps even more journalists.
tl;dr: This is a myth.
There is no incentive to the consumer to purchase a vehicle with worse fuel economy.
There USED to be an incentive, 30-40 years ago.
It is not 1985 anymore.
The gas guzzler tax covers a range of fuel economies from 12.5 to 22.5 mpg.
It is practically impossible to design a car that gets less than 22.5 mpg.
The Dodge Challenger SRT Demon 170, with a 6.2 L 8-cylinder engine making ONE THOUSAND AND TWENTY FIVE horsepower, is officially rated for 13 mpg, but that's bullshit; it's Dodge juicing the numbers just so buyers can say "I paid fifty-four hundred bucks gas guzzler tax BAYBEE", and in real-world usage the Demon 170 is getting 25 mpg. Other examples of cars that cannot achieve 22.5 mpg are the BMW M2/M3/M4/M8 and the Cadillac CT5, high-performance sports sedans for which the gas guzzler tax is a <5% price increase. ($5400 is 5% of the Demon 170 price, but 2-3% of what dealers are actually charging for it.)
The three most popular vehicles by sales volume in the United States are: 1. The Ford F-150, 2. The Chevy Silverado, and 3. The Dodge Ram 1500.
The most popular engine configuration for these vehicles is the ~3L V6. Not a V8. A V6.
Less than 1/4th of all pickup trucks are sold equipped with a V8.
According to fueleconomy.gov every single Ford, Chevrolet, and Ram full-size pickup with a V6 would pay no gas guzzler tax.
Most V8s would be close, perhaps an ECU flash away from paying no gas guzzler tax. The only pickups that would qualify for a gas guzzler tax are the high-performance models -- single-digit percentages of the overall sales volume, and at those prices the gas guzzler tax would not even factor into a buyer's decision.
People buy trucks, SUVs, and compact SUVs because they want them and can afford them.
Not because auto manufacturers phased out cars due to fuel economy standards. Not because consumers were "tricked" or "coerced". And certainly not because "the gubmint" messed things up.
They buy them because they WANT them.
The Toyota RAV4 is the 4th most popular car in the US. The Corolla is the 13th most popular. They are built on the same platform, and dimensionally the Corolla is actually very slightly larger except for height. They both come with the same general ballpark choices in engines. The gas guzzler tax only applies to the Corolla, but that doesn't matter because both would be exempt. People don't freely choose the RAV4 over the Corolla because of fuel economy; they buy it because the Corolla has 13 cubic feet of cargo capacity and the RAV4 has 70.
And before anyone says that the gas guzzler tax made passenger cars more expensive: passenger cars can be purchased for the same inflation-adjusted price they could be 50 years ago. But people don't want a Mitsubishi Mirage, which is the same price as a vintage VW Beetle (the perennial cheapest new car of the 1960s) and better in every quantifiable metric; they want an SUV.
What may be true is that there is a national policy to keep fuel prices as low as possible, for a myriad of reasons, with one side effect of that policy being that it has enabled people to buy larger less fuel-efficient cars.
I do not believe it is auto manufacturers who are pushing for this policy. I believe it is the freight and logistic market. The auto market is valued at $4 billion, the freight and logistics market is $1,300 billion. GM and Ford are insignificant specks compared to the diesel and gasoline consumers of the freight and logistics firms (who have several powerful lobbies).
https://www.thetruthaboutcars.com/2017/08/v8-market-share-ju...
https://www.irs.gov/pub/irs-pdf/f6197.pdf (gas guzzler worksheet)
You don't think Meta, TikTok etc are the cause of the problem ?
I appreciate that LFGSS is somewhat collateral damage, but the fact is that if you're going to run a forum, you do have some obligation to moderate it.
Which really should be happening anyway.
I would strongly prefer that forums I visit not expose me to child pornography.
I've just finished recording a Cybershow episode with two experts in compliance (ISO42001 coming on the AI regulatory side - to be broadcast in January).
The conversation turned to what carrots can be used instead of sticks? Problem being that large corps simply incorporate huge fines as the cost of doing business (that probably is relevant to this thread)
So, to innovate legally, give assistance instead (legal aid, an expert advisor) to smaller firms struggling with compliance. After all, governments want companies to comply. It's not a punitive game.
Big companies pay their own way.
This is the problem with many European (and I guess also UK) laws.
GDPR is one notable example. Very few people actually comply with it properly. Hidden "disagree" options in cookie pop-ups and unauthorized data transfers to the US are almost everywhere, not to mention the "see personalized ads or pay" business model.
Unlike with most American laws, GDPR investigations happen through a regulator, not a privately-initiated discovery process where the suing party has an incentive to dig up as much dirt as possible, so in effect, you only get punished if you either really go overboard or are a company that the EU dislikes (which is honestly mostly just Meta at this point).
If it's the company, the shareholders etc are not liable.
I don’t like this new legislation one bit, but…
It’s not obvious to me from the post, or from what I know of the legislation, that OP is at meaningfully greater risk of being sued by someone malicious, vindictive, or just on a crusade about something than they were prior to the legislation. (Unless, of course, their forums have a consistent problem with significant amounts of harmful content like CSAM, hate speech, etc.)
I am not saying that the risk isn’t there or that this isn’t the prudent course of action, I just don’t feel convinced of it at this point.
Politicians can be very very good at those things, when they have a reason to be.
I am of the opinion that the vast majority of journalists are simply stenographers. I wouldn't expect them to do their job. Unfortunately, you have to piece together the truth for yourself.
https://developers.cloudflare.com/cache/reference/csam-scann...
The thing though is how to finance it and how to provide stewardship for the sites going forward.
Running sites like the one this post is about is not profitable. Nor is it very resource-intensive.
This is a way to regulate political speech and create a weapon to silence free speech online. It's what opponents to these measures have been saying forever. Why do we have to pretend those enacting them didn't listen, are naive, or are innocent well intentioned actors? They know what this is and what it does. The purpose of a system is what it does.
Related to this, and one version of a label for this type of silencing particularly as potentially weaponized by arbitrary people not just politicians is Heckler's veto. Just stir up a storm and cite this convenient regulation to shut down a site you don't like. It's useful to those enacting these laws that they don't even themselves have to point the finger, disgruntled users or whoever will do it for them.
You can’t really put a corporation in jail, but you could cut it off from the world in the same way that a person in jail is cut off. Suspend the business for the duration of the sentence. Steal a few thousand bucks? Get shut down for six months, or whatever that sentence would be.
I work for the latter kind of merchant, and "complexity" is not a word I would associate with VATMOSS. Here is what we've had to do to deal with VATMOSS:
• Register with the tax authority in a country that was part of VATMOSS. We registered with Ireland. We did this online via the Irish tax authority's web site. It took something like 15-30 minutes.
• Collect VAT. VAT rates are country wide and don't change very often so it is easy to simply have a rate table in our database. No need to integrate any third party paid tax processing API into our checkout process.
Once a month I run a script that uses a free API from apilayer.com to get the rates for each country and tell me if any do not match the rates in our database, but that's just because I'm lazy. :-) It's not much work to just manually search to find news of upcoming VAT rate changes.
• At the start of each quarter we have to report how much we sold and how much VAT we collected for each country. I run a script I wrote that generates a CSV file with that data from our database. We upload it to the Irish tax authority's web site and send them the total VAT. They deal with distributing the data and money to the other countries.
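To make those two scripts concrete, here's a minimal sketch of both the monthly rate sanity-check and the quarterly per-country report. The orders and vat_rates schema and the CSV columns are illustrative assumptions; the actual upload format is whatever your tax authority's portal defines:

    import csv
    import sqlite3

    def check_rates(db_path: str, live_rates: dict[str, float]) -> list[str]:
        # Monthly sanity check: compare the stored vat_rates table against
        # rates fetched from whichever rates API you use, returning the
        # country codes whose stored rate no longer matches.
        with sqlite3.connect(db_path) as conn:
            stored = dict(conn.execute("SELECT country_code, rate FROM vat_rates"))
        return [cc for cc, rate in live_rates.items() if stored.get(cc) != rate]

    def quarterly_moss_report(db_path: str, start: str, end: str, out_path: str) -> None:
        # Quarterly filing: total sales and VAT collected per country,
        # written as a CSV for upload to the tax authority's portal.
        with sqlite3.connect(db_path) as conn:
            rows = conn.execute(
                "SELECT country_code, SUM(net_amount), SUM(vat_amount) "
                "FROM orders WHERE paid_at >= ? AND paid_at < ? "
                "GROUP BY country_code ORDER BY country_code",
                (start, end),
            ).fetchall()
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["country", "net_sales", "vat_collected"])
            writer.writerows(rows)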
It was a bit more complicated before Brexit. Back then we made the mistake of picking the UK as our country to register with. Instead of building a web-based way to do things as Ireland did, the UK made OpenOffice versions of its paper forms available for download. You could download those, edit them to contain your information, and then upload them.
Is there an argument why we would want it any other way?
so thanks for all that buro9! <3
It may turn out that it is too much work to comply and so you might still need to shut down, but with the LLC you've got a lot more leeway to try without personal risk.
Authoritarians don't want people to be able to talk (and organize) in private. What better way to discourage them than some "think of the children" nonsense? That's how they attacked (repeatedly) encryption.
Google, Facebook, and Twitter all could have lobbied against this stuff and shut it down, hard. They didn't.
That speaks volumes, and my theory is that they feel shutting down these forums will push people onto their centralized platforms, increasing ad revenues - and the government is happy because it's much easier to find out all the things someone is discussing online.
Finding someone trustworthy is hard, but I know buro9 knows tons of people.
I completely understand a desire to shut things down cleanly, rather than risk something you watched over for years become something terrible.
Because no one would fork over stupid amounts of money for a f*k off big truck if they didn't have a real need. Right?
I have imagined a sci-fi skit where James works at CorpCo, a company that was caught doing something illegal and sentenced to prison. As punishment, James goes to work by reporting in at a prison at 8 am. He sits in his cell until his 'work day' is over and is released at 5 pm to go home. It's boring, but hey, it pays well.
CSAM is NOT a hard problem. You solve it with police work. That's how it always gets solved.
You don't solve CSAM with scanners. You don't solve CSAM with legislation. You don't solve CSAM by banning encryption.
You solve CSAM by giving money to law enforcement to go after CSAM.
But, see, the entities pushing these laws don't actually care about CSAM.
Everything else you listed are right wing conspiracy theories.
I used to run a moderately sized forum for a few years. Death threats, legal threats, faeces mailed to my house; someone found out where I worked and started making harassing calls and turning up at the office.
I don't run a forum anymore. For what I feel are obvious reasons.
> I do so philanthropically without any profit motive (typically losing money)
the cost (and hassle) of consulting with a lawyer is potentially a lot in relative terms.
That said, I thought the rule in the UK was generally that the loser pays the winner's costs, so I'd think that limits the cost of defending truly frivolous suits. The downside risks are possibly still high, though.
Restaurateurs can't say they don't have time to comply with the law. Construction companies can't. Doctors can't. Why should online service providers be able to?
As an example of impacts not necessarily correlated with size, a comms platform for, say, the banking or finance communities, or defence and military systems, would likely have stronger concerns than one discussing the finer points of knitting and tea.
It's honestly super weird. Now of course they are just proposing to tax the tech companies if they don't pay money to our local media orgs for something the tech companies neither want nor care about.
It seems that some people are convinced that the benefits of having strangers interact with each other are not worth the costs. I certainly disagree.
That wasn't the one I was thinking of, to be honest.
I'd have thought you would be mentioning the latest ball of WTF: "Online Safety Amendment (Social Media Minimum Age) Bill 2024".
According to the bill, HN needs to identify all Australian users to prevent under-16s from using it.
https://www.aph.gov.au/Parliamentary_Business/Bills_Legislat...
It almost always doesn't, because the big guys have lobbyists and the small guys don't.
The big guys would rather not have to comply with these rules, but typically their take is, well, if we're going to have to anyway, let's at least make it an opportunity to drive out some of the scrappy competition and claim the whole pie for ourselves.
I don't know where you're seeing that as the site does not have such things. The only cookies present are essential and so nothing further was needed.
The site does not track you, sell your data, or otherwise treat you as a source of monetisation. Without such things, conforming with cookie laws is trivial... you are conformant by simply setting nothing that isn't essential to providing the service.
For most of the sites only a single cookie is set for the session, and for the few via cloudflare those cookies get set too.
But yes, I'm confused as to whether it applies to online gaming, or sites such as wikipedia as well
The other link you have is neighbors that obviously dislike each other, and they told the cops the kid was in danger.
Although I do think they overlook that their legislation is restricted to their domestic market, so any potential positive effect is more or less immediately negated. That is especially true for English-speaking countries.
Imagine a society so stable it doesn't need new laws or rules. All the elected representatives would just sit around all day and twiddle their thumbs. A bad look in their eyes.
This is how it should be of course.
I don't believe this kind of regulation will do anything but put the real criminals more underground while killing all these helpful community initiatives. It's just window dressing for electoral purposes.
The joke used to be that Boomers don't understand the internet, even though they invented it.
Based on that experience I guess it should be no surprise that now Millennials don't understand the web even though we were born on web 1.0, grew up on web 2.0, and created web 3.0.
That is like saying "when we write software there are bugs, so rather than fix them, we should never write software again".
Your second example is ascribing to regulation something that goes way beyond regulation.
Lfgss is heavily moderated, just maybe not in a way you could prove to a regulator without an expensive legal team...
(Unless of course someone is resurrecting the site)
Government regulation - "good" centralisation?
However it probably wouldn't work for a profit-seeking company in this case. Big Corp owns Web Corp, and Web Corp owns the site. Which company is operating the site? Say it is Web Corp. So when Web Corp gets fined, you lose your site. This is a problem for a profit-seeking company, because it lost its value. If Big Corp owned the site, and Web Corp operated the site, you may be OK. Your accountancy costs just went through the roof though. Not sure about this law, but some compliance laws treat the group as one whole entity to stop this sort of thing.
Since this applies to laws in general, are you arguing that corporations are a farce? I may be inclined to agree.
Edit: answering your other point, the company could not have no assets; if it owns the site then it has the site as an asset. If it runs the site then it will have cash, etc.
Winning against the government is difficult - an asymmetric unfair fight. You can't afford to pay the costs to try: financial, risk, opportunity cost, and most importantly YOUR time.
Things change - e.g. 50 years ago there were no online chats, no drones, very little terrorism; travel was more costly and slower, medical drugs were less effective, and life spans were shorter.
The current actual leader of the UK decided to politicise this, in a real moist bread response:
> Prime Minister Keir Starmer — who leads a country grappling with a stagnant economy, straining public services and multiple crises abroad — in turn accused Badenoch of talking down a “Great British institution.”
It's by design. Politicians have fallen for big tech lobbyists once again.
Also who says that the hashes provided by your CSAM database of choice are actually flagging illegal data and not also data that whoever runs the database wants to track down? You have no idea. You are just complicit in the surveillance state, really.
It might even be possible now to combine nuanced perspectives/responses to proposed policies from millions of people!? I don't think it's that unreasonable to suggest nowadays, and there's precedent for it too, even though stuff like how-Wikipedia-works isn't really ideal (despite being somewhat an example of the main idea!).
This way, the public servants (including politicians) can mainly just take care of making sure the ideas that the people vote-for get implemented! (like all the lower tiers of government currently do - just extend it to the top level too!) I don't think we should give individuals that power any more!
That’s generally true… but only happens after those costs have been incurred and probably paid.
There’s no guarantee the party suing will be able to cover their own costs and the defendant’s costs. That leaves OP on the hook for defence costs with the hope that they might get them back after a successful and likely expensive defence.
In that situation, I can understand why OP wouldn’t want to take the risk.
As sad as it may be, their imagination is correct. The small spaces, summed up all together, are lost in the rounding errors.
The cases where they assume you should say "medium risk" even without evidence of it happening are those where you've got several major risk factors:
> (a) child users; (b) social media services; (c) messaging services; (d) discussion forums and chat rooms; (e) user groups; (f) direct messaging; (g) encrypted messaging.
Also, before someone comes along with a specific subset and says those several things are benign:
> This is intended as an overall guide, but rather than focusing purely on the number of risk factors, you should consider the combined effect of the risk factors to make an overall judgement about the level of risk on your service
And frankly if you have image sharing, groups, direct messaging, encrypted messaging, child users, a decent volume and no automated processes for checking content you probably do have CSAM and grooming on your service or there clearly is a risk of it happening.
The example of "medium risk" for CSAM urls is a site with 8M users that has actively had CSAM shared on it before multiple times, been told this by multiple international organisations and has no checking on the content. It's a medium risk of it happening again.
That would be insane, and it's not true. You have to consider the risks and impacts of your service, and scale is a key part of that.
I think it's really important around this to actually talk about what's in the requirements, and if you think something that has gone through this much stuff is truly insane (rather than just a set of tradeoffs you're on the other side of) then it's worth asking if you have understood it. Maybe you have and lots of other people are extremely stupid, or maybe your understanding of it is off - if it's important to you in any way it probably makes sense to check right?
That's simply not true.
I'm sorry, what precisely do you mean by this? The rules don't punish you merely because illegal content ends up on your site, so a user can't upload something, report it themselves, and thereby get you in trouble.
"[A] significant number"? How Britishly vague.
There was one person involved in the doompf of that CEO guy....
I would say that a significant-sized football crowd would be over 75,000.
That's a lot of numbers that 'significant' has to lean on.
Also, it's not just small businesses, it's not-for-profits too.
Sketchy large employers like G4S responded by setting up tens of thousands of "Mini umbrella companies" [1] with directors in the Philippines, each company employing only a handful of people - allowing G4S to benefit from the £4,000 discount tens of thousands of times.
Sadly, exempting small operations from regulation isn't a simple matter.
You can’t beat a good fry up!
He was recently interviewed about that book on the New Books Network:
<https://newbooksnetwork.com/michael-g-vann-the-great-hanoi-r...>
Audio: <https://traffic.megaphone.fm/LIT1560680456.mp3> (mp3)
(Episode begins at 1:30.)
Among the interesting revelations: the rat problem was concentrated in the French Quarter of Hanoi, as that's where the sewerage system was developed. What drained away filth also provided an express subway for rats. Which had been brought to Vietnam by steamship-powered trade, for what it's worth.
(That's only a few minutes into the interview. The whole episode is great listening, and includes a few details on the Freakonomics experience.)
There's only 13 provisions that apply to sites with less than 7 million users (10% of the UK population).
7 of those are basically having an inbox where people can make a complaint and there is a process to deal with complaints.
1 is having a 'report' button for users.
2 say you will provide a 'terms of service'
1 says you will remove accounts if you think they're run by terrorists.
The OP is blowing this out of proportion.
Or another thought: distribute it only through a VPN. OpenVPN can be installed on mobiles these days (I have it installed on my Android). Make key creation part of the registration process.
Funnily enough we wonder this about the USA and its drain-circling obsession with giving power -- and now grotesque, performative preemptive obeisance -- to Donald Trump.
As written, it should. Which is ridiculous, and it's a ridiculous law in the first place. I'm loath to discuss politics, but by god both Labor and the LNP are woeful when it comes to tech policy.
There was no regulation of online forums for forty years, and Earth did not explode, nor did humankind end.
Nonsense.
In particular, Merton notes:
Discovery of latent functions represents significant increments in sociological knowledge .... It is precisely the latent functions of a practice or belief which are not common knowledge, for these are unintended and generally unrecognized social and psychological consequences.
Robert K. Merton, "Manifest and Latent Functions", in Wesley Longhofer, Daniel Winchester (eds) Social Theory Re-Wired, Routledge (2016).
<https://www.worldcat.org/title/social-theory-re-wired-new-co...>
More on Merton:
<https://en.wikipedia.org/wiki/Robert_K._Merton#Unanticipated...>
Unintended consequences:
<https://en.wikipedia.org/wiki/Unintended_consequences#Robert...>
Manifest and latent functions:
<https://en.wikipedia.org/wiki/Manifest_and_latent_functions_...>
Also if it is well monitored and seems to have a positive community, I don't see the major risk that warrants shutting down. It seems more like shutting down out of frustration with a law that, while silly on its face, doesn't really impact this provider.
Also depending on the terms agreed to when people signed on and started posting, it might be legally or morally difficult because transferring the data to the control of another party could be against the letter or the spirit of the terms users agreed to. Probably not, but I wouldn't want to wave such potential concerns off as “nah, it'll be fine” and hoping for the best.
Even leaving a read-only version up, so a new home could develop with the old content remaining for reference, isn't risk free: the virtual-swatting risk that people are concerned about with this regulation would be an issue for archived content as much as live stuff.
At least people have a full three months' notice. Maybe in that time someone can come up with a transfer and continuation plan the current maintainer is happy with; if not, the users at least have some time to try to move any social connectivity based around the site elsewhere.
>So when Web Corp gets fined, you lose your site.
My mind immediately goes in the direction of "Maybe you lost that server, but just buy a new one and change some DNS entries", which isn't free but a lot less than £18M. But maybe there are protections against this kind of scheming? I'd like to think there were.
>If Big Corp owned the site, and Web corp operated the site, you may be OK.
I don't follow -- if Big Corp owns the site, won't it lose everything?
>Since this applies to laws in general, are you arguing that corporations are a farce? I may be inclined to agree.
I think I am actually. They do seem like a way to get something for (almost) nothing (and they seem like they were probably engineered to be this way deliberately).
This says it so well, acknowledging the work of a misguided bureaucracy.
Looks like it now requires an online community to have its own bureaucracy in place, to preemptively stand by ready to effectively interact in new ways with a powerful, growing, long-established authoritarian government bureaucracy of overwhelming size and increasing overreach.
Measures like this are promulgated in such a way that only large highly prosperous outfits beyond a certain size can justify maintaining readiness for their own bureaucracies to spring into action on a full-time basis with as much staff as necessary to compare to the scale of the government bureaucracy concerned, and as concerns may arise that mattered naught before. Especially when there are new open-ended provisions for unpredictable show-stoppers, now fiercely codified to the distinct disadvantage of so many non-bureaucrats just because they are online.
If you think you are going to be able to rise to the occasion and dutifully establish your own embryonic bureaucracy for the first time to cope with this type of unstable landscape, you are mistaken.
It was already bad enough before without a newly imposed, bigger moving target than everything else combined :\
Nope, these types of regulations only allow for firms that already have a prominent, well-funded bureaucracy of their own, on a full-time basis, long established after growing in response to less onerous mandates of the past. Anyone else who cannot just take this in stride without batting an eye need not apply.
What do you mean by bureaucracy in this case? Doing the risk assessment?
Complex corporate structures enable plausible deniability. The CEO of G4S probably didn't know what was happening, but also probably didn't want to know whilst enjoying the low fees charged by the recruiters.
You need to do a risk assessment and keep a copy. Depending on how risky things are, you need to put more mitigations in place.
If you have a neighbourhood events thing that people can post to, and you haven't had complaints and generally keep an eye out for misuse, that's it.
If you run a large scale chat room for kids with suicidal thoughts where unvetted adults can talk to them in DMs you're going to have a higher set of mitigations and things in place.
Scale is important, but it's not the only determining factor. An example of low risk for suicide harm is
> A large vertical search service specialised in travel searches, including for flights and hotels. It has around 10 million monthly UK users. It uses recommender systems, including for suggesting destinations. It has a basic user reporting system. There has never been any evidence or suggestion of illegal suicide content appearing in search results, and the provider can see no way in which this could ever happen. Even though it is a large service, the provider concludes it has negligible or no risk for the encouraging or assisting suicide offence
An example for high risk of grooming is
> A social media site has over 10 million monthly UK users. It allows direct messaging and has network expansion prompts. The terms of service say the service is only for people aged 16 and over. As well as a content reporting system, the service allows users to report and block other users. While in theory only those aged 16 and over are allowed to use the service, it does not use highly effective age assurance and it is known to be used by younger children. While the service has received few reports from users of grooming, external expert organisations have highlighted that it is known to be used for grooming. It has been named in various police cases and in a prominent newspaper investigation about grooming. The provider concludes the service is high risk for grooming
In this case, it's "I'm shutting down my hobby that I've had for years because I have to add a report button".
I know my wife likes storing things in the boot of our car and I'm not even American. It means they're always conveniently there - chairs for sitting in the park, shopping bags, groceries that she's going to take to a party or bought for someone else, kids sports equipment.
So basically, is this act a ban on individual communication through under-moderated platforms?
Then you will see that a forum that allows user generated content, and isn't proactively moderated (approval prior to publishing, which would never work for even a small moderately busy forum of 50 people chatting)... will fall under "All Services" and "Multi-Risk Services".
This means I would be required to do all the following:
1. Individual accountable for illegal content safety duties and reporting and complaints duties
2. Written statements of responsibilities
3. Internal monitoring and assurance
4. Tracking evidence of new and increasing illegal harm
5. Code of conduct regarding protection of users from illegal harm
6. Compliance training
7. Having a content moderation function to review and assess suspected illegal content
8. Having a content moderation function that allows for the swift take down of illegal content
9. Setting internal content policies
10. Provision of materials to volunteers
11. (Probably this because of file attachments) Using hash matching to detect and remove CSAM
12. (Probably this, but could implement Google Safe Browsing) Detecting and removing content matching listed CSAM URLs
...
the list goes on.
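For a sense of what item 11 would mean mechanically, here is a minimal sketch of upload-time hash matching, with all names invented. Real deployments use perceptual hash databases (PhotoDNA and the like) accessed through vetted providers, not a plain hash set, so treat this as the shape of the work rather than an implementation:

    import hashlib

    # Hypothetical deny list of known-bad SHA-256 digests from a provider;
    # a cryptographic hash only catches byte-identical files, which is why
    # real systems use perceptual hashes instead.
    KNOWN_BAD_HASHES: set[str] = set()

    def is_flagged(file_bytes: bytes) -> bool:
        return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES

    def handle_upload(file_bytes: bytes) -> str:
        # Quarantine rather than serve; the real duty also involves
        # reporting and swift takedown (items 7, 8, and 11 above).
        return "quarantined" if is_flagged(file_bytes) else "accepted"

Even this toy version shows where the burden lands on a sole volunteer: someone has to source the hash list, keep it updated, and handle the inevitable false positives.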
It is technical work, extra time, the inability to not constantly be on-call when I'm on vacation, the need for extra volunteers, training materials for volunteers, appeals processes for moderation (in addition to the flak one already receives for moderating), somehow removing accounts of proscribed organisations (who has this list, and how would I know if an account is affiliated?), etc, etc.
Bear in mind I am a sole volunteer, and that I have a challenging and very enjoyable day job that is actually my primary focus.
Running the forums is an extra-curricular volunteer thing, it's a thing that I do for the good it does... I don't do it for the "fun" of learning how to become a compliance officer, and to spend my evenings implementing what I know will be technically flawed efforts to scan for CSAM, and then involve time correcting those mistakes.
I really do not think I am throwing the baby out with the bathwater, but I did stay awake last night dwelling on that very question, as the decision wasn't easily taken and I'm not at ease with it, it was a hard choice, but I believe it's the right one for what I can give to it... I've given over 28 years, there's a time to say that it's enough, the chilling effect of this legislation has changed the nature of what I was working on, and I don't accept these new conditions.
The vast majority of the risk can be realised by a single disgruntled user on a VPN from who knows where posting a lot of abuse material when I happen to not be paying attention (travelling for work and focusing on IRL things)... and then the consequences and liability comes. This isn't risk I'm in control of, that can be easily mitigated, the effort required is high, and everyone here knows you cannot solve social issues with technical solutions.
While almost everybody, including me, shares this preference, maybe it should be something that browsers could do? After all, why put the burden on countless various websites if you can implement it in a single piece of software?
This could also make it easier to go after people who are sources of such material because it wouldn't immediately disappear from the network often without a trace.
A forum that isn't proactively monitored (approval before publishing) is in the "Multi-Risk service" category (see page 77 of that link), and the "kinds of illegal harm" include things as obvious as "users encountering CSAM" and as nebulous as "users encountering Hate".
Does no-one recall Slashdot and the https://en.wikipedia.org/wiki/Gay_Nigger_Association_of_Amer... trolls? Such activity would make the site owner liable under this law.
You might glibly reply that we should moderate, take it down, etc... but we, is me... a single individual who likes to go hiking off-grid for a vacation and to look at stars at night. There are enough times when I could not respond in the timely way to moderate things.
This is what I mean by the Act providing a weapon to disgruntled users, trolls, those who have been moderated... a service providing user generated content in a user to user environment can trivially be weaponised, and it will be a very short amount of time before it happens.
Forum invasions by 4chan and others make this extremely obvious.
> What may be true is that there is a national policy to keep fuel prices as low as possible, for a myriad of reasons, with one side effect of that policy being that it has enabled people to buy larger less fuel-efficient cars.
Yes. Americans have always had cheap fuel and it's shaped the entire society around it.
Bigger vehicles are popular in the US because people want to be in a bigger vehicle and sit higher up than others, AND can afford to do so (ignoring their long term finances). I.e. the politically popular policy of low gas prices.
That's the long and short of it. Buyers rewarded the sellers that sold big and tall vehicles, so obviously sellers are going to sell big and tall vehicles.
There was no situation where buying a big and tall vehicle was cheaper than a smaller, more fuel efficient vehicle, so conclusively, people chose to spend more to get what they wanted. Of course, once someone else gets a bigger vehicle, then you are less safe, unless you get a bigger vehicle, and so on and so forth.
That section details how to calculate the figures, because they're relevant for sections like CSAM scanning
> Services that are at high risk of image-based CSAM and (a) have more than 700,000 monthly active United Kingdom users or (b) are file-storage and file-sharing services.
And since social media works the way it does, you can also expect a ton of unaffected people to also pick up their pitchforks and join in without having any real clue what's going on.
And that's assuming there isn't some law somewhere that really does put you on the hook for it.
300 forums is a lot of power.
It makes a lot more sense to expect the sub-admins of those forums to start their own communities elsewhere than to just hand a single person power over all of them.
Misinformation and disinformation were terms created by censors as an excuse to censor ideas they didn't like, mostly criticism. What we call misinformation and disinformation has been a property of communication since grunting. People are wrong about stuff, even people who we currently think are right. To censor is going back to just knowing the wrong thing for years because someone with censor powers thought they were right.
If a company suddenly starts doing something that costs society more in externalities, does it suddenly start paying more taxes to deal with the enforcement required to get them to stop?
After all, the whole point of regulation is to get the regulated to stop hurting society and costing it money.
But: https://www.inf.ed.ac.uk/teaching/courses/seoc2/1996_1997/ad...
Any bureaucracy evolves, ultimately, to serve and protect itself. So the populist boss snips at the easy, but actually useful parts: Social safety nets, environmental regulations, etc. Whereas the core bureaucracy, the one that should really be snipped, has gotten so good at protecting itself that it remains untouchable. So in the end the percentage of useless administratium is actually up, and the government, as a whole, still bloated but even less functional. Just another "unintended consequences" example.
We'll see if Argentina can do better than this.
I would say it's more that the prohibitive cost of compliance comes from the non-productive (or even anti-productive) nature of the activities needed to do so, on an ongoing basis.
An initial risk assessment is a lot more of a fixed target, with a goal that is in sight if not well within reach. Once it's behind you, it's possible to get back to putting more effort into productive actions. Assessments are often sprinted through so things can get "back to normal" ASAP, which can be worth it sometimes. Other times it's a world of hurt if you don't pay attention to whether it's a moving goalpost and the "sprint" might need to last forever.
Which can also be coped with successfully, like dealing with large bureaucratic institutions as customers, since that's another time when you've got to have your own little bureaucracy. To be fully dedicated to the interaction and well-staffed enough for continuous 24/7 problem-solving operation at a moment's notice. If it's just a skeleton crew at a minimum they will have a stunted ability for teamwork since the most effective deployment can be more like a relay race, where each member must pull the complete weight, go the distance, not drop the baton, and pass it with finesse.
While outrunning a pursuing horde and their support vehicles ;)
But how about Trump winning the popular vote? Millions of people are sure that is about as bad as the Earth exploding or humankind ending.
Oh I do... the link... HN must have a word-based deny list.
I vouched for it, so it should be visible now.
It's literally management's job to be aware.
Imagine if a crossing guard waves cars through an intersection as children crossed and goes "Well, you know, I wasn't driving the car".
We are not talking about a business here. The whole problem is that these are things that people are doing as essentially voluntary work.
What you're saying would be true in a different context, but this is not business. I do not know whether you find it hard to grasp that some people will put a lot of effort into something for motives other than profit.
Although to be fair to your hypothetical millions, a guy known for repeatedly going bankrupt was elected to lead the country. It seems a bit fair to say his track record implies he'd bankrupt the country.
Look at the prices of new trucks, then at the median salary. People should not have car payments that rival a small mortgage, yet they do.
That's not true, you'd need to conclude you're at a medium or high risk of things happening and consider the impact on people if they do.
> and as nebulous as "users encountering Hate".
But users posting public messages can easily fit into the low risk category for this, it's even one of their examples of low risk.
We were interviewed, they found there were no issues, and the case was dropped. Very stressful experience, though.
And for what? I grew up on a farm in Nebraska. We had endless fields and roads around us to explore. The only off-limits area was an abandoned hog confinement, which to be fair, absolutely could have killed us (by falling into the open trench of porcine waste) – naturally, we still went there.
I know that reeks of survivor bias, but given the length of time Homo sapiens have survived, I think it’s a reasonably safe assumption that kids, when left to their own devices, are unlikely to be seriously injured or killed. Though, that’s probably only true if they’ve been exposed to it gradually over time, and are aware of the risks.
Many things in a society exist on thin margins, not only monetary, but also of attention, free time, care and interest, etc. You put a burden, such as a regulation, saying that people have to either comply or cease the activity, and people just cease it, like in the post. What used to be a piece of flourishing (or festering, depending on your POV) complexity gets reduced to a plain, compliant nothing.
Maybe that was the plan all along.
However this doesn't mean the government should not act. An interview over a false complaint is a small cost to pay compared to doing nothing when there is a real problem. Most of the time those employed to do the investigation know to look for signs of false reports and neighbour conflicts in order to filter them out, but at the same time they need to make sure not to misclassify a real complaint.
Not true: Section 179 [0]. Luxury auto manufacturers are well-aware of this [1] and advertise it as a benefit. YouTube et al. are also littered with videos of people discussing how they're saving $X on some luxury vehicle.
> Not because consumers were "tricked" or "coerced". ... They buy them because they WANT them.
To be fair, they only want them because they've been made into extremely comfortable daily drivers. Anyone who's driven a truck from the 90s or earlier can attest that they were not designed with comfort in mind. They were utilitarian, with minimal passenger seating even with Crew Cab configurations. At some point – and I have no idea if this was driven by demand or not – trucks became, well, nice. I had a 2010 Honda Ridgeline until a few weeks ago, which is among the un-truck-iest of trucks, since it's unibody. That also means it's extremely comfortable, seats 5 with ease, and can still do what most people need a truck to do: carry bulky items home from Lowe's / Home Depot. Even in the 2010 model, it had niceties like heated seats. I just replaced it last week with a 2025 Ridgeline, and the new one is astonishingly nicer. Heated and ventilated seats, seat position memory, Android Auto / Apple CarPlay, adaptive cruise control, etc.
That's also not to say that modern trucks haven't progressed in their utility. A Ford F-350 from my youth could pull 20,000 lbs. on a gooseneck in the right configuration. The 2025 model can pull 40,000 lbs., and will do it in quiet luxury, getting better fuel economy.
[0]: https://www.irs.gov/publications/p946#idm140048254261728
This was, oddly, for regulatory reasons; the concern was that a blocked mixer unit could cause hot water (considered potentially unsafe) to be forced into the mains supply (presumed safe). This hasn't been a concern with mixer designs for a long time, but it took until the 90s to get the rule changed.
Yes, that's possibly 100 middle-aged men you could urge into battle!
From another commenter:
Platforms failing this duty would be liable to fines of up to £18 million or 10% of their annual turnover, whichever is higher.
"some"?
> The Act would also require me to scan images uploading for Child Sexual Abuse Material and other harmful content, it requires me to register as the responsible person for this and file compliance. It places technical costs, time costs, risk, and liability, onto myself as the volunteer who runs it all... and even if someone else took it over those costs would pass to them if the users are based in the UK.
There is no CSAM ring hiding on this cycling forum. The notion that every service which transmits data from one user to another has to file compliance paperwork and pay to use a CSAM hashing service is absurd.
https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...
So, I fully understand why someone would rather shut down their site than potentially deal with the legal fallout. Even if the end result is "just getting shut down", that will come after a significant amount of legal trouble, and likely money spent dealing with it.
I doubt this. Legislation is written by committee and passed by democracy. Most of the voting public don't look up the voting records which are available to them. Most of the voting public can't name a third of the members of parliament.
If there is a conspiratorial take, the one about regulatory capture is more believable.
Generally it's something along the lines of "a truck or van registered to a business is assumed to be a work vehicle, so pays less tax than a passenger car".
Of course you need to have a business to take advantage of that loophole, but it doesn't need to be a business that actually has any use for the truck- it could be a one-person IT consultancy.
- Configure forums using ranks so that new users can post but nobody will see their post until a moderator approves or other members vouch for them. Some forums already have this capability. It's high maintenance though and shady people will still try to warm up accounts just like they do here at HN.
- Small communities make their sites invite-only and password protect the web interface. This is also already a thing, but those communities usually stay quite small. Some prefer small communities: quality over quantity, or real friends over the bloated "friends" lists common on big platforms.
- Move to Tor onion sites so that one has more time to respond to a flagged post. Non-Tor sites get abused by people running scripts that upload CSAM, snapshot it (despite being the ones uploading it), and automatically submit reports to registrars, server and CDN providers so the domains and rented infrastructure get cancelled. This pushes everyone onto big centralized sites, and I would not be surprised if some of them were people with a vested interest in doing so.
Not really great options, but they do exist. Some use these options to stay off the radar, being less likely to attract the unstable people or lazy agents trying to inflate their numbers. I suppose now we can add to the list government agencies trying to profiteer off this new law. Gamification of the legal system, as if weaponization of it were not bad enough.
Hot water getting into the mains would be a concern anywhere; in particular, unless all equipment is in perfect working order, there's a legionnaires' disease risk, but there are many other risks.
If I designed a site for 14-year-old girls to sext with 30-year-old men it would be rightfully shut down.
If I designed a site as a fun chat site but in actual reality it became a sexting site for 14-year-old girls with adult men, should it be shut down?
Consumers want larger vehicles, and manufacturers bend the rules to allow such vehicles to be more easily built. Manufacturers write the laws, after all. CAFE allows SUVs and other "light trucks" to get worse fuel economy than a car. Fuel economy allowances are based on vehicle footprint, and it's easier to make a car larger than it is to improve fuel economy.
So while the fuel economy is higher in the UK, it isn't as high as it first appears.
Having said that, thanks for all the work you have done. I was (and maybe still am) a member of lfgss although I mostly lurked once in a long while without logging in and barely commented over the years.
It is sad to see all online communities slowly migrate to discord, reddit and other walled gardens.
I think what he fears is he has no control on how these individual forums moderate their content and how liable he would be as the hosting admin.
That costs money. The average person can't know every law. You have to hire lawyers to adjudicate every report or otherwise assess every report as illegal. No one is going to do that for free if the penalty for being wrong is being thrown in prison.
A fair system would be to send every report of illegal content to a judge to check if it's illegal or not. If it is the post is taken down and the prosecution starts.
But that would cost the country an enormous amount of money. So instead the cost is passed to the operators. Which in effect means only the richest or riskiest sites can afford to continue to operate.
All this because a negligible number of web users upload CSAM?
If I recall correctly, Apple tried to do that and it (rightly) elicited howls of outrage. What you're asking for is for people's own computers to spy on them on behalf of the authorities. It's like having people install CCTV cameras in their own homes so the police can make sure they're not doing anything illegal. It's literally Big Brother stuff. Maybe it would only be used for sympathetic purposes at first, but once the infrastructure is built, it would be a tempting thing for the authorities to abuse (or just use for goals that are not universally accepted, like banning all pornography).
The stories… people get really personally invested in their online arguments and have all sorts of bad behavior that stems from it.
> 1. Individual accountable for illegal content safety duties and reporting and complaints duties
> 2. Written statements of responsibilities
> 3. Internal monitoring and assurance
> 4. Tracking evidence of new and increasing illegal harm
> 5. Code of conduct regarding protection of users from illegal harm
> 6. Compliance training
> 7. Having a content moderation function to review and assess suspected illegal content
> 8. Having a content moderation function that allows for the swift take down of illegal content
> 9. Setting internal content policies
> 10. Provision of materials to volunteers
> 11. (Probably this because of file attachments) Using hash matching to detect and remove CSAM
> 12. (Probably this, but could implement Google Safe Browsing) Detecting and removing content matching listed CSAM URLs
> ...
> the list goes on.
I bet you weren't the sole moderator of LFGSS. In any web forum I know, there is at least one moderator online every day, and many more senior members able to use a report function. I used to be a moderator for a much smaller forum and we had 4 to 5 moderators at any time, with some of them online every day or almost every day.
I think a number of features/settings would be interesting for a forum software in 2025:
- deactivation of private messages: people can use instant messaging for that
- automatically blur a post when the report button is hit by a member (and by blur I mean replacing the full post server side with an image, not doing client-side JavaScript)
- automatically blur posts that haven't been seen by a member of the moderation team or a "senior" membership level within a certain period (6 or 12 hours, for example)
- disallow new members from reporting and blurring stuff; only allow people that are known good members
All this does not remove the bureaucracy of making the assessments/audits of the process mandated by the law, but it should at least make forums moderatable and give a modicum of security against illegal/CSAM content; a rough sketch of the auto-blur idea is below.
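A minimal sketch of that auto-blur-on-report flow, with all names invented here and a text placeholder standing in for the image replacement suggested above:

    from dataclasses import dataclass

    TRUST_THRESHOLD = 50  # hypothetical reputation needed before reports count

    @dataclass
    class Member:
        name: str
        reputation: int = 0

    @dataclass
    class Post:
        author: str
        body: str
        hidden: bool = False

    def render(post: Post) -> str:
        # Replacement happens server side, so the original text never
        # reaches clients while the post is awaiting review.
        return "[post hidden pending review]" if post.hidden else post.body

    def report(post: Post, reporter: Member) -> None:
        # Only established members can trigger the blur (last list item).
        if reporter.reputation >= TRUST_THRESHOLD:
            post.hidden = True

    def approve(post: Post) -> None:
        post.hidden = False

The same hook could feed the timed blur: a background job hides any post that no trusted member has seen within the 6-12 hour window.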
These governments only want institutions to host web services. Their rules are openly hostile to individuals. One obvious benefit is much tighter control: having a few companies with large, registered sites gives the government control.
It is also pretty clear that the public at large does not care. Most people are completely unaffected and rarely venture outside of the large, regulated platforms.
The point being to allow members of the public to submit a pull request and have their contributions incorporated into the officially-certified codebase if it's accepted, so the code ends up being actually good because the users (i.e. the public) are given the opportunity to fix what irks them.
Companies have legal departments, which exist to figure out answers to questions like that. This is because these questions are extremely tricky and the answers might even change as case law trickles in or rules get revised.
Expecting individuals to interpret complex rulesets under threat of legal liability is a very good way to make sure these people stop what they are doing.
I would never accept personal liability for my interpretation of the GDPR, however correct I believed it to be. I would be extremely dumb if I did.
LFGSS is more culturally relevant than the BBC!
Of course governments and regulators will fail to realize what they have till it's gone.
- Pave paradise, put up a parking lot.
How much money should he spend on a lawyer to figure this out for him?
Would you be willing to risk personal liability for your interpretation of this law? Obviously I would not.
https://projects.fivethirtyeight.com/polls/favorability/dona...
Apparently this isn't:
"Just seven electric-vehicle charging stations have begun operating with funding from a $5-billion US government program created in 2021, marking “pathetic” progress, a Democratic senator said on Wednesday."
https://nypost.com/2024/06/05/business/democratic-senator-bl...
Just email us.
The law worked the same way yesterday as it does today. It's not like the website run in Britain operated under some state of anarchy and in a few months it doesn't. There's already laws a site has to comply with and the risk that someone sues you, but if you were okay with running a site for 20 years adding a report button isn't drastically going to change the nature of your business.
It is plainly insulting to say that "adding a report button" is enough; obviously that is false. And investigating how to comply with this law is time consuming and comes with immense risk if done improperly. The fact that this law is new means that nobody knows exactly how it has to be interpreted, and you might very well get it completely wrong. If a website has existed for 20 years with significant traffic it is almost certain that it has complied with the law; what absolutely is not certain is how complying with the law has to be done in the future.
I do not get why you have the need to defend this. "Just do X", is obviously not how this law is written, it covers a broad range of services in different ways and has different requirements for these categories. You absolutely need legal advice to figure out what to do, especially if it is you who is in trouble if you get it wrong.
2) "Any competent forum operator is already doing all of this [this = record keeping requirements described by the parent]".
These two assertions seem to conflict (unless good forum OPs are doing wrong record keeping). Are you willing to take another stab at it? What does good forum op record keeping look like?
A very large fraction of corporations are run on minimal margins. Some of them still do try and keep up with regulations and that is then (often) a very large part of their operating costs.
It's more about "accepting and publishing arbitrary content".
But, in practice, how hard is it to host a website anonymously? Or off-shore?
Obviously it is trivial, but so is shoplifting.
Both are illegal and telling people to commit crimes is not helpful.
Everything, ownership of the domain, codebase, digital assets for instance.
> I don't follow -- if Big Corp owns the site, won't it lose everything
Good question. If Big Corp owns the domain, codebase, IP etc. and lets Web Corp operate a site using those assets, Big Corp is not responsible for Web Corp's transgressions.
A simpler analogy. Big Corp owns a pub, rents it to Web Corp. Web Corp plays music too loud, opens too late and gets fined and loses its alcohol licence. Web Corp is insolvent, but Big Corp still owns the pub.
My outlook is that this is not the way to do it, because these things exist:
- EU citizens living in non-EU countries (isn't GDPR supposed to apply to EU citizens worldwide?)
- EU citizens using VPN with exit node to/IP address spoofing a non-EU country
Either comply with GDPR or just don't exist, period.
Having a modicum of rule enforcement and basic abuse protections (let's say: new users can't upload files) on it goes a long way
They do not have the resources to find out exactly what they need to do so that there is no risk of them being made totally bankrupt.
If that is all - please point to the guidance or law that says just having a report button is sufficient in all cases.
Employees costs money, and so do attorneys. When people won't limit scope, that can require extensive manual review.
You've spent 20+ posts misinforming about compliance costs in this thread alone so forgive me if I don't believe this is anything like a good faith query. If you know people who operate companies, it's easy to find cases.
$ because I'm American.
• A "large service" (more than 7 million monthly active UK users) that is at a medium or high risk of image-based CSAM, or
• A service that is at a high risk of image-based CSAM and either has more than 700,000 monthly active UK users or is a file-storage and file-sharing service.
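Read as a predicate, those two bullets come out as something like this sketch (function and parameter names are mine; only the thresholds come from the quoted guidance):

    def hash_matching_duty_applies(
        monthly_uk_users: int,
        image_csam_risk: str,          # "low", "medium", or "high"
        is_file_storage_sharing: bool,
    ) -> bool:
        # Bullet 1: a "large service" at medium or high risk of image-based CSAM.
        if monthly_uk_users > 7_000_000 and image_csam_risk in ("medium", "high"):
            return True
        # Bullet 2: high risk plus either 700k+ UK users or file-storage/sharing.
        if image_csam_risk == "high" and (
            monthly_uk_users > 700_000 or is_file_storage_sharing
        ):
            return True
        return False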
In Germany it is straight up illegal. The UK law has provisions where a service has to name a responsible person or report specific things to the government. Obviously those cannot be accomplished anonymously. In any case, hosting a website anonymously doesn't work if you want to work within the law; any lawsuit against you will identify you.
> I don't even have to provide my personal info to register a domain name, I can run it on an IP-address
Which is totally irrelevant. I can also go into a store and take something without paying. The question is whether that is legal or not and what you need to do to keep it legal.
What might make such a system work in practice is to only let a small randomly selected group of people vote for each issue. You still get a similar representation as a full vote, but with each person having much fewer votes to attend to it isn't overwhelming.
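A toy illustration of that idea (sortition, essentially); the panel size and all names are arbitrary inventions here:

    import random

    def select_panel(electorate: list[str], panel_size: int) -> list[str]:
        # A uniform random sample approximates the full electorate's vote,
        # so each person only has a few issues to attend to per cycle.
        return random.sample(electorate, min(panel_size, len(electorate)))

    voters = [f"voter{i}" for i in range(1_000_000)]
    panel = select_panel(voters, panel_size=1_000)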
First, #2, #4, #5, #6, #9, and #10 only apply to sites that have more than 7 000 000 monthly active UK users or are "multi-risk". Multi-risk means being at medium to high risk in at least two different categories of illegal/harmful content.
If the site has been operating a long time and has not had a problem with illegal/harmful content it is probably going to be low risk. There's a publication about risk levels here [1].
For the sake of argument though let's assume it is multi-risk.
#1 means having someone who has to explain and justify to top management what the site is doing to comply. It sounds like in the present case the person who would be handling compliance is also the person who is top management, so not really much to do here.
#2 means written statements saying which senior managers are responsible for the various things needed for compliance. For a site without a lot of different people working on it this means writing maybe a few sentences.
#3 is not applicable. It only applies to services that are large (more than 7 000 000 active monthly UK users) and are multi-risk.
#4 means keeping track of evidence of new or increasing illegal content and informing top management. Evidence can come from your normal processing, like dealing with complaints, moderation, and referrals from law enforcement.
Basically, keep some logs and stats and look for trends, and if any are spotted bring it up with top management. This doesn't sound hard.
#5 You have to have something that sets the standards and expectations for the people who will be dealing with all this. This shouldn't be difficult to produce.
#6 When you hire people to work on or run your service you need to train them to do it in accord with your approach to complying with the law. This does not apply to people who are volunteers.
#7 and #8 These cover what you should do when you become aware of suspected illegal content. For the most part I'd expect sites could handle it like they handle legal content that violates the site's rules (e.g., spam or off-topic posts).
#9 You need a policy that states what is allowed on the service and what is not. This does not seem to be a difficult requirement.
#10 You have to give volunteer moderators access to materials that let them actually do the job.
#11 This only applies to (1) services with more than 7 000 000 monthly active UK users that have at least a medium risk of image-based CSAM, or (2) services with a high risk of image-based CSAM that either have at least 700 000 monthly active UK users or are a "file-storage and file-sharing service".
A "file-storage and file-sharing service" is:
> A service whose primary functionalities involve enabling users to:
> a) store digital content, including images and videos, on the cloud or dedicated server(s); and
> b) share access to that content through the provision of links (such as unique URLs or hyperlinks) that lead directly to the content for the purpose of enabling other users to encounter or interact with the content.
#12 Similar to #11, but without the "file-storage and file-sharing service" part, so only applicable if you have at least 700 000 monthly active UK users and are at a high risk of CSAM URLs, or have at least 7 000 000 monthly active UK users and at least a medium risk of CSAM URLs.
[1] https://www.ofcom.org.uk/siteassets/resources/documents/onli...
As a fiat currency issuer, you have two options, you can create money for circulation (government spending) or you can destroy money and it’ll never circulate again (taxation).
https://wiki.archiveteam.org/index.php/ArchiveBot https://wiki.archiveteam.org/index.php/Wikibot
The famed section 230, passed in 1996, is an update to a section of the 1934 Communications Act, which is but one set of laws regulating many aspects of forums. Lawsuits in the early 90s led Congress to modify, but not abolish, the stack of laws regarding all communications technology.
Now that you know but 2 of the many laws affecting online forums, you can dig up plenty more yourself.
Those that do so whilst not seeking financial gain are impacted the most.
Regulatory capture. https://en.wikipedia.org/wiki/Regulatory_capture
My dude, I’m sorry to tell you, but the problem usually is law enforcement. For so many things. You try barely training people who already like beating people up and then give them a monopoly on legal violence.
Btw, the reason the cops were invented in Britain was to put down riots by the populace because they were so poor[1], and in America it was to divide poor whites and poor blacks and turn the poor whites into slave catchers.[2]
[1] https://novaramedia.com/2020/06/20/why-does-the-police-exist...
[2] https://www.npr.org/2020/06/13/876628302/the-history-of-poli...
Feel free to address any specific points. Have you looked at the Ofcom guidance?
> penalty seems to still be "up to 18 million pounds".
Fines "up to" a certain amount allow flexibility in punishment, enabling courts to consider the specific circumstances of each case, such as the severity of the offence and the offender's financial situation. This discretion ensures that penalties are proportional and fair, avoiding undue hardship while still serving as a deterrent.
You cannot write into legislation specific fines for every possible scenario; this is how UK legislation works. Suggesting you need to shut down a cycling forum because you don't have 18 million in the bank is ludicrous.
Mishandling personal data has a maximum fine of £18 million too, yet small/medium/large businesses still exists...
> So no, there is a deliberate bias against smaller sites.
I'm saying there is a deliberate bias in favour of smaller sites: smaller sites only have 13 minor provisions whereas larger ones have 30+.
The fear some have is not misunderstandings, but disgruntled types (the sort of people who blow up over a perfectly reasonable moderation decision) and common garden variety griefers reporting things to cause inconvenience. I know people who have in the past run forums and had to put up with spurious reports to their ISP/host or even on one occasion local law enforcement. If someone did this it would likely go nowhere in the end but not before causing much stress and perhaps cost via paying for legal advice.
> I don't get preemptively doing it other than giving up after a long duty of almost 30 years and using this as excuse.
Having been involved less directly with that sort of admin & moderation work I can see this change being the final straw after putting up with the people of the internet for years. Calling it “just an excuse” seems rather harsh.
> At least pass them to someone else that won't care about the liability.
Depending on the terms people agreed to when signing up and posting, passing on the reins might not be nearly as legally/morally clear-cut as several in these comments are assuming.
There is no choice over the number of users.
Where exactly is this note on size?
Reading the document you gave, there seems to be nothing saying that below a certain number of users there is no risk, or even low risk. There is the paragraph "There are comprehensive systems and processes in place, or other factors which sufficiently reduce risks of this kind of illegal harm to users, and your evidence shows they are very effective. Even if taking all relevant measures in Ofcom’s Codes of Practice, this may not always be sufficient to mean your service is low risk"
Which implies that you must have comprehensive systems and processes in place - what exactly are these systems and their requirements? What happens if the site owner is ill for a week and isn't logged on?
https://russ.garrett.co.uk/2024/12/17/online-safety-act-guid... has a more comprehensive translation into more normal English.
You will need to assess the risk of people seeing something from one of those categories (for speciality forums, mostly low), think about algorithms showing it to users (again, for forums that's pretty simple), then have a mechanism to allow people to report offending content.
And take proportionate steps to stop people posting such stuff in the first place (pretty much the same as spam controls, plus banning offenders).
The perhaps harder part is allowing people to complain about take downs, but then adding a subforum for that is almost certainly proportionate[1].
[1] untested law, so not a guarantee
I don't want my browser to report me if I encounter illegal materials. I want the browser to anonymously report the website where they are, at most and even that, only if I don't disable reporting.
People do install cctv cameras in their homes but they are (or at least believe to be) in control of what happens with the footage.
> All this because a negligible number of web users upload CSAM?
Still, it's better to fix it in the browser than to keep increasingly policing the entirety of the internet to keep it negligible.
An alternative might be, no regulation, but businesses are responsible for the costs of business to society (pollution, poor mental health, potential that it's a scam). After all, businesses benefit from these things, so they should gladly cover their cost to society.
Personally, I prefer less pollution.
For comparison, imagine there was a new law against SQL injection. Competent forum operators are already guarding against SQL injection because they don't want to be owned by hackers. But they likely are not writing down a document explaining how they guard against it. If they were required to make a document which writes down "all SQL data updates are handled by Django's ORM" they might then think "would Ofcom think this was enough? Maybe we should add that we keep Django up to date... actually we're running an 18-month-old version, let's sign up to Django's release mailing list, decide to stay within 3-6 months of the stable version, and add a git commit hook which greps for imports of SQL libraries so we can check that we don't update data any other way". They are already acting against SQL injection, but this imaginary law requires them to make it a proper formal procedure, not an ad-hoc thing.
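That imaginary commit hook is maybe fifteen lines; a sketch, with the forbidden module names being just examples:

    #!/usr/bin/env python3
    # Pre-commit hook sketch: fail the commit if a staged Python file
    # imports a raw SQL driver directly instead of going through the ORM.
    import subprocess
    import sys

    FORBIDDEN = ("import sqlite3", "import psycopg2", "import MySQLdb")

    staged = subprocess.run(
        ["git", "diff", "--cached", "--name-only", "--diff-filter=ACM"],
        capture_output=True, text=True, check=True,
    ).stdout.split()

    for path in staged:
        if path.endswith(".py"):
            with open(path) as f:
                for lineno, line in enumerate(f, start=1):
                    if any(pattern in line for pattern in FORBIDDEN):
                        print(f"{path}:{lineno}: raw SQL import; use the ORM")
                        sys.exit(1)
    sys.exit(0)

Which rather proves the point: the code is trivial, and the formal procedure around it is the actual work.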
> "What does good forum op record keeping look like?"
Good forum operators already don't want their forums to become crime cesspits, because that will ruin the experience for the target users and will add work and risk for themselves. So they will already have guards against bot signups, guards against free open image hosting, guards against leaking users' private and personal information. They will have guards against bad behaviour, such as passive moderation where users can flag and report objectionable content, or active moderation where mods read along and intervene. If they want to guard against moderators power tripping, they will have logs of moderation activities such as editing post content and banning accounts. There will be web server logs and CMS / admin tool logs, which will show signups, views, edits. They will likely have activity graphs and alerts if something suddenly becomes highly popular or spikes bandwidth use, so they can look at what's going on. If they contact the authorities there may be email or call logs of that contact; there will be mod message records from users, likely not all in one place. If a forum is for people dealing with debt and bankruptcy, they might have guards against financial scams targeting users of their service, such as a sticky post warning users or a banned-words list for common scam terms - the second-hand sales site https://www.gumtree.com has a box of 'safety tips' prominently on the right warning about common scams.
Larger competent forums with multiple paid (or volunteer) employees would likely already have some of this formalised and centralised just to make it possible to work with as a team, and for employment purposes (training, firing, guarding against rogue employees, complying with existing privacy and safety regulations).
Yes, I think the new law will require forum operators to do more. I don't think it's unreasonable to require forum operators once a year to consider "is your forum at particular risk of people grooming children, inciting terrorism, scamming users, etc.? If your site is a risk, what are you doing to lower the chance of it happening, and increase the chance of it being detected? And can you show Ofcom that you actually are considering these things and putting relevant guards in place?".
(Whether the potential fines and the vagueness/clarity are appropriate is a separate thing.)