zlacker

[return to "In memoriam"]
1. amiga3+wc[view] [source] 2025-02-23 20:29:53
>>ColinW+(OP)
Charlie Stross's blog is next.

Liability is unlimited and there's no provision in law for being a single person or small group of volunteers. You'll be held to the same standards as a behemoth with full-time lawyers (the stated target of the law, but the least likely to be affected by it).

http://www.antipope.org/charlie/blog-static/2024/12/storm-cl...

The entire law is weaponised unintended consequences.

◧◩
2. tene80+Ye[view] [source] 2025-02-23 20:49:51
>>amiga3+wc
What standards would you want individuals or small groups to be held to? In a context where it is illegal for a company to allow hate speech or CSAM on their website, should individuals be allowed to? Or do you just mean the punishment should be less?
◧◩◪
3. Anthon+xf[view] [source] 2025-02-23 20:54:07
>>tene80+Ye
The obvious solution is to have law enforcement enforce the law rather than private parties. If someone posts something bad to your site, the police try to find who posted it and arrest them, and the only obligation on the website is to remove the content in response to a valid court order.
◧◩◪◨
4. tene80+wg[view] [source] 2025-02-23 21:01:23
>>Anthon+xf
I don't have a strong view on this law – I haven't read enough into it. So I'm interested to know why you believe what you've just written. If a country is trying to, for example, make it harder for CSAM to be distributed, why shouldn't the person operating the site where it's being hosted have some responsibility to make sure it can't be hosted there?
◧◩◪◨⬒
5. manana+Lg[view] [source] 2025-02-23 21:03:11
>>tene80+wg
For one thing, because that person is not obliged to follow due process and will likely ban everything that might even vaguely require them to involve a lawyer. See for example YouTube’s copyright strikes, which are much harsher on the uploader than any existing copyright law.
◧◩◪◨⬒⬓
6. tene80+bh[view] [source] 2025-02-23 21:06:45
>>manana+Lg
Your argument is that it's better to have the illegal stuff (say, CSAM) online than for a site owner to, for practical reasons, ban a lot of legal stuff too? Why?
◧◩◪◨⬒⬓⬔
7. Anthon+Gm[view] [source] 2025-02-23 21:52:32
>>tene80+bh
Let's consider two ways of dealing with this problem:

1) Law enforcement enforces the law. People posting CSAM are investigated by the police, who have warrants and resources and so on, so each time they post something is another chance to get caught. When they get caught they go to jail and can't harm any more children.

2) Private parties try to enforce the law. The people posting CSAM get banned, but the site has no ability to incarcerate them, so they just make a new account and do it again. Since they can keep trying and the penalty is only having to create a new account, which they don't really care about, it becomes a cat and mouse game, except that even if the cat catches the mouse, the mouse just reappears under a different name with new knowledge of how to avoid getting caught next time. Since being detected carries minimal risk, they get to try lots of strategies until they learn how to evade the cat, instead of getting eaten (i.e. going to prison) the first time they get caught. So they get better at evading detection, which makes it harder for law enforcement to catch them too.

Meanwhile the site is under increasing pressure to "do something" because the problem has been made worse rather than better, so they turn up the false positives and cause more collateral damage to innocent people. But that doesn't change the dynamic; it only causes the criminals to evolve their tactics, which they can do an unlimited number of times until they learn how to evade detection again. And as soon as they do, the site, despite its best efforts, is hosting the material again. The combined cost of the heroic efforts to comply and the liability from inevitably failing destroys smaller sites and causes market consolidation.

The megacorps then become a choke point for other censorship, some by various governments, some by the corporations themselves. That is an evil in itself, but if you like to take it from the other side, that evil causes ordinary people to chafe, so they start to develop and use anti-censorship technology. As that technology becomes more widespread with greater public support, the perpetrators of the crimes you're trying to prevent find it easier to avoid detection.

You want the police to arrest the pedos. You don't want a dystopian megacorp police state.

[go to top]