zlacker

[return to "Lfgss shutting down 16th March 2025 (day before Online Safety Act is enforced)"]
1. Markus+6j[view] [source] 2024-12-16 19:11:45
>>buro9+(OP)
Is there some generalized law (yet) about unintended consequences? For example:

Increase fuel economy -> Introduce fuel economy standards -> Economy cars practically phased out in favour of gas-guzzling "trucks" that are exempt from fuel economy standards -> Worse fuel economy.

or

Protect the children -> Criminalize activities that might in any way cause an increase in risk to children -> Best to just keep them indoors playing with electronic gadgets -> Increased rates of obesity/depression etc -> Children worse off.

As the article itself says: Hold big tech accountable -> Introduce rules so hard to comply with that only big tech will be able to comply -> Big tech goes on, but indie tech forced offline.

◧◩
2. FredPr+Mo[view] [source] 2024-12-16 19:44:14
>>Markus+6j
Politicians should take a mandatory one-week training in:

- very basic macro economics

- very basic game theory

- very basic statistics

Come to think of it, kids should learn this in high school.

◧◩◪
3. wat100+jx[view] [source] 2024-12-16 20:33:50
>>FredPr+Mo
I think you’re being overly charitable in thinking this happens because they don’t understand these things. The main thing is that they don’t care. The purpose of passing legislation to protect the children isn’t to protect the children, it’s to get reelected.

If we can get the voters to understand the things you mention, then maybe we’d have a chance.

◧◩◪◨
4. ebiest+CA[view] [source] 2024-12-16 20:54:51
>>wat100+jx
I think you're being underly charitable. The vast majority of congress critters are pretty smart people, and by Jeff Jackson's account, even the ones who yell the loudest are generally reasonable behind closed doors; the loud posturing is driven by incentives.

The problem is that the real problems are very hard, and their job is to simplify them for their constituents well enough to keep their jobs, which may or may not line up with doing the right thing.

This is a truly hard problem. CSAM is a real problem, and those who distribute it are experts at subverting the system. Freedom of expression is a real concern too, and so is the onerous imposition of regulations.

And any such issue (whether it be transnational migration, infrastructure, EPA regulations in America, or whatever issue you want to bring up) is going to involve some very complex tradeoffs; even if you put a room full of PhDs together with no political pressure, you are still going to end up with uncomfortable tradeoffs.

What if the regulations are bad because the problem is so hard we can't make good ones, even with the best and brightest?

◧◩◪◨⬒
5. Anthon+dW[view] [source] 2024-12-16 23:26:43
>>ebiest+CA
> What if the regulations are bad because the problem is so hard we can't make good ones, even with the best and brightest?

To begin with, the premise would have to be challenged. Many, many bad regulations are bad because of incompetence or corruption rather than because better regulations are impossible. But let's consider the case where there really are no good regulations.

This often happens in situations where e.g. bad actors have more resources, or are willing to spend more resources, to subvert a system than ordinary people. For example, suppose the proposal is to ban major companies from implementing end-to-end encryption so the police can spy on terrorists. Well, that's not going to work very well: the terrorists will just use a different system that provides E2EE anyway, and what you're really doing is compromising the security of all the law-abiding people, who are now more vulnerable to criminals, foreign espionage, and so on.

The answer in these cases, where there are only bad policy proposals, is to do nothing. Accept that you don't have a good solution, that a bad solution makes things worse rather than better, and that the absence of any rule, imperfect as the outcome may be, is the best we know how to do.

The classical example of this is the First Amendment. People say bad stuff, we don't like it, they suck and should shut up. But there is nobody you can actually trust to be the decider of who gets to say what, so the answer is nobody decides for everybody and imposing government punishment for speech is forbidden.

◧◩◪◨⬒⬓
6. nonran+XY[view] [source] 2024-12-16 23:48:16
>>Anthon+dW
> The answer in these cases, where there are only bad policy proposals, is to do nothing.

Or go further.

Sometimes the answer is to remove regulations. Specifically, those laws that protect wrongdoers and facilitators of problems. Then you just let nature take its course.

For the most part, though, this is considered inhumane and unacceptable.

◧◩◪◨⬒⬓⬔
7. Anthon+c01[view] [source] 2024-12-16 23:58:11
>>nonran+XY
Sometimes we do exactly that. In general, if someone is trying to kill you, you are allowed to try and kill them right back. It's self-defense.

If you're talking about legalizing vigilantism, you would then have to argue that this is a better system and less prone to abuse than some variant of the existing law enforcement apparatus. Which, if you could do it, would imply that we actually should do that. But in general vigilantes have serious problems with accurately identifying targets and collateral damage.

◧◩◪◨⬒⬓⬔⧯
8. nonran+q21[view] [source] 2024-12-17 00:22:33
>>Anthon+c01
Not quite my line of thinking but appreciate the reply. There's definitely an interesting debate to be had there about the difference between "legalizing vigilantism" and "not protecting criminals" (one that's been done to death in "hack back" debates).

It gets messy because, by definition, the moment you remove the laws the parties cease to be criminals... hence my Bushism "wrongdoers" (can't quite bring myself to say evil-doers :)

One hopes that "criminals" without explicit legal protection become disinclined to act, rather than become victims themselves. Hence my allusion to "nature", as in "Natural Law".

"Might is right" is no good situation either. But I feel there's a time and place for tactical selective removal of protectionism (and I am thinking giant corporations here) to re-balance things.

As a tepid example (not really relevant to this thread), keep copyright laws in place but only allow individuals to enforce them.

◧◩◪◨⬒⬓⬔⧯▣
9. Anthon+r31[view] [source] 2024-12-17 00:34:40
>>nonran+q21
If you want a fun one in that line, allow piercing the corporate veil by default until you get to a human. Want to scatter conglomerates to the wind? Make the parent corporation fully liable for the sins of every subsidiary.
◧◩◪◨⬒⬓⬔⧯▣▦
10. wat100+V91[view] [source] 2024-12-17 01:39:11
>>Anthon+r31
I wonder what the world would be like if we took corporate personhood to its logical conclusion and applied the same punishments to corporations as we apply to people.

You can’t really put a corporation in jail, but you could cut it off from the world in the same way that a person in jail is cut off. Suspend the business for the duration of the sentence. Steal a few thousand bucks? Get shut down for six months, or whatever that sentence would be.

◧◩◪◨⬒⬓⬔⧯▣▦▧
11. Aerroo+sr1[view] [source] 2024-12-17 05:23:38
>>wat100+V91
>You can’t really put a corporation in jail, but you could cut it off from the world in the same way that a person in jail is cut off.

I have imagined a sci-fi skit where James works at CorpCo, a company that was caught doing something illegal and sentenced to prison. As punishment, James goes to work by reporting to a prison at 8 am. He sits in his cell until his 'work day' is over and he's released at 5 pm to go home. It's boring, but hey, it pays well.

◧◩◪◨⬒⬓⬔⧯▣▦▧▨
12. thanks+WI1[view] [source] 2024-12-17 09:04:39
>>Aerroo+sr1
I think you can put the CEO and board in prison though.