zlacker

[return to "Debian Statement on the Cyber Resilience Act"]
1. pjmlp+Cm[view] [source] 2023-12-28 00:03:03
>>diyftw+(OP)
Small businesses and solo entrepreneurs have to deal with liability and permits all the time in other fields, even actual street bazaars for that matter, the exception being when there is some "flexibility" between the laws and how they happen to be applied.
◧◩
2. SOLAR_+Jn[view] [source] 2023-12-28 00:14:51
>>pjmlp+Cm
I’m curious what liability and permits are being discussed here. Because the permit required to prevent some Joe Schmoe from selling me a tainted brownie off a street cart feels a little bit different, and perhaps difficult to compare, to software.
◧◩◪
3. zmgsab+jp[view] [source] 2023-12-28 00:32:03
>>SOLAR_+Jn
What’s different between a baker liable for flour content and an SDE liable for packaged library vulnerabilities?
◧◩◪◨
4. giantg+1q[view] [source] 2023-12-28 00:38:10
>>zmgsab+jp
Standardized food safety practices, pre-approved and comparatively trivial recipes, state/county inspections, etc. None of which apply to software. One is fairly trivial and standardized. The other is massively complex, rapidly changing, and unable to be boiled down to a standard set of trivial procedures.

And to answer your question more directly, the flour itself causes the damage. The vulnerability is only damaging if a malicious actor takes advantage of it.

◧◩◪◨⬒
5. beedee+It[view] [source] 2023-12-28 01:14:01
>>giantg+1q
> Standardized food safety practices

Food safety practices only became standardized after regulation was enacted.

> pre-approved and comparatively trivial recipes

That sounds like most software development.

I think you are unwittingly making the case that software development is a lot like food production. Software development is only beginning to get regulated because it is only now reaching the level where it is hazardous to public safety, unlike food production which reached that a long time ago.

◧◩◪◨⬒⬓
6. giantg+Qy[view] [source] 2023-12-28 02:07:10
>>beedee+It
"Food safety practices only became standardized after regulation was enacted."

Because you actually can standardize them. Software isn't so simple.

"> pre-approved and comparatively trivial recipes

That sounds like most software development."

Lol no it does not. Why wouldn't high school graduates or dropouts work in software instead of at fast food? The languages, frameworks, patterns, etc. are much more complex than basic sanitation and time/temp/acidity.

◧◩◪◨⬒⬓⬔
7. kube-s+5B[view] [source] 2023-12-28 02:30:18
>>giantg+Qy
> Because you actually can standardize them. Software isn't so simple.

It isn't simple due to choice, not due to the nature of software. Software is relatively simple compared to other meat-space engineering disciplines. Software engineering is a relatively immature engineering discipline, but it is implicated in enough safety-critical systems these days that it is about time to start maturing.

It will be painful but I welcome more software regulatory standards, because it is necessary for our trade to mature.

◧◩◪◨⬒⬓⬔⧯
8. jandre+kR[view] [source] 2023-12-28 05:19:19
>>kube-s+5B
> Software is relatively simple compared to other meat-space engineering disciplines.

On what basis do you make this claim? If you do the same degree of optimization in software that is routinely applied in physical engineering disciplines with some of the most complex system dynamics problems, such as chemical engineering, the dynamics of software systems are qualitatively much more complex. We expect chemical engineering to design systems that asymptotically approach theoretically optimal efficiency along multiple dimensions. In software we rarely see anything approaching similar optimality, except for small and exquisitely engineered components that are beyond the ken of most software engineers. In large software systems, the design problem is so complex that computational optimization to the degree we see in physical engineering is completely intractable, so similar approaches do not apply.

In chemical engineering, the measure of system complexity is roughly the size of the system of differential equations that governs the total dynamics of the system. Computers then solve for the system, which can be computationally intensive. We do this routinely, with some caveats. An optimal design is not computable, but we can get asymptotically close via approximation.

In software engineering, the equivalent would be formal optimization and verification of the entire program. The complexity of doing this for non-trivial software is completely intractable. Software has so many degrees of freedom compared to physical systems that they aren’t even the same class of problem. It is arguable whether it is even possible in theory to achieve the degrees of design robustness and efficiency that we see in physical engineering systems.

Unlike physical engineering, where a computer takes a set of equations and constraints, crunches numbers, and produces an approximately optimal design, no such thing is possible in software.

◧◩◪◨⬒⬓⬔⧯▣
9. kube-s+8U[view] [source] 2023-12-28 05:52:53
>>jandre+kR
I wasn't really thinking "complexity" in terms of formal academic problem scope, but more so "complexity" in the surface of how it interacts with the rest of the world, which is more along the lines of what would be relevant to a regulator.

A regulator doesn't really care about the internal complexities of an LLM and whether or not that is more difficult than cracking petroleum. They care more about how those things interact with the rest of the world. Software is pretty limited in how it interacts with the rest of the world.

◧◩◪◨⬒⬓⬔⧯▣▦
10. davora+oO2[view] [source] 2023-12-28 19:53:46
>>kube-s+8U
> A regulator doesn't really care about the internal complexities

Seems like you are oversimplifying the process and goals of those creating new regulations. Lawmakers often have to care about the internal complexities because they care about the consequences new regulations will have.

When a lawmaker is writing regulations for an industry, they should care about the internal complexities, since those determine the long-term effects of the regulation. Lawmakers should care if new regulations kill small businesses or, in an extreme case that is not happening with the CRA, kill off an industry, since that affects the economy of the country they make laws for, in addition to directly impacting the people those lawmakers represent.

◧◩◪◨⬒⬓⬔⧯▣▦▧
11. kube-s+er3[view] [source] 2023-12-28 23:55:20
>>davora+oO2
No, they really don't give a hoot. They have an end goal they're trying to accomplish, and that's their priority.

They will seek feedback from industry experts to determine if their rules should be refined, which is what is happening. The details of any internal complexity of an industry are entirely delegated.

[go to top]