On what basis do you make this claim? Even compared to physical engineering disciplines with some of the most complex system-dynamics problems, such as chemical engineering, the dynamics of software systems are qualitatively more complex, and the degree of optimization routinely applied in those disciplines has no software equivalent. We expect chemical engineering to design systems that asymptotically approach theoretically optimal efficiency along multiple dimensions. In software we rarely see anything approaching similar optimality, except for small, exquisitely engineered components that are beyond the ken of most software engineers. In large software systems the design problem is so complex that computational optimization to the degree we see in physical engineering is completely intractable, so similar approaches do not apply.
In chemical engineering, the measure of system complexity is roughly the size of the system of differential equations that governs the total dynamics of the system. Computers then solve that system, which can be computationally intensive, and we do this routinely, with some caveats. A truly optimal design is not computable, but we can get asymptotically close via approximation.
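To make that concrete, here is a toy sketch of the workflow (an assumed two-step reaction, not a real process model): a small system of ODEs solved numerically, with one design parameter tuned by a general-purpose optimizer. Real plant models involve vastly larger equation systems, but the shape of the computation is the same.

    # Hypothetical toy model: series reaction A -> B -> C, where we pick the
    # rate constant k (e.g. via operating temperature) to maximize the yield
    # of the intermediate B at a fixed end time.
    from scipy.integrate import solve_ivp
    from scipy.optimize import minimize_scalar

    def reactor(t, y, k):
        # A -> B at rate k, B -> C at rate 0.5*k (toy kinetics, not real data)
        a, b, c = y
        return [-k * a, k * a - 0.5 * k * b, 0.5 * k * b]

    def yield_of_b(k):
        sol = solve_ivp(reactor, (0.0, 10.0), [1.0, 0.0, 0.0], args=(k,))
        return sol.y[1, -1]  # concentration of B at the final time

    # "Design": search over k for an approximately optimal yield of B.
    best = minimize_scalar(lambda k: -yield_of_b(k),
                           bounds=(0.01, 5.0), method="bounded")
    print(f"approximately optimal k = {best.x:.3f}, yield = {-best.fun:.3f}")

The point is that the objective is well defined and the search space is a handful of continuous parameters, so a computer can grind toward the optimum. There is no analogous setup for an entire software system.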
In software engineering, the equivalent would be formal optimization and verification of the entire program. Doing this for non-trivial software is completely intractable. Software has so many more degrees of freedom than physical systems that the two aren't even the same class of problem. It is arguable whether it is even possible, in theory, to achieve the degrees of design robustness and efficiency that we see in physical engineering systems.
Unlike physical engineering, where a computer takes a set of equations and constraints, crunches numbers, and produces an approximately optimal design, no such thing is possible in software.
A regulator doesn't really care about the internal complexities of an LLM and whether or not that is more difficult than cracking petroleum. They care more about how those things interact with the rest of the world. Software is pretty limited in how it interacts with the rest of the world.
Seems like you are oversimplifying the process and goals of those creating new regulations. Lawmakers often have to care about the internal complexities because they care about the consequences new regulations will have.
When lawmakers are making regulations for an industry, they should care about the internal complexities, since those determine the long-term effects of the regulation. Lawmakers should care whether new regulations kill small businesses or, in an extreme case that is not happening with the CRA, kill off an industry, since that affects the economy of the country they make laws for, in addition to directly impacting the people those lawmakers represent.
They will seek feedback from industry experts to determine whether their rules should be refined, which is what is happening. The details of any internal complexity of an industry are entirely delegated.
We may be working with different definitions here. If they did not care they would not delegate away the details of the internal complexity.