And to answer your question more directly: contaminated flour causes damage by itself, whereas a vulnerability is only damaging if a malicious actor takes advantage of it.
Food safety practices only became standardized after regulation was enacted.
> pre-approved and comparatively trivial recipes
That sounds like most software development.
I think you are unwittingly making the case that software development is a lot like food production. Software development is only beginning to get regulated because it is only now reaching the point where it is hazardous to public safety, unlike food production, which reached that point long ago.
Because you actually can standardize them. Software isn't so simple.
"> pre-approved and comparatively trivial recipes
That sounds like most software development."
Lol, no it does not. If it did, why wouldn't high school graduates or dropouts work in software instead of at fast food? The sheer number of languages, frameworks, patterns, etc. makes it far more complex than basic sanitation and time/temperature/acidity rules.
It isn't simple, but that's a matter of choice, not the nature of software. Software is relatively simple compared to other meat-space engineering disciplines. Software engineering is a relatively immature engineering discipline, but it is implicated in enough safety-critical systems these days that it is about time it started maturing.
It will be painful but I welcome more software regulatory standards, because it is necessary for our trade to mature.
On what basis do you make this claim? If you attempt the same degree of optimization in software that is routinely applied in the physical engineering disciplines with some of the most complex system-dynamics problems, such as chemical engineering, you find that the dynamics of software systems are qualitatively much more complex. We expect chemical engineers to design systems that asymptotically approach theoretically optimal efficiency along multiple dimensions. In software we rarely see anything approaching similar optimality, except in small, exquisitely engineered components beyond the ken of most software engineers. In large software systems, the design problem is so complex that computational optimization of the kind routine in physical engineering is completely intractable, so similar approaches do not apply.
In chemical engineering, the measure of system complexity is roughly the size of the system of differential equations that governs the total dynamics of the system. Computers then solve that system, which can be computationally intensive, and we do this routinely, with some caveats. A truly optimal design is not computable, but we can get asymptotically close via approximation.
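That routine can be sketched in miniature. The system below is a hypothetical one (not from this thread): a first-order reaction A → B, whose two coupled ODEs are stepped with forward Euler, a crude stand-in for the industrial solvers the comment alludes to.

```python
import math

def simulate(a0, k, t_end, dt=1e-4):
    # Forward-Euler integration of the coupled ODEs for A -> B:
    #   dA/dt = -k*A
    #   dB/dt = +k*A
    a, b = a0, 0.0
    for _ in range(int(t_end / dt)):
        da = -k * a * dt
        a += da
        b -= da  # whatever A loses, B gains: mass is conserved
    return a, b

a, b = simulate(a0=1.0, k=0.5, t_end=2.0)
# Should track the analytic solution A(t) = A0 * exp(-k*t)
```

The point of the analogy: once the governing equations are written down, the rest is (expensive but mechanical) number crunching.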
In software engineering, the equivalent would be formal optimization and verification of the entire program, which for non-trivial software is completely intractable. Software has so many more degrees of freedom than physical systems that the two aren't even the same class of problem. It is arguable whether it is even possible in theory to achieve the degrees of design robustness and efficiency that we see in physical engineering systems.
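The degrees-of-freedom point can be made concrete with a toy state count (illustrative only): a brute-force check must visit every assignment of a program's variables, and each boolean variable added doubles that space, which is why exhaustive verification blows up so quickly.

```python
from itertools import product

def enumerate_states(n):
    """Exhaustively enumerate every assignment of n boolean variables,
    the way a naive brute-force model checker would. The count is 2**n."""
    return sum(1 for _ in product((False, True), repeat=n))

# Each added variable doubles the state space:
counts = [enumerate_states(n) for n in range(1, 6)]
# counts == [2, 4, 8, 16, 32]
```

At 64-bit integers and real-world program sizes, that exponential makes the "solve the whole system" approach of physical engineering a non-starter.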
Unlike physical engineering, where a computer takes a set of equations and constraints, crunches numbers, and produces an approximately optimal design, no such thing is possible in software.