> The Debian statement appears to be based on an earlier version of the CRA.
> It for example says “Knowing whether software is commercial or not isn’t feasible, neither in Debian nor in most free software projects”. Under the CRA there is no need to figure that out for Debian.
> “Having to get legal advice before giving a gift to society will discourage many developers” - the final version of the CRA is clear that if you are giving a gift, the CRA does not apply to you anyhow. There is now a very clear statement on that (see above).
New OSS governance and runtime binary attestation (aka DRM) layers are being defined by the CRA, e.g. only specific attested binaries from open-source trees that follow specific development practices would be allowed to run in critical systems:
Open-source software stewards shall put in place and document in a verifiable manner a cybersecurity policy to foster the development of a secure product with digital elements as well as an effective handling of vulnerabilities by the developers of that product.
… Open-source software stewards shall cooperate with the market surveillance authorities, at their request, with a view to mitigating the cybersecurity risks posed by a product with digital elements qualifying as free and open-source software.
… security attestation programmes should be conceived in such a way that … third-parties, such as manufacturers that integrate such products into their own products, users, or European and national public administrations [can initiate or finance an attestation].
Legal liability and certification for commercial sale of binaries built from FOSS software will alter business models and incentives for FOSS development.

Related:
Dec 2023, "What comes after open source? Bruce Perens is working on it" (174 comments), >>38783500
Important bits (10c and around):
* Libraries/non-end products are fine, unless monetized.
* Employee contributions seem to be fine.
* Foundations seem to be fine.
* Non-core developers are fine.
Seems like a significantly better version.
This article is a good step to explain what has changed.
(I was quite concerned as President of VideoLAN and involved in VLC and FFmpeg, since both projects would have been threatened by previous drafts)
That doesn't seem like what the CRA stipulates. I think it's more about manual attestation in its most traditional meaning, i.e., an organization attesting that X software is secure.
Also, to prevent "dangerous" not-yet-professional amateurs from having a chance against the big vendors.
What is this? Software is more secure than ever.
If we don't want poor regulation, we had better regulate ourselves first.
Bonus: regulating ourselves might fund Open Source. [1]
[1]: https://gavinhoward.com/2023/11/how-to-fund-foss-save-it-fro...
The only way for small actors is to move to... super small and simple tech... and they had better be sure small-tech<->big-tech interop is strictly regulated too, or they will be zapped.
Yep, forget about those grotesquely and absurdly massive and complex web engines...
And now I'm thinking about the hardware... they had better come extra clean.
1. This adds barriers to sell OSS software, which helps solidify existing markets and prevents new competitors from stepping up
2. This won't change anything except forcing projects to waste money on legal BS, when the responsibility should rest solely on the commercial entities USING the OSS software and providing a service with it (and therefore making money)
3. This is only the first step, I'm sure they'll keep adding rules
4. I'm thinking they may have been heavy-handed in the first draft just so that people would think at the end, "oh, phew! the regulators didn't kill ALL OSS software in Europe, great!", without asking why we need this regulation or how it improves ANYTHING
Will it actually improve security? I don't think so.
If someone is paying for commercial support they likely already have security updates and, once vulnerabilities are known to the maintainers, the news spreads.
The security problem with OSS is not that things are not communicated promptly, but that it's hard to make money with OSS so there is no staff working on security.
This would not have saved us from, e.g., the OpenSSL vulnerabilities, and it will make it even harder for $NextOSSOrg to start charging for their product and improving their security.
All commercial software is included; I don't see how (commercial) OSS is somehow special. Did you read the article?
Meanwhile, we're way behind on updating much of our infrastructure and hardly ever check whether any of the open source libraries we use are up-to-date, nor whether they're reliable. I really hope this legislation pushes companies like mine to improve their software development practices, because I'm scared of the future.
CRA can require EU-wide recall of "products with digital elements" which are found to be non-compliant by national market surveillance. While we may analogize this requirement to the recall of slow-moving physical products with rare market withdrawal, software developers and attackers iterate more quickly.
Centralized software distribution like mobile app stores would have the ability to implement a kill switch (recall) on non-compliant products. Products which depend on centralized cloud services could have binaries verified before they are allowed to connect to an API. This would give regulators the tools to rapidly implement software "recalls".
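As a rough sketch of that verification idea (hypothetical, not anything the CRA itself specifies): a cloud service could gate API access by checking a digest of the client binary against an allow-list of attested builds.

```python
import hashlib

# Hypothetical sketch: the service keeps an allow-list of digests for
# binaries that passed some attestation process, and refuses API access
# to anything else. The digest below is sha256(b"test"), standing in
# for a real attested build.
ATTESTED_DIGESTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_attested(binary_bytes: bytes) -> bool:
    """Return True if this exact binary is on the attested allow-list."""
    return hashlib.sha256(binary_bytes).hexdigest() in ATTESTED_DIGESTS

print(is_attested(b"test"))   # True: digest is on the allow-list
print(is_attested(b"other"))  # False: unattested build, access denied
```

In this model a "recall" is just removing a digest from the allow-list, which is what makes centralized distribution such an effective enforcement point.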
(58) … significant cybersecurity risk or pose a risk to the health or safety of persons … market surveillance authorities should take measures to require the economic operator to ensure that the product no longer presents that risk, to recall it or to withdraw it …
(60) … market surveillance authorities should be able to carry out joint activities with other authorities, with a view to verifying compliance and identifying cybersecurity risks of products with digital elements.
(61) Simultaneous coordinated control actions (‘sweeps') are specific enforcement actions by market surveillance authorities that can further enhance product security.

Restaurants, food trucks, consumer electronics, medical devices, clothing, product delivery chains,...
While I get that the new draft has changed it so it doesn't apply if you are a non-profit or just accepting donations (I think?), the biggest problem is that that isn't a great model for OSS anyway.
A much better model imo is charging for a "pro" version with support included and maybe some extra features.
This regulation is likely to totally kill the viability of that model if you need to do expensive security audits.
In part, open-source software arose in response to opaque software.
Can opaque regulation equally govern open and opaque software?
Should open software have open (i.e. continuously evolving in public, not point-in-time negotiated) regulation that can keep up with open development and security research? Much will depend on the operational practices and transparency of national institutions tasked to implement EU CRA.
In a way, I don't think it's that much at odds. If someone comes up with a great open-source project, not to 'give away' as a present in the classic FOSS style, but instead as some sort of funnel to get paying customers (which includes pure support), you're already doing it commercially, and even without the CRA you'd probably be on the hook for doing it right anyway.
Courts and regulators, particularly European ones, understand when there's a "will" to follow the law. It's one of the differences between "rules-based" and "principles-based" regulations.
Here in Eastern Europe we have had fewer and fewer of those over the last 30 years. My favorite cheese maker closed her small shop and started selling direct from home once local authorities started demanding tests and workshop inspections (bribes, really). She's planning to switch to selling the milk directly to one of those big-name supermarket dairy processors soon. Less money but fewer headaches.
Open source liability is coming - >>38808163 - Dec 2023 (218 comments)
Debian Statement on the Cyber Resilience Act - >>38787005 - Dec 2023 (144 comments)
Can open source be saved from the EU's Cyber Resilience Act? - >>37880476 - Oct 2023 (12 comments)
European Cyber Resilience Act [Discussion] - >>37580247 - Sept 2023 (4 comments)
There is only one of those in my whole country. Not much competition there if I'd ever want to do an IPO.
The CRA requires manufacturers to ensure vulnerabilities are handled effectively for the expected product lifetime or 5 years, whichever is shorter.

So exactly like any other regulation.
Not too bad really.
[1] https://eur-lex.europa.eu/resource.html?uri=cellar:864f472b-...
Other good takes in recent regulations:
- Unraveling the EU Digital Markets Act https://ia.net/topics/unraveling-the-digital-markets-act
- The truth about the EU AI Act and foundation models, or why you should not rely on ChatGPT summaries for important texts https://softwarecrisis.dev/letters/the-truth-about-the-eu-ac...
Lots of big players are already providing shit software to millions of customers especially through government contracts because they've hired armies of legal and sales teams, squashing the little guy in the process.
If just providing some small web service built on top of open source now requires hiring a huge legal team, well goodbye to any entrepreneurship.
I know this because I've seen big players win contracts over actually talented people 9 out of 10 times because they can play this regulation game, and I've seen small companies burn hundreds of thousands in consultancy fees over GDPR that made zero difference for a Wordpress setup that a talented coder could have fixed in 10 hours.
That said, the intentions are good, but for some reason the EU thinks small players should face the same extreme measures as Facebook and Google, i.e. the actual reasons this regulation was made in the first place. Bizarre.
As the author states, "regulations are never fun", but this is as good as it gets.
I'm an optimist and hope that this will somewhat dampen the voices on the internet and (unfortunately) on HN that claim the EU is only filled with near evil idiots acting to destroy European industry.
I guess we will see how it goes next time (admittedly my hope is small).
- If you run a commercial kitchen on your own (or, let's say, with a staff of 2-3 people), can you ignore the food safety regulations? The fire regulations?
- If you run a one-man plumbing company, can you ignore safety regulations? Water regulations? Sewage regulations?
etc.
Why is it that when it comes to "commercial software" it is inevitably "oh my god, these laws are so hard, why should I as a one-man company be forced to comply with them"? Because that is literally your job.
Sooo.... Because of that you should be exempt even though you're expecting to sell that software?
How does this make sense?
First of all, most of the software companies do SaaS, meaning they also provide the service. And then, even if they don't, the users will just hand down the paperwork to the companies developing the software. Because those know what was put in, security and components, and want to have this in legal writing.
Secondly, imagine your average IoT seller. They should not be liable for their bad product because they don't run it themselves? "The user" is liable? In most cases the "user" can't even do anything about their insecure device.
I think developers are rightly held responsible here. It's pretty comparable to other industries where products have to be safe when sold; think pharma, food, toys, cars, etc.
> Will it actually improve security? I don't think so.
Think B2C. It will improve things there, and massively so. Software in B2B was already somewhat regulated via audits and certifications.
Following best practices and demonstrating that you care goes a long way (that has been demonstrated time and time again in courts throughout the union).
Also it differentiates between what kind of product you are building (see the annexes).
Most of the requirements (look them up) are best software dev practices unless you are in one of the specific "critical" categories of products.
Then, to be honest I don't really care that you are a one person (commercial) shop when my car gets steered off the road because of a preventable security hole.
That's good to know about as a security consultancy.
Whenever we find an issue in software made by a third-party vendor, we already recommend reporting it and offer to do it for them (unpaid time on our part, but it gets both the finder and our company publicity; if it's left up to the customer it might not happen, which is bad for everyone else). Now we can say it's required and not just a recommendation. And if the customer patches things, we get to check the fix if they give it to us for reporting, which in turn makes them more secure.
For us, the situation doesn't really change, but for the tech industry as a whole I see only upsides (at least of this part) :)
- There are rules, and clear established practices that allow you to follow these rules. In software the rabbit hole goes so deep that your average developer cannot even be aware of all the risks.
- You do not have to rely on millions of lines of code you have no control on.
As a simple example, if you are using network communications, you are probably using OpenSSL, GnuTLS or one of the few other TLS implementations. All of them have regular security issues, and simply selling support on an Open Source software you built using one of them will make you liable for these issues. There is no choice: you need TLS, and you're not going to implement it yourself. What are you supposed to do?
The fact that a solo developer selling €100/month of support is treated the same way as a billion-dollar company demonstrates the complete insanity of this act.
[1] https://eur-lex.europa.eu/resource.html?uri=cellar:864f472b-...
I haven't read the actual legalese in the final version. What kind of responsibility puts an unbearable burden on the average web developer who slaps together a few input fields and does a nice CSS job?
Add: Auditing all the million dependencies in node_modules comes to mind, but maybe it's a good incentive not to.
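For a sense of that audit surface, here is a minimal sketch (assuming npm's lockfileVersion 3 layout, where all installed packages are flattened under a "packages" key) of counting the dependencies pinned in a package-lock.json:

```python
import json

def count_locked_packages(lock_text: str) -> int:
    """Count installed packages listed in a package-lock.json (v3 format)."""
    lock = json.loads(lock_text)
    # The "" key describes the root project itself, so skip it.
    return sum(1 for path in lock.get("packages", {}) if path)

# Tiny inline example; a real lock file routinely lists hundreds of entries.
example_lock = json.dumps({
    "name": "my-app",
    "lockfileVersion": 3,
    "packages": {
        "": {"name": "my-app"},
        "node_modules/express": {"version": "4.18.2"},
        "node_modules/left-pad": {"version": "1.3.0"},
    },
})

print(count_locked_packages(example_lock))  # → 2
```

Run against a typical web app's lock file, that count is what would need auditing, which is the commenter's point about the incentive to keep it small.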
But they're not treated the same way, both by the law itself and by the standards courts and regulatory agencies use throughout Europe.
In other words, an array of mini-dieselgates.
As I read it, and with the caveat that the exact requirements are not yet determined: You will need a SBOM stating you use openssl, and how you plan to update openssl if it contains security bugs.
Update: found it, paragraph 46a: In relation to small and micro enterprises, in order to ensure proportionality, it is appropriate to alleviate administrative costs without [...]
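For illustration only — the CRA doesn't mandate a particular SBOM format, and the field names here follow the CycloneDX convention as an assumption, with an illustrative version number — a minimal SBOM declaring that OpenSSL dependency might look like:

```python
import json

# Hypothetical minimal CycloneDX-style SBOM declaring an OpenSSL
# dependency; the version and purl are placeholders, not real advice.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "components": [
        {
            "type": "library",
            "name": "openssl",
            "version": "3.0.13",
            "purl": "pkg:generic/openssl@3.0.13",
        }
    ],
}

print(json.dumps(sbom, indent=2))
```

The update plan would live alongside this in the vendor's vulnerability-handling policy; the SBOM itself just declares what is inside.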
Look at everything that is included (VPNs, OSes, anything related to security,...). This regulation forces a full declaration of the identity of the editor/manufacturer and more. Any product not under the control of the authority will be illegal.
The regulations are designed to deepen the software moat, and security theater, and I say this as an InfoSec professional.
IANAL but Annex III Class 1.2 states: "Standalone and embedded browsers" which would implicate every electron app. Class 1.5 states: "Products with digital elements with the function of virtual private network (VPN)" is so vague it could apply to video game chat messages.
The problem with regulations like this is they're so vague and will be selectively enforced. They won't affect Big Corp but will affect small business and solo developers.
I don't follow how rules for software with VPN functions could apply to a video game chat, but as with all laws, intent and interpretation matter. Successfully convincing a judge that your game chat is a Class I critical product is unlikely.
I also don't think that the CRA is too vague. Rules that are too specific will just be circumvented. Enforcement works like any other market rule. You can sell all sorts of non-compliant products in the EU but if you are found out you pay a fine. It won't be any different with the CRA.
Not to mention, in this very case the "whole perimeter" does include the client program (the OS is tied to the hardware), aka a Big Tech web browser. And since that is not small tech (which would be noscript/basic (x)html), this will de facto exclude anything that is not Big Tech for most "legal" projects that want some ultra-heavy and fancy "web". Because nearly 100% of project managers out there won't take the risk anymore with such an act.
Yep, those who are not Big Tech better be ready to REALLY, and I mean REALLY get close to metal and use REALLY small and lean tech, and namely to do NOT use Big Tech open source web software (blink|geeko/webkit+SDK).
This is weird because that will kill economically any attempts at Big Tech alternatives, ALL OF THEM.
Big Tech is BILLIONS OF $ OF CASH WITH THE BACKUP FROM INVESTMENT FUNDS WORTH TENS OF THOUSANDS OF BILLIONS OF $: THERE IS NO FG&* ECONOMIC COMPETITION OR ANYTHING, WORLDWIDE AND THEY GET EU WIDE LAWS ONLY FOR THEM???
The first thing is to get ultra hardcore regulation on small tech<->big tech interop, and I really mean _small_ and _lean_ tech (the second you have Big Tech web engine or a massive SDK with an ultra complex language, you are done for).
Not to mention, EVERYBODY KNOWS COMPUTER SECURITY IS A FANTASY: IT DOES NOT EXIST, IT IS ONLY A PROCESS, NOT A DELIVERABLE WARRANTY. And as far as I know, metrics to know whether the "process" was good enough do not exist, and in such a complex system it is just BS.
Regulations can make sense for software that could cause physical harm - like the software in an implanted medical device - but most software doesn't fall in that category. The CRA is about "security" not about "physical harm" - they are two different things. Regulations for the latter would likely receive less pushback.
I have mixed feelings about the CRA, but I am satisfied with the FOSS protections. I wish it allowed for more commercialization though, not just donations.
As for commercial work, it's good to have a lighter regime for small, low-risk products, but it's still a lot of head-scratching and uncertainty on our part. Ditto for independent HR and payroll systems, as they aren't low-risk. I wonder if the VPN/VM setups they always include count toward the security of the app? Again, more work figuring that out.
Around me, meaning the DACH countries, the Iberian Peninsula, and some Mediterranean countries.
Even the fact that you have abundant clean water and good food that you can enjoy in your electrified and heated house and you can order an overnight delivery for hundreds of things that will just fit and/or work in your house is the direct result of thousands of regulations.
Your distinction is meaningless.
If you cook for your friends, but then decide to open a commercial kitchen, do you think you will be exempt from food safety regulations?
No. No, you haven't. GDPR was literally a non-issue for micro companies, because all micro companies had to do with GDPR is not gather data they didn't need.
Same here: all you'll need to do is the due diligence you should already have been doing to begin with.
Lots of tiny businesses on that list too. Also a bunch of local governments, weirdly.
Feels like if we’re at kebab shop levels of granularity for 88 pages of rules governing the entire planet, “a lot of work” is unavoidable, no?
So if you wanted to release an open source product, but try to monetize it in some way by providing extra services on top of it (i.e. backup / sync across devices service), this totally applies?
What if the open source product is used as a marketing asset of a commercial product but otherwise is not commercial by itself?
A recall was issued therefore there is already regulatory oversight where it counts. The CRA is at best redundant and at worst a prime example of regulatory capture [1].
I wish people would actually read the links they post.
That "poor kebab shop" was fined for this:
--- start quote ---
CCTV was unlawfully used. Sufficient information about the video surveillance was missing. In addition, the storage period of 14 days was too long and therefore against the principle of data minimization. Addendum: Fine has been reduced to EUR 1500 by court,
--- end quote ---
GDPR is there only because of the data storage. Illegal CCTV is covered by different laws that, in a twist that should surprise no one, you shouldn't break even if you are a kebab shop.
The actual first business listed there is a "betting place", and it was fined for illegal use of CCTV, too.
> Also a bunch of local governments, weirdly.
It's not weird. It's how laws are supposed to work: governments are not exempt from them.
Just because a recall was issued doesn't imply that there's regulatory oversight. And even if oversight exists in that particular case, that doesn't mean it's applicable to other areas.
What it does mean is that your attempt to paint software as exempt because it "doesn't lead to food poisoning" is weak and uninformed at best.
> The CRA is at best redundant
It's not
> at worst a prime example of regulatory capture
Again, it's not.
Just because you engage in FUDing, doesn't make your words true.
First you tried to pretend that software is somehow different because it "doesn't do any physical harm".
I addressed that directly with a very specific example of physical harm.
(Besides, there are many more concerns beyond just physical harm, and my example of food poisoning was just an example that you must follow safety regulations even if you're a "one-person" company)
So your next counter-claim was a non-sequitur that "since it was recalled it means that there are regulations" which doesn't make sense even logically, which I addressed as well.
And the rest is just unsubstantiated claims that the law is redundant at best and bad at worst which is pure FUD.
How's that for good faith argument?
With this, I remove myself from this discussion. Adieu.
You're trying to carve out an exception for yourself specifically because you assume your special case is too special.
1. Laws don't usually work that way
2. There are innumerable cases when "innocuous" software is used as an attack vector precisely because "we don't do nothing why would we keep our software secure"
3. In EU you're safe until you really screw up. More discussion in this thread: >>38819780
The mind boggles.
By definition regulation adds restrictions and obligations making their life a little (or a lot) harder and closing down the ones who'd rather focus on making stuff than deal with bureaucratic rules.
Still waiting for an example of a regulation which directly resulted in more of the thing it regulated.
The way I see it, the CRA will open a new page in the history of OSS.
OSS will now divide into two parts: software for fun (maybe also for education, arts, and science; more on that below), and serious software with liability.
I'm pretty sure, for example, that ISPs and big tech companies using and contributing to OSS will become much stricter about what they allow in PRs and what they use as dependencies.
As for education: EU policies for toddlers/teenagers are stricter than those for adults, so restricted teen versions of smartphone software could appear, and one of the restrictions could be allowing only liable software.
Also, some business entities will be prohibited from using software that avoids liability, so most current OSS could become prohibited for them (because many dependencies have no liable party).
It's hard to predict exactly, but I'm sure there will be restrictions in air/space and transportation; in manufacturing involving dangerous substances and dangerous environments; and maybe restrictions on HoReCa businesses.
Science and the arts usually have exceptions from many restrictions; they are allowed to freely use things that are copyrighted or prohibited for public/commercial/wide usage, like Nazi symbols, but only as much as is "enough to explain the idea" or "enough data for research", nothing more.
That's easy. From a legal point of view, all software without a liable owner, except for the science/arts cases mentioned, will be prohibited for all businesses working in restricted markets.
To the list above, one could add (cellular) communication companies, energy (electric or gas) companies, water utilities, and other critical infrastructure, medical, and emergency services.
And from democratic experience, ANYBODY could become a steward; you just need to claim responsibility and maybe go through some bureaucratic procedures to prove your ability to be liable.
From this, my conclusion: if NOBODY claims to be the steward for some software, it will automatically become prohibited for the businesses mentioned.
Because OSS organizations just weren't created for this.
Next, I'll talk about the typical OSS org, not some subsidiary of a commercial corporation like Apple, Google, or Microsoft (or any other FAANG member, or whatever it's called now).
There exists a huge number of unregistered tiny OSS producers who do it just for fun.
Some OSS producers become medium-sized entities (mostly non-profit), and some even large ones.
But they don't intend to do this for money! It's just a hobby, even when that hobby brings rock-star-like popularity.
And from what I've seen of the internal regulations of OSS projects, they usually avoid liability at any cost.
This is a problem even in large commercial entities, but in a non-profit it's just a nightmare: nobody wants to be responsible.
Fortunately for them, modern bureaucracy provides some methods to avoid DIRECT responsibility: they use a Board of Directors; they mimic direct-democracy methods, conducting plebiscites on all important questions; and they use all the other tools of big business to avoid direct decision-making and responsibility.
Well, in the past, when software was not really important, all of this was totally normal. But unfortunately, these large decentralized entities are uncontrollable, in the sense that they can maintain for a long time the course set when the decentralized structure was built, but it is impossible for them to reform that structure and turn it another way to match a changed environment.
And when I said commercial entities have the same issues: yes, they are literally the same, with just one difference. Commercial entities are usually made to turn a profit, and money is not just profit; it is an equivalent of resources, meaning reserves, which the CEO of a commercial entity can direct toward building a new structure that matches the changed environment. Commercial entities are also very often centralized, with a powerful, responsible CEO who, after building the new structure, can fire the members of the old one (this is just impossible in nearly all OSS projects, as they usually have distributed ownership).
A few words about the OSS subsidiaries of large commercial entities. The difference is that, while they also like to play democracy games, all the money remains in the hands of the parent entity, and those are extremely powerful levers.
When, for some reason, an OSS subsidiary becomes uncontrollable, or the parent entity simply decides it will be cheaper to create a new subsidiary than to reform the old one, they just create a new subsidiary and structure it as needed.
This is really easy for them, because it's normal for commercial entities to have a process division (department) that constantly modifies the entity's internal regulations to match the current CEO's vision. From what I hear, it's typical for modern entities to rebuild their structure every 1.5-2 years.
And yes, sure, there will be a transition from the current state to a new one. And who knows, it could be like Y2K.