OK, so let's say you bought a special computer monitor with screen-reading technology built in, so it could read out or describe anything displayed on it regardless of operating system, even a raw video feed. And one day it catches fire and burns your house down.
Most people would think it acceptable to sue the manufacturer of the hardware device. But if using NVDA somehow ended up making your laptop catch fire and burning your house down, is that just tough luck, caveat downloador, etc.?
What if it came out in discovery that the author was previously made aware via numerous emails that their application had a tendency to cause laptops to dangerously overheat, and they chose to disregard the problem? Is that still the consumer's financial and legal responsibility?
(Not saying there's any right answer, just wondering if I understand your position properly.)
EDIT: Just read other comments clarifying that OSS isn't subject to this new directive, so this is a moot issue, I suppose.
Are you from the US? In New Zealand suing is mostly a foreign idea and very rarely occurs.
Occasionally criminally negligent behaviour gets spanked - but even there it's often an idiotic scapegoating farce (local examples: CTV building, fund fraud, Royal Commission of Inquiry into the terrorist attack on Christchurch masjidain).
One alternative system is government insurance against harm e.g. New Zealand has a no-fault ACC system for helping victims of industrial accidents.
OSS is infrastructure, and trying to scapegoat an individual developer or company for unforeseen harm is insanity. Finger pointing and a culture of blame seem to be unproductive.
A good place to start thinking about policy would be to look at log4j. What policy would prevent that? Would a culture of victimising creators have prevented that vulnerability?
> sue the manufacturer of the hardware device [that starts a fire].
Implicit in that is the philosophy that we can use reductionism to find a cause.
Finding a cause is getting harder as the world becomes more complex. Read reports on disasters, then try to imagine how to prevent them. There's an almost Christian religious belief that penalising the person who made a mistake will fix the system.
Cue blaming the pilot. We still often blame the pilot, even after decades of work in aviation safety management to build systems that apply fixes in the right place.