We expect professionals to behave ethically. Doctors, and companies working on genetics and cloning, for instance, are expected to behave ethically, have constraints placed on their work, and face consequences when they act unethically.
Yet we have millions of software engineers building a surveillance society with no sense of ethics, constraints, or consequences.
What we have instead are anachronistic discussions of things like privacy that are oddly disconnected from 300 years of accumulated wisdom on surveillance, privacy, free speech, and liberty, as if to pretend the obvious is not obvious and to delay the need for ethical behavior and introspection. And this from a group of people who have routinely postured extreme zeal for freedom and liberty since the early '90s and produced exactly one Snowden.
That is a pretty bad record by any standard, and it indicates an urgent need for self-reflection, industry bodies, standards, whistleblower protection, and a wider discussion that inserts context, ethics, and history into the debate.
The point about privacy is not you: no one cares what you personally are doing, so the individual perspective has little value here. The point is the infrastructure and the ability to track what everyone in a society is doing, and to preempt any threat to entrenched interests and the status quo. An individual may not need or value privacy, but a healthy society definitely does.
Programmers are just a loosely-defined group of tinkerers, labourers, and the odd scientist or engineer. How do you expect to impose a structure on that? A teenager can tinker around with software in his bedroom and nobody gives a damn. If he were to conduct medical experiments on his little sister, on the other hand, he'd go to jail. That is the difference.
Programmers (as individuals) can't be ethically audited, but what we can do is regulate the data that is allowed to be collected. You regulate it like any other industry. Sigma-Aldrich is a company that sells pharmaceutical-grade precursors. I was dating a girl doing a post-doc in o-chem; while waiting in her office for her to finish something up, I flipped through their catalog. I saw a precursor, heavily flagged by the DEA, that could be used to synthesize massive amounts of a recreational drug. Curious, I asked her about the procedure for procurement, and she delineated it. In short, she could get it fairly easily with a sign-off from the PI and a few other things [she would never do that, she's far too ethical - but her PI was famous enough that a request on his letterhead with "Veritas" on it would have been enough], but there is a chain of custody and an auditing system in place, just as there is with doctors who are issued DEA numbers. If I called up S-A and asked for the same chemical, not only would I be laughed off the phone, but they'd likely submit my information to the DEA to flag me for further investigation.
What am I getting at? You can't regulate people, but you can regulate systems. If that precursor were ordered and that drug happened to pop up, the DEA could easily call up any of the suppliers of those precursors and figure out when it was dispensed. We need to regulate any institution that collects data in the same way. When an institution becomes large enough to collect information at that level, issue compliance terms. In the same way that publicly traded companies have to release financial information to the SEC and comply with numerous reporting requirements (look at EDGAR to see how extensive they are), open up another branch of the government in charge of regulating the companies that collect data. That way, your engineer with loosely defined morals who is capable of doing whatever will be prosecuted just like amoral doctors are.