We expect professionals to behave ethically. Doctors and companies working on genetics and cloning, for instance, are expected to behave ethically, have constraints placed on their work, and face consequences when they behave unethically.
Yet we have millions of software engineers working on building a surveillance society with no sense of ethics, constraints or consequences.
What we have instead are anachronistic discussions about things like privacy that seem oddly disconnected from 300 years of accumulated wisdom on surveillance, privacy, free speech and liberty, discussions that serve to pretend the obvious is not obvious and to delay the need for ethical behavior and introspection. And this from a group of people who have routinely postured extreme zeal for freedom and liberty since the early '90s and produced one Snowden.
That's a pretty bad record by any standard, and it indicates the urgent need for self-reflection, industry bodies, standards, whistleblower protection, and a wider discussion to insert context, ethics and history into the debate.
The point about privacy is not you; no one cares what you personally are doing, so an individual perspective here has zero value. The point is building the infrastructure and the ability to track what everyone in a society is doing, and to preempt any threat to entrenched interests and the status quo. An individual may not need or value privacy, but a healthy society definitely needs it.
Programmers are just a loosely-defined group of tinkerers, labourers, and the odd scientist or engineer. How do you expect to impose a structure on that? A teenager can tinker around with software in his bedroom and nobody gives a damn. If he were to conduct medical experiments on his little sister, on the other hand, he'd go to jail. That is the difference.
Programmers (as individuals) can't be ethically audited, but what we can do is regulate the data which is allowed to be collected. You regulate it like any other industry. Sigma-Aldrich is a company that sells pharmaceutical-grade precursors. I was dating a girl who was doing a post-doc in o-chem; while waiting in her office for her to finish something up, I flipped through their catalog. I saw a precursor that was heavily flagged by the DEA and could be used to synthesize massive amounts of a recreational drug. Curious, I asked her the procedure for procurement, and she delineated it. In short, she could get it fairly easily with a sign-off from the PI and a few other things [she would never do that, she's far too ethical - but her PI was famous enough that a request on his letterhead with "Veritas" on it would have been enough], but there's a chain-of-custody and auditing system in place, just as there is with doctors who are issued DEA numbers. If I called up S-A and asked for the same chemical, not only would I be laughed off the phone, but they'd likely submit my information to the DEA to flag me for further investigation.
What am I getting at? You can't regulate people, but you can regulate systems. If that precursor was ordered and that drug happened to pop up, the DEA could call up any of the suppliers of those precursors and fairly easily figure out when it was dispensed. We need to regulate any institution that collects data in the same way. When an institution is large enough to collect information at that level, issue compliance terms. In the same way that publicly traded companies have to release financial information to the SEC and comply with numerous reporting requirements (look at EDGAR to see how extensive it is), open up another arm of the government that is in charge of regulating the companies that collect data. That way, your engineer with loosely defined morals who is capable of doing whatever will be prosecuted just like an amoral doctor.
I feel like this is too wide. Everyone collects data. I don't just mean that tech companies collect data; I mean, for example, your friends have copies of the emails you've sent them. They have photos of you, of places you've been, with timestamps and GPS coordinates. Your coworkers have access to your calendar. Your mechanic has the service history on your car. Your librarian knows which books you have checked out.
These aren't problematic situations, because they each only have a little piece of your data, and you trust each of those people with that little piece, and if you don't then you don't have to give it to them.
The problem is when you don't have that choice. Which is what happens when you're dealing with a government or a monopoly (or some other concentrated market where you can't trust any of the players). You can't reasonably choose not to have your location collected by your mobile carrier, or by the traffic cameras in front of your home. If all your friends use Facebook, then you use Facebook.
But we don't really want to regulate Facebook. I mean holy cow, what is that even supposed to look like?
I think we can separate the problem into two pieces. The first is collection by, let's call it, unavoidable monopolies. Telecommunications carriers and other utility companies. This is where we know exactly what to do, because these entities should not be collecting any information about people at all. There is no reason Verizon needs to know anything about you other than whether you've paid your bill. So regulation here can be useful, e.g. make it unlawful for carriers to triangulate a cellphone's location without a warrant, or collect anything whatsoever about the contents of IP packets. But we also have a strong technical solution here. Encrypt all the things. Fully deprecate HTTP in favor of HTTPS. We need to build, for example, DNS query privacy. Things like that.
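To make the "DNS query privacy" point concrete, here's a minimal sketch in Python of a DNS-over-HTTPS lookup, assuming Cloudflare's public JSON resolver endpoint (any DoH provider would work the same way): the carrier only sees an encrypted connection to the resolver, not which names you look up.

    # Minimal DNS-over-HTTPS lookup: the query rides inside an ordinary HTTPS
    # request instead of a plaintext UDP packet the carrier can log.
    import requests

    def resolve_over_https(hostname, record_type="A"):
        resp = requests.get(
            "https://cloudflare-dns.com/dns-query",
            params={"name": hostname, "type": record_type},
            headers={"Accept": "application/dns-json"},
            timeout=5,
        )
        resp.raise_for_status()
        # Return the resolved addresses from the JSON answer section.
        return [answer["data"] for answer in resp.json().get("Answer", [])]

    print(resolve_over_https("example.com"))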
The other part of the problem is what you might call avoidable monopolies. There is no fundamental reason why Facebook has to be as centralized as it is. You have a phone which has all your photos on it and is connected to the internet 24/7. Why is there a copy of your photos on Facebook's servers? If one of your friends wants to see one of your photos, why are they not getting it directly from you? Then you don't have to trust Facebook with a copy of it. So the solution for this half of the problem is to disintermediate the avoidable monopolies.
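To illustrate the shape of that alternative, here's a deliberately tiny Python sketch, assuming a local photos folder and ignoring the hard parts (discovery, NAT traversal, access control): your own device serves the file, so no intermediary ever needs to hold a copy.

    # Toy "serve your own photos" server: a friend fetches the file directly
    # from your device instead of from a platform's copy of it.
    # PHOTO_DIR is a hypothetical path; real use would need auth and TLS.
    import functools
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    PHOTO_DIR = "/home/me/photos"

    handler = functools.partial(SimpleHTTPRequestHandler, directory=PHOTO_DIR)

    # A friend on your network could now fetch http://<your-ip>:8000/cat.jpg
    HTTPServer(("0.0.0.0", 8000), handler).serve_forever()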