We expect professionals to behave ethically. Doctors, and companies working on genetics and cloning for instance, are expected to behave ethically and have constraints placed on their work, with consequences for those who behave unethically.
Yet we have millions of software engineers working on building a surveillance society with no sense of ethics, constraints or consequences.
What we have instead are anachronistic discussions on things like privacy, oddly disconnected from 300 years of accumulated wisdom on surveillance, privacy, free speech and liberty, that pretend the obvious is not obvious and delay the need for ethical behavior and introspection. And this from a group of people who have routinely postured extreme zeal for freedom and liberty since the early 90s and produced exactly one Snowden.
That's a pretty bad record by any standard, and it indicates the urgent need for self-reflection, industry bodies, standards, whistleblower protection and a wider discussion to insert context, ethics and history into the debate.
The point about privacy is not you; no one cares what you personally are doing, so an individual perspective here has zero value. The point is building the infrastructure and the ability to track what everyone in a society is doing, and to preempt any threat to entrenched interests and the status quo. An individual may not need or value privacy, but a healthy society definitely needs it.
However, I'd also like to see general software development think more closely about the role it has in normalising these things. Next time you start to create an account system for your project, ask yourself whether you really need it. Could you engineer around it, perhaps by letting the user store their data, or by using a stored key to identify them? Let's go beyond "don't store what you can't protect" and aim for "don't store what you don't strictly need."
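To make that concrete, here is a minimal sketch of the stored-key idea in Python; the file path and header name are made up for illustration, not a prescribed design:

```python
# Minimal sketch: identify a returning user by a random key stored on their
# own machine instead of an account tied to an email address or profile.
# The file path and header name below are illustrative assumptions.
import os
import secrets

KEY_FILE = os.path.expanduser("~/.example_app_key")  # hypothetical location

def get_or_create_client_key() -> str:
    """Return an opaque random key kept only on the user's device."""
    if os.path.exists(KEY_FILE):
        with open(KEY_FILE) as f:
            return f.read().strip()
    key = secrets.token_urlsafe(32)  # unguessable, carries no personal identity
    with open(KEY_FILE, "w") as f:
        f.write(key)
    return key

# The client sends the key with each request, e.g. in an "X-Client-Key" header,
# and the server stores data keyed by a hash of it -- no name, email or
# password table needed, so there is far less to leak or subpoena.
```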
Pretty compelling talk, culminating in:
> I believe there should be a law that limits behavioral data collection to 90 days, not because I want to ruin Christmas for your children, but because I think it will give us all better data while clawing back some semblance of privacy.
The government can get your Gmail, Facebook, Verizon and Amazon data because those companies keep that data about you. The NSA doesn't need to spy on you; Google already does. I don't think the NSA is reading my email, but I know Google is.
Not to mention that when all these tech companies are spying on you for profit, your privacy is already destroyed.
Or companies that deploy AdSense or otherwise depend on companies like Google or Facebook.
The idea that we could get the majority of the industry to agree on ethics is pretty far-fetched when a large portion think surveillance is making their country safe.
And now Microsoft decided they also want a piece of the pie.
The kicker? I see people still defending Google all the time, nowadays with bullshit arguments like "I am tired of this 'you are the product' meme", and still excusing Facebook because they need it to keep in touch with others. And they found startups based on advertising and tracking, they work for them, and they generally support analytics as an inalienable right of software development.
They understand fully that their data is collected and they expect nothing less than the top result of their Google, Amazon, and Facebook queries to match exactly what they are looking for.
Programmers are just a loosely-defined group of tinkerers, labourers, and the odd scientist or engineer. How do you expect to impose a structure on that? A teenager can tinker around with software in his bedroom and nobody gives a damn. If he were to conduct medical experiments on his little sister, on the other hand, he'd go to jail. That is the difference.
We by no means know for certain how the fetishization of technology asserts itself in the individual psychology of particular people, or where the threshold lies between a rational relationship to it and that overvaluation which ultimately leads to a person who devises a train system to bring the victims to Auschwitz as quickly and smoothly as possible forgetting what happens to them in Auschwitz.[0]
In short: the fetishization of technology makes its creators forget for which purposes their wonderfully efficient tools will finally be put to use.
[0] Erziehung zur Mündigkeit, p. 91
Programmers (as individuals) can't be ethically audited, but what we can do is regulate the data which is allowed to be collected. You regulate it like any other industry. Sigma-Aldrich is a company that sells pharmaceutical-grade precursors. I was dating a girl who was doing a post-doc in organic chemistry; while sitting in her office waiting for her to finish something up, I flipped through their catalog. I saw a precursor, heavily flagged by the DEA, which could be used to synthesize massive amounts of a recreational drug. Curious, I asked her the procedure for procurement, and she delineated it. In short, she could get it fairly easily with a sign-off from the PI and a few other things [she would never do that, she's far too ethical - but her PI was famous enough that a request on his letterhead with "Veritas" on it would have been enough], but there's a chain of custody and an auditing system, just like there is with doctors who are issued DEA numbers. If I called up S-A and asked for the same chemical, not only would I be laughed off the phone, but they'd likely submit my information to the DEA to flag me for further investigation.
What am I getting at? You can't regulate people, but you can regulate systems. If that precursor was ordered and that drug happened to pop up, the DEA could easily call up any of the suppliers of those precursors and figure out when it was dispensed. We need to regulate any institution that collects data in the same way. When an institution is large enough to collect information at that level, issue compliance terms. In the same way publicly traded companies have to release financial information to the SEC and comply with numerous reporting terms (look at EDGAR to see how extensive it is), open up another branch of the government that is in charge of regulating the companies that collect data. That way, your engineer with loosely defined morals who is capable of doing whatever will be prosecuted just like amoral doctors.
Does anyone remember that angry email they sent 5 years ago where they were criticizing their boss? Google does. What kind of profile can you build from thousands and thousands of such emails, messages and queries, and location data and pictures, videos, actions on social networks?
I think some companies have a better idea about who some people are than those people themselves.
But more importantly: sure, there are people who think they deserve to be mistreated, there are people who are drug addicts to the point of barely being anything else and would still fight anyone who gets between them and their dealer, and of course there are plenty of people who have no problem with all sorts of messed up things, up to and including murder, as long as they themselves are not on the receiving end of it. Yet even if 99% of all people regressed to that station, that wouldn't do one bit to diminish my own human rights. That some or even a lot of people are fine with certain things, whether they understand them "fully" or, which I find more likely, "not in the least", is the problem, not the solution.
Driven to the extreme: the right of people to do what the White Rose did will always outweigh the right of people to not be part of the White Rose. It's dissidents and persecuted minorities who define the boundaries of these things, not the people who are living in comfort in exchange for not standing up for anything or against anyone. They exist, and their opinion matters as a problem to be solved or worked around, but that's the extent of it. Some things cannot be justified by anyone agreeing to them; people do not have that power even when numbering billions.
[0] https://en.wikipedia.org/wiki/Data_Protection_Directive
[1] https://en.wikipedia.org/wiki/International_Safe_Harbor_Priv...
But I think it's already too late. The genie is out of the bottle and it's already doing its darker things in many parts of the world.
One of the great things about technology is that it knows no boundaries. But that is also its biggest danger. Because powerful technology in the hands of weak people can lead to disaster.
Intelligence is the ability to create something from nothing, while Wisdom is the ability to choose (wisely) how to apply intelligence to reality - what to create and what not to create. If intelligence is the engine, then wisdom is the driver.
We have trained lots of engines but very few drivers. And that is what worries me most.
Pretty much. I don't use Facebook at all, and give in to Google only on technical searches (which DDG still isn't good at), mapping, and when forced to by work (GCE etc.), so you're preaching to the choir, but let's not try for an overnight coup here.
Magically, should a bill/resolution be introduced to the floor and not be stomped on immediately, enforcing it internationally would be about as difficult as, say, enforcing international oil embargoes or a ruling by the ICC (i.e., nearly impossible - you don't see any proceedings against Cheney or Rumsfeld for war crimes in the Hague, now do you?). Domestically, however, the US has (or had, historically, from say 1930 until the mid 90s) the economic/political influence to enforce its agendas fairly effectively. The new US gov't entity formed would have to have the intent to limit data collection and then exhibit the willingness to penalize institutions for violating those data collection policies (e.g. similar to an FDA fine issued to a multinational drug company with a presence in the US).
Again, there are too many financial interests opposed for this to happen, but refusal to adhere to such legislation would mean (in theory) loss of US business, which would be catastrophic for most industries. HackerNews user:grellas (or was, I haven't seen him post in a couple of years now) is an attorney specializing in tech affairs who'd be able to give a better response, but from a strictly political POV, even domestic legislation limiting data collection would never occur.
I don't think there's any need to rehash the debate here. Simply, I and many others do not believe that any western government is going to use information gathered by tech companies to preempt threats to entrenched interests and the status quo. I've seen the same arguments made here for years, and none of it is convincing.
It's admirable that you are so certain in your beliefs. If you don't like what the tech sector is doing, please by all means continue to advocate. Shout it from the mountain tops, go to work for the EFF. But don't discount people that legitimately disagree with you as being irresponsible. At least some of us have made the effort to understand your point of view. The least you could do is to try to understand ours.
So, in regard to privacy: treat it like property. Governments which don't support good property rights are not going to care one whit about privacy, and those who come after privacy will eventually tread on property rights.
People need to understand it as something to be protected just as you would your physical stuff and work towards having it treated similarly in government.
Which sector is building startup after startup for data mining, tracking, and building profiles? This in addition to the already established companies. Then you're trying to downplay the issue to trivial actions such as Facebook likes or tracking of IP addresses, a toy version of the state of the art. Finally, the sarcasm, showing how reasonable you are and putting the OP in a bad light for not being "more understanding".
It's quite simple: the topic of privacy is central to a free society and it's enshrined in the Universal Declaration of Human Rights. In the past, we have seen a rich history of abuses, lies and deceit from huge organizations with massive resources at their disposal. Private or not.
The majority of people go on with their lives without caring, as long as they have their basic needs met. The very few that take a stand pay the price. Otherwise, some criticism of the behavior of these organizations can be found online, but not much, because of:
1) Chilling effects. Funny how I had to think before posting this message, living comfortably in a democratic country, with freedom of thought and freedom of speech.
2) "Helpful people", quick to jump to the defense of said organizations, explaining away abuses, making up excuses, muddying the waters, asking for fairness and understanding their point of view.
So thanks for keeping the balance, karmacondon. They might have mountains of money, lawyers, shills, PR people and most resources imaginable really, BUT we wouldn't want to unfairly hurt their feelings. I do apologize for that.
Yes, I believe we should have a "Hippocratic Oath" ([1]) for technology workers.
Why not? You may disagree, but that doesn't mean you can't be flat-out wrong. Having an opinion does not automatically give that opinion equal weight when history has proven to us again and again that that particular opinion ends up making society either dangerous or at a minimum uncomfortable.
I'm sure there were border guards in former East Germany that were entirely convinced that their state was the greatest and that's why they had to keep people in at all costs, including shooting them if they persisted in believing otherwise and tried to simply leave. After all, that was best for them. But that particular opinion turned out to be very wrong in the long term.
People can rationalize the most absurd stuff to themselves and to others, especially when their pay-check depends on it, but that's not a requirement.
All those who try to pretend that there is some kind of 'reasonable disagreement' possible about the erosion of privacy, and who directly and indirectly help to rush in the surveillance state, have quite possibly not thought as carefully, or considered these things with the degree of gravity required, as they claim they have. Having a mortgage to pay may factor in there somewhere too.
Usually this is a combination of being too young, too optimistic and in general living too sheltered a life to know what can happen to you, your family and your friends when the tide turns. And the tide always turns, nothing is forever.
> Simply, I and many others do not believe that any western government is going to use information gathered by tech companies to preempt threats to entrenched interests and the status quo.
I hope you're right but history is not on your side in this case.
> I've seen the same arguments made here for years, and none of it is convincing.
Yes, it isn't going to convince you any more than that border guard would be convinced that his job is a net negative to society. Every stream, no matter how reprehensible, will always have its fans and cheerleaders. And later on they will never remember that they had agency all along and were perfectly capable of making a different decision. Responsibility is weird that way.
> It's admirable that you are so certain in your beliefs.
It is not admirable that you are so certain in yours. May I suggest a couple of talks with some Holocaust survivors to get a better feel for what the true power of information can get you?
Or maybe the family members of some people that were killed while trying to flee the former SovBlock?
Or maybe some first generation immigrants to the US or Canada or wherever you live to give you some eye witness accounts on what it was like to live in those countries before the wall fell down?
'It can't happen here' is an extremely naive point of view.
http://jacquesmattheij.com/if-you-have-nothing-to-hide
Agreed with your advocacy advice.
> The least you could do is to try to understand ours.
That's 'mine' not 'ours', you speak for yourself.
NSA, GCHQ, BSI/BND, etc. aren't the "bad guy" in theory.
It's within a nation's interest to, within the extent of law and respect for human rights, try as thoroughly as it can to know what's going on in the world. Electronic intelligence is part of that, and a growing part.
In practice, the permissive reactions many/most/all governments have to allegations (or proof) that a comms intel agency has broken the law, that's what the trouble is. That these groups have been allowed to break the law or ghostwrite laws that allow them to violate what would generally not be approved by a citizenry, that needs to be addressed.
I'm not sure how ruining the careers of software developers and computer scientists who've worked for these organizations does anything other than remove from circulation some brilliant members of our community.
Ostracize the middle managers, bureaucrats, politicians that allow the trampling of our rights.
But don't arrest the guy designing the home theater system for El Chapo's vacation house and tell me you've taken down the Mexican drug cartel.
You talk about it like it's necessarily a bad thing, by default, for everyone. Why?
It's simply hard to take your stance as one made in good faith.
The US government has a long history of using its national police, the FBI, to infiltrate and subvert domestic political movements that the powers that be found unpleasant -- including using their police powers against modern groups such as the Occupy movement.
Further, we know that the US government has used records held by tech companies to create massive cross-referenced databases of people, including domestic activities. The recent leaks about surveillance programs have made that abundantly clear.
Your position is literally that an organization with a history of doing this kind of activity won't use the technology we already know the government possesses to keep doing the same thing.
So I think there is a need for you to rehash the debate here, because it's not clear how you sincerely hold that position.
Because rather than a rational view, what you describe sounds like irrational denial.
"We do not know how to determine how the technology fetish in individual people leads to the point at which a rational relationship changes into one of over-valuing, which eventually leads to someone designing a train system to get the victims as fast and smooth as possible to their destination in Auschwitz, but who forgets what it is that happens to them once they arrive there"
I understand what you're saying, and I think I get where you're coming from. But like the GGP post, you're begging the question and assuming that your beliefs are so correct that anyone who disagrees with them must be insincere.
I don't think there's anything inherently wrong with the government monitoring potentially criminal groups or building databases. That's what we pay them to do. If they get out of hand then we, the people, will deal with it.
The people that I know in real life who hold views like these are best described as scared, rather than ignorant. They feel that the price they pay is a small one as long as it gives them an unspecified increase in perceived security in return.
Fear is a very powerful tool when it comes to getting people to choose against their self-interest.
This sounds more like an uncompromising proclamation than a thorough analysis.
It's not about mistakes. Mistakes are - usually - a sign that someone needed to learn. They do not as a rule include wanton intent.
And if a person were to make too many mistakes then they probably should not be trusted.
> I understand what you're saying, and I think I get where you're coming from. But like the GGP post, you're begging the question and assuming that your beliefs are so correct that anyone who disagrees with them must be insincere.
No, that's the opposite. You have beliefs that you state are so correct that they stand on their own, in spite of a bunch of historical evidence to the contrary, starting roughly at the time that we invented writing going all the way into the present. That's a pretty gullible position.
> I don't think there's anything inherently wrong with the government monitoring potentially criminal groups or building databases. That's what we pay them to do. If they get out of hand then we, the people, will deal with it.
Potentially criminal groups: everybody.
You're apparently one of the people where the 'fear' button has been pressed, don't let your fear get the better of you.
Btw, I note that you write all these 'reasonable disagreement' things from the position of an anonymous coward which makes me think that maybe you do realize the value of your privacy after all.
The ancients had it as 'power corrupts'. The abuses are plentiful, and the fact that every company engaging in these practices (and the government agencies as well) does so ostensibly to make our lives easier or keep us 'safe' is very well known and advertised. If you have evidence to the contrary feel free to share it, but that's where we currently stand.
( http://acm.org/about-acm/acm-code-of-ethics-and-professional... )
I can't think of a case where stable and mature democratic bureaucracy has ever used surveillance to influence the majority of its populace. Germany in the early 20th century was a very unstable government in a bad economic situation. Soviet East Germany was communist, which isn't quite the kind of democracy that I meant. It's true that any government could turn bad, in the same way that anything is possible. But there's very little evidence for that in the current context.
So my position is this: Given that I live in the United States in 2016, I'm not worried about the government randomly deciding to screw with me by looking at my electronic communications and acting on them. It just doesn't make sense. I'm not significant relative to the scale of the US government, the government itself just doesn't work that way and all of the negative scenarios I've heard seem to be very contrived.
If you really think that it's possible that the government of a modern western nation could turn into communist East Germany, then it seems like your problem might be with governance, not privacy. If it's possible for the government to go all Walter White and just turn evil overnight, then no amount of personal privacy is going to save any of us. And until it seems like that's a thing that's actually possible, I'm going to make practical decisions about my own privacy.
A better paraphrase would be "We should suspect that the US government will act in a way similar to how it has acted repeatedly over the span of decades."
I think this is a perfectly fair standard, and actually am held to that standard all the time, including professionally. If I had a continual, systemic habit of flaws in my work, for instance, I would be fired.
Your phrasing suggests that these are things that "just happen", instead of a pattern of decades of intentional programs with the same kinds of aims and behaviors.
> But like the GGP post, you're begging the question and assuming that your beliefs are so correct that anyone who disagrees with them must be insincere.
I actually think you're insincere because you're minimizing and denying a pattern of sustained behavior as a few mistakes, rather than an intentional, continuing program.
That insincerity can be directly seen above when you switch from "did things I didn't agree with" to "made mistakes". No one is talking about the US government making mistakes, and decades of intentional programs operated with similar strategies is hardly "making mistakes".
Your entire analogy was insincere and meant to elicit an emotional response.
> If they get out of hand then we, the people, will deal with it.
Will we?
I'm actually very skeptical that we'll deal with it in any meaningful way, and find it much more likely that we'll surrender a great deal of control over the country to an autocratic government with a good social control program, precisely because people like you don't want to sincerely discuss the likelihood of that happening by stages.
The ability of power or authority to lock you up, take your property, or worse, your life, is checked by the rule of law and due process. Having a debate on the rule of law or due process is similar to having a debate on privacy or a surveillance state: the consequences are negative for the individual and for society as a whole, even though they may benefit some stakeholders in the short term, who will of course advocate for them; on the whole it's not a social good.
The only thing we have to come to this conclusion is history, a wide body of knowledge and reason.
We can thus say with some degree of confidence that a society without the rule of law or due process is not a good thing, just as a society with surveillance is not a good thing. We don't use the 'moral high ground' but reason and historical experience to reach these conclusions. This is not a moral issue but a practical one that has consequences for our societies. The ethical issue is the social good for the people who build these systems.
Since we are discussing the social good, the alternative view needs to be backed by reasoning on how surveillance can be good for society as a whole, beyond naive presumptions that people are good and will not abuse the power, or arguments that knowing the details of everyone's activities may be beneficial to an individual or company; while that may be true, it does not address the social good.
And the only thing we use in these discussions is reason, let's not make it personal.
While this may be true, certain crimes are seen as worse than others. And, as un-PC as it may sound, certain demographics are many times more likely to commit certain crimes.
Homicide: http://www.cdc.gov/mmwr/preview/mmwrhtml/mm6227a1.htm
Also, some government monitoring can be "for your own good":
Youth Risk Behavior Surveillance: http://www.cdc.gov/mmwr/preview/mmwrhtml/ss5704a1.htm
But, maybe the CDC is different than the NSA.
Let's assume for the sake of argument that the above events are unlikely though. When a few actors have access to the information of tens of thousands to billions of people, this has an impact on a societal level. As jacquesm said, information is power, and when one has so much information and lots of money to boot, they can begin to covertly influence policy and behavior and harass and marginalize their opponents. And they can do that directly, or by using the information of a third party, like a doctor, lawyer, religious leader, or even someone insignificant who happens to be a relative, etc. Moreover, companies can be sold, together with their databases, they can be forced to hand them over, or they can be hacked. A treasure trove of data held by an otherwise principled company might end up in the hands of an unsavory party.
Why is this a bad thing? History has shown again and again how such imbalances of power are abused. Here's a rather harmless example of data mining a mobile device + social network combined with social engineering to scam people out of money: http://toucharcade.com/2015/09/16/we-own-you-confessions-of-... If a game producer can do this, what are the pros doing?
They don't have to. History only happens once; if you refuse to learn from history because it is not an exact repetition of the past, then you can never make any progress.
> Saying "A thing happened in the past" can be instructive, especially to people who didn't even realize that the thing was possible.
You seem to think it isn't possible because of "insert magical reason why everything is different now here", not just that it can't happen again for whatever reason. That's an impossible position to argue with. All the weight of history would not be able to sway you from that position because nothing can counter magic.
> But what's much more practically useful is to say "I think what happened before will happen again in the current context and for these reasons". An example from the past is only useful if it can be tangibly connected to the current situation, right now in the present.
No practical examples will counter your magic. You will either say 'that's not the same exactly' or 'that's too long ago to be relevant' and so on.
The only thing that will convince you is when you're lifted out of your bed at 3 am and we never hear from you again. By then it will be a bit too late, but you too will be a believer in government abuse if and when that happens.
Until then you're going to head straight for the last stanza of Martin Niemoeller's most quoted lines. The vast majority of the people living in the former DDR were never lifted from their beds at 3am for interrogation. To them life was just a-ok.
> I can't think of a case where stable and mature democratic bureaucracy has ever used surveillance to influence the majority of its populace.
That's not a bad thing per se. Meanwhile, you're trying hard to change that number from '0' into '1' by allowing the present level of abuse to spread unfettered, which invariably leads to an escalation. Each and every click that you hear is one of a ratchet; it will not voluntarily click back again, it can only go forward, until on that scale between '0' and 'police state' you've gotten close enough to 'police state' that there is no relevant difference.
It can't happen here is a very dangerous line of thought. See the movie 'The Wave' for some more poignant illustrations of how that thought is a dangerous thing all by itself. It can happen here, it might happen here, and it likely will happen here unless we're vigilant.
> Germany in the early 20th century was a very unstable government in a bad economic situation.
ok
> Soviet East Germany was communist, which isn't quite the kind of democracy that I meant.
Yes, and like that there will always be one last thing that is not quite the same which will allow you to look the other way.
> It's true that any government could turn bad, in the same way that anything is possible.
I would consider that progress, hold that thought.
> But there's very little evidence for that in the current context.
That depends on where you are looking. There is plenty of evidence that pressure is being applied, but the pressure is applied subtly enough and in places far enough away from the focal points where change is effected that you'd be hard pressed to connect the dots. That's the beauty of having a lot of information at your disposal.
A nice example is the Iraq war, the run up to that saw massive world wide resistance in the populations of the countries of the 'coalition of the willing' whereas later on this was described as the coalition of the 'gullible, the bribed and the coerced'.
> So my position is this: Given that I live in the United States in 2016
The United States does not hold a privileged position in the world, and it does not matter whether it is 2016, 1938 or 1912. Everybody living in the past in places where these experiments went wrong could have written "given that I live in X in Y" and they'd have been accurate about that.
> I'm not worried about the government randomly deciding to screw with me by looking at my electronic communications
They might have substituted 'electronic' with 'written'.
> and acting on them. It just doesn't make sense. I'm not significant relative to the scale of the US government, the government itself just doesn't work that way and all of the negative scenarios I've heard seem to be very contrived.
They again would not have used US government but whatever place they lived in. And they would have been dead wrong, and in some cases, when the fog lifted they'd have simply been dead.
What seems contrived to you, living in a country that has never seen actual war on its own soil (sorry, your civil war does not count), that exports war on an ongoing basis, that uses IT to kill people by remote control, that used telephone taps, burglary and threats to affect the inner workings of its own government, to me seems to be willful blindness.
For some reason it is more convenient for you to re-write all of history up to and including the present than to see that maybe your government is not all that benign, neither on the world stage (where they are a bit more overt about their intent) nor internally (where they are out of necessity a lot more cautious). Have the Snowden revelations really not managed to at least peg your evidence meter that maybe not all is as it should be? That your constitutional rights were trampled and that the protections afforded you appeared to be of no value whatsoever?
> If you really think that it's possible that the government of a modern western nation could turn into communist East Germany, then it seems like your problem might be with governance, not privacy.
No, I think that we may be reaching a stage where influence can be wielded subtly enough that someone like you could convince themselves that there is none of it at all. And that's the true prize, to wield that power but in such a way that it can be applied selectively enough that as long as the bread is on the table and the games keep going nobody will notice how rotten the core has become.
> If it's possible for the government to go all Walter White and just turn evil overnight, then no amount of personal privacy is going to save any of us.
It will never be that overt. It will be more along the lines of parallel construction and other nice little legal tricks such as selective enforcement. Never enough for you to cross that threshold.
> And until it seems like that's a thing that's actually possible, I'm going to make practical decisions about my own privacy.
You're more than free to do that. Unfortunately, those of us living outside of your beautiful country don't even get to have a vote in there. Your personal well-being trumps the rights of everybody that is not you, and like that we race ahead down the hole.
It looks to me like the US agencies, and the Five Eyes in general, are capable people who are just doing their jobs. They aren't bothering me and I'm not bothering them. The past actions of the US government or hypothetical scenarios based on historical examples just aren't very convincing. Anything could happen. But I'm not going to concern myself with it until I see some evidence.
I'm not making that mistake. I fully realize that the US government is comprised of many arms, and that even though some of those arms might have our collective best interests at heart, this may not be the case for all of them.
> We're talking about hundreds or thousands of individuals spanning multiple generations.
So what. That only increases the chances of abuse, it does not diminish them at all. Just like in Nazi Germany there were plenty of people still fighting the good fight and at the same time employed by government. No government will ever be 100% rotten. But it does not have to be like that to do damage.
> I'm not going to worry about government metadata collection because of something that happened during the Eisenhower administration.
Because, let me guess that was too long ago and now it's different?
> Each person and group of people should be evaluated based on their own behavior and merits, not the reputation of the organization that they are affiliated with.
This is where you're flat-out wrong. Governments (and big corporations) have a life-span much longer than that of the individuals that are making it up, and as such we should look at them as entities rather than as collections of individuals.
If you were right then North Korea would not exist today as we know it (and neither would China, Iran and a bunch of other countries). The way these things work is that the general course will be slightly affected by the individuals, but the momentum in the whole machinery is enormous. Think of it as a cable in which individual strands are replaced but the identity and purpose of the cable remains. Eventually you have a completely new cable and yet nothing has changed. And in this case the entity has a huge influence on which parts of it will be replaced by whom.
> It looks to me like the US agencies, and the Five Eyes in general, are capable people who are just doing their jobs.
That's a very very scary thing to say. "Just doing my job" has been used time and again historically to distance oneself from the responsibility taken when performing certain actions. Just doing your job is not the standard that needs to be met.
> They aren't bothering me and I'm not bothering them.
And most likely they never will.
"The Only Thing Necessary for the Triumph of Evil is that Good Men Do Nothing"
> The past actions of the US government or hypothetical scenarios based on historical examples just aren't very convincing.
Of course they aren't. After all, it's not you that is personally inconvenienced in any way.
> Anything could happen. But I'm not going to concern myself with it until I see some evidence.
And none that will convince you will ever come. Because if it did it would be too late for you to change your stance anyway.
Very hard to suggest they aren't supporting the police state.
It's unquestionable that the tech sector is directly culpable for supporting the cops and the politicians to spy on us... to affirm otherwise is counterfactual. The moral high ground belongs to the people who don't collaborate with those who would rather have us dumb and controlled.
It's pretty hard to respect the pro-surveillance view because it seems flatly head-in-sand ignorant of reality time and time again. We have evidence of surveillance state wrongdoing in hand, and no successes to point to while simultaneously experiencing multiple terror attacks, and yet the pro-surveillance types are steadfast in their position, as though it's a religion.
The Snowden files showed us explicitly that disrupting political groups is actually done via GCHQ! This is very far from protecting the citizens, and is instead stifling them purposefully.
https://en.wikipedia.org/wiki/Joint_Threat_Research_Intellig...
I am actively discounting the opinion of people who do not understand this realized, currently unfolding threat to our democracy. An informed opinion doesn't sound like one passed from the government through the media.
They don't just monitor criminals-- that's why the anti-surveillance folks are anti surveillance! They monitor everyone, and create criminals as needed, and nobody can question them for fear of ending up on the chopping block.
They are currently very far out of hand, and "we the people" are doing somewhere between jack and shit because of how little the people understand the problem.
Why must we accommodate their subservience? Following orders is no excuse.
It's also quite foolish to try to evaluate people in a vacuum... would you extend the same privilege to a member of a criminal gang or jihadi group? No.
https://en.wikipedia.org/wiki/Joint_Threat_Research_Intellig...
https://theintercept.com/2014/02/24/jtrig-manipulation/
"Campaigns operated by JTRIG have broadly fallen into two categories; cyber attacks and propaganda efforts. The propaganda efforts (named "Online Covert Action"[4]) utilize "mass messaging" and the “pushing [of] stories” via the medium of Twitter, Flickr, Facebook and YouTube.[2] Online “false flag” operations are also used by JTRIG against targets.[2] JTRIG have also changed photographs on social media sites, as well as emailing and texting work colleagues and neighbours with "unsavory information" about the targeted individual.[2]"
https://en.wikipedia.org/wiki/COINTELPRO
https://en.wikipedia.org/wiki/SEXINT
https://en.wikipedia.org/wiki/Optic_Nerve_%28GCHQ%29
https://en.wikipedia.org/wiki/PRISM_%28surveillance_program%...
There's your evidence-- it's been here all along. These programs are targeted at US citizens, some with the explicit aim of discrediting them, blackmailing them, or propagandizing them. These are not the actions of a friendly nanny state but rather a malevolent surveillance state.
No, we don't.
We have probably a few hundred doing hard-core surveillance. We have another few thousand functioning as enablers by making social media and ad networks really attractive. We have a whole lot of non-engineers insisting on placing ads and tracking on their websites.
And then there's the mass bulk of software engineers that have nothing to do with it, and nothing they do will stop it.
50% of doctors decide to stop doing something, and it gets noticed. 99% of software engineers decide to take enormously strong stands against surveillance, even at great personal cost, and surveillance continues on as if nothing happened, except maybe those who work on it get paid a bit more to make up for decreased supply.
It may, in that weird 20th/21st century fashionable-self-loathing way, feel really good to blame the group you're a part of, but basically what you're proposing won't do anything at all. You're imputing to "software engineers" in general abilities they don't collectively have. You've got to attack it at the demand level, you will never be able to control the supply. This also matters because if you waste your energy with that approach, you might decide you've done something about the problem and stop trying when in fact you've done nothing.
You know, this is not coming from software developers. There's a group of people out there whose living is made by manipulating the public perception and speech. This group is not the software developers.
[1] https://www.google.ie/search?q=sal+colusi&oq=sal+colusi&aqs=...
In the modern era it is information asymmetry that we should worry about. How to prevent such a thing pragmatically is tricky.
Yes, you would be unhappy. But this is not about whether you are unhappy, but whether you should have control over military, police, our tax money, and thus everyone's lives.
It simply is a very well established fact that concentrations of power are extremely dangerous, and that they are extremely hard to break up once you recognize they are heading in the wrong direction. Just look at what the problem is in countries where people are doing badly, both historically and right now, and why things are so extremely hard to improve once they have gone bad. Which is why we have built structures that try to prevent such concentrations of power from forming. That is essentially the whole point of democracy and the separation of powers: To build distrust into the system. Dictatorships are the opposite of that (only one power, and no mechanism to remove the person in office). Yes, democratically elected officials certainly are unhappy when they are voted out - but that is the price we pay to prevent concentrations of power from forming.
And surveillance is undermining democratic decisionmaking. Having a democracy now does not guarantee you a democracy tomorrow if you aren't careful in who and what you vote for.
> If they get out of hand then we, the people, will deal with it.
Yes, "we" will. If history can teach us something, we can expect that it will take about a decade at least, with many unhappy lives, maybe millions of deaths, until foreign military gets into it to "deal with it".
Sure, maybe that won't happen. But given the prospects, wouldn't it be wise to use our experience from history, to try and make predictions where things will lead, and to then try and prevent things from happening in the first place?
You are aware, for example, that Hitler was democratically elected into office, and all his powers were given to him democratically? And you are aware what it took to remove him from office afterwards?
This only works in the US and even there I have no illusions at all about the ability of a present day militia being able to fight off a trained army, it's a pacifier for overgrown toddlers. The only people that have to fear from citizens with guns are other citizens (with or without guns), the military would have absolutely no problem whatsoever dispatching those if it was decided that their lives and the resulting PR fall-out are less important than whatever objectives they were given.
> In the modern era it is information asymmetry that we should worry about.
Note that there are always provisions in the law to protect the lawmakers from having the laws applied to them.
> How to prevent such a thing pragmatically is tricky.
I think it can't be done unless you simply outlaw it wholesale and are prepared to follow up on it. And from a practical point of view this is now a rear-guard action: fall back bit by bit and try to push back the point in time where we will have to conclude the battle was lost. This is not a problem that will simply go away, it has already gone way too far for that.
So, you are drinking battery acid until you see evidence that it's not good for you?
Or do you maybe take the evidence of other people's experience into account?
If so, how about you take into account the evidence of hundreds of societies that have dealt with massive surveillance (where "massive" still was "almost none" in comparison to today's and tomorrow's technical possibilities) and with oppression (those two empirically tend to go hand in hand).
If those are your sincere beliefs, I really would recommend you pick up a few books about recent German history. How Hitler came to power, how the state functioned once he was in power, how people tried to get rid of him but failed, and what it took to finally remove him. And then continue with the history of the GDR, how surveillance by the Stasi influenced everyday life, how people tried to reform the political system but failed, and what it took to finally reunite Germany.
The history of other countries might teach you similar things, but Germany is a good example because it is culturally a rather "western country", so it's easier to recognize similarities.
I feel like this is too wide. Everyone collects data. I don't mean all tech companies collect data, I mean, for example, your friends have copies of the emails you've sent them. They have photos with you in them of places you've been with timestamps and GPS coordinates. Your coworkers have access to your calendar. Your mechanic has the service history on your car. Your librarian knows which books you have checked out.
These aren't problematic situations, because they each only have a little piece of your data, and you trust each of those people with that little piece, and if you don't then you don't have to give it to them.
The problem is when you don't have that choice. Which is what happens when you're dealing with a government or a monopoly (or some other concentrated market where you can't trust any of the players). You can't reasonably choose to not have your location collected by your mobile carrier, or the traffic cameras in front of your home. If all your friends use Facebook, then you use Facebook.
But we don't really want to regulate Facebook. I mean holy cow, what is that even supposed to look like?
I think we can separate the problem into two pieces. The first is collection by, let's call it, unavoidable monopolies. Telecommunications carriers and other utility companies. This is where we know exactly what to do, because these entities should not be collecting any information about people at all. There is no reason Verizon needs to know anything about you other than whether you've paid your bill. So regulation here can be useful, e.g. make it unlawful for carriers to triangulate a cellphone's location without a warrant, or collect anything whatsoever about the contents of IP packets. But we also have a strong technical solution here. Encrypt all the things. Fully deprecate HTTP in favor of HTTPS. We need to build, for example, DNS query privacy. Things like that.
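As one small, hedged example of what "DNS query privacy" can look like in practice, here is a sketch of a DNS-over-HTTPS lookup, assuming the public Cloudflare resolver's JSON endpoint; the point is only that the question and answer travel over an encrypted channel instead of plaintext UDP that a carrier can log:

```python
# Sketch of a DNS-over-HTTPS lookup (assumes the public resolver endpoint at
# cloudflare-dns.com/dns-query and its JSON response format). Unlike a plain
# UDP/53 query, the lookup is hidden from on-path observers.
import json
import urllib.request

def resolve_over_https(name: str, record_type: str = "A") -> list:
    url = f"https://cloudflare-dns.com/dns-query?name={name}&type={record_type}"
    req = urllib.request.Request(url, headers={"accept": "application/dns-json"})
    with urllib.request.urlopen(req) as resp:
        answers = json.loads(resp.read()).get("Answer", [])
    return [record["data"] for record in answers]

if __name__ == "__main__":
    print(resolve_over_https("example.com"))  # prints the resolved addresses
```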
The other part of the problem is what you might call avoidable monopolies. There is no fundamental reason why Facebook has to be as centralized as it is. You have a phone which has all your photos on it and is connected to the internet 24/7. Why is there a copy of your photos on Facebook's servers? If one of your friends wants to see one of your photos, why are they not getting it directly from you? Then you don't have to trust Facebook with a copy of it. So the solution for this half of the problem is, disintermediate the avoidable monopolies.
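As a toy sketch of that disintermediation idea (purely illustrative: no TLS, no authentication, no NAT traversal, and the directory path is made up), a device could serve its own photo folder directly so friends fetch pictures from you rather than from a central copy:

```python
# Toy sketch: serve your own photos from your own device instead of uploading
# them to a central service. A real version would need TLS, access control and
# a way to reach the device from outside; this only shows the shape of the idea.
import http.server
import socketserver

PHOTO_DIR = "/home/me/photos"   # hypothetical local photo folder
PORT = 8080

class PhotoHandler(http.server.SimpleHTTPRequestHandler):
    def __init__(self, *args, **kwargs):
        # Serve files out of the photo directory only.
        super().__init__(*args, directory=PHOTO_DIR, **kwargs)

if __name__ == "__main__":
    with socketserver.TCPServer(("", PORT), PhotoHandler) as httpd:
        # A friend with the link fetches the photo straight from this device;
        # no third party ever needs to hold a copy.
        httpd.serve_forever()
```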
The U.S. government has several orders of magnitude more information about the private lives and communications and beliefs and activities of its citizens than East Germany ever had. This is also incontrovertible and undeniable.
How can either of you talk about abuses that happened in the past as if those were the only abuses? Why would you need to?
Re: regulation of software engineers - it's impossible. For any piece of software, its PURPOSE and AUTHORS are subjective interpretations. It is much, much harder to get a common consensus on whether the software is surveillance, malware, etc. So any regulation would do nothing but add to the already-so-complex-and-huge set of laws.
Yes, it is. But these are not the people that the argument is about and that's precisely the problem here. They don't feel that it concerns them at all, it is always others who need to worry about what is done with that data, they have nothing to hide and absolutely nothing to fear.
> Literally billions of dollars will be spent on that purpose this year alone.
Tens to hundreds of billions of dollars.
> There are also the government agencies of a dozen or two other countries which the U.S. government agencies work with and share data with to a greater or lesser extent.
Yes.
> Literally thousands of newspaper articles have been written about this.
Indeed. But since this has not yet resulted in mass arrests on US soil this evidence amounts to nothing in the eyes of those that see it as a 'good thing', these people are keeping us all safe and are merely doing their jobs. Incredible to you, to me and lots of others but still that's a position that quite a few people hold and not much that you will say or do will persuade them from that point of view.
So, I don't need to use the past as a reference. But it is strange to see a person refuse to learn from history and thereby lose the ability to apply its lessons to today's environment. I'm working on a second part of that blog post about 'if you've got nothing to hide' that concentrates on the present (I think the past has been dealt with), but I still feel that those are such enormously important reminders that they serve as a good backgrounder for why all this stuff matters.
So this is a simple choice grounded in the 'those that refuse to learn from history are bound to repeat it' line.
> The U.S. government has several orders of magnitude more information about the private lives and communications and beliefs and activities of its citizens than East Germany ever had.
This is true. But mere possession is not enough to sway a die-hard denier of danger and supporter of the surveillance state. All that data, by their reckoning, is in good hands; it is there merely to protect them from unseen dangers.
Obviously I disagree strongly with that position, but that's probably because (1) I've lived for a bit in a country that was a police state by most definitions and (2) I've seen how the various layers of that society would deal with this: the majority were just like karmacondon here, only a very small minority dared to take a stance, and the rest saw the whole thing as essentially beneficial, which retrospectively may seem very hard to understand. In fact, even today there are still those who yearn for the communist days, when life was orderly, everybody had a job and everybody had a pension waiting for them at the end of the line.
Well, then the logical thing would be not to give anyone any power, ever.
My point is, if you take general principles and blindly apply them with "no analysis involved", you're likely to end up in a pretty ridiculous state.
Don't you see any logical problems with this line of reasoning?
That's one of my favorite author quotes. The greatest evil in this world is done by those who can see their own work and tell themselves that it is good.
Just like any other tool such insights can be (and are) abused but it need not be like that.
The conclusion to reach is not to give anyone any power ever, clearly that's not feasible. The conclusion you're supposed to reach is that you can give power to people but you'll need oversight in place. Effectively you'll end up with checks and balances, pretty much the way most governments are set up.
And what history tells us - again - is that this isn't always sufficient to prevent abuses and our newspapers and other media seem to tell us that our current set of checks and balances have outlived their usefulness in the information age.
This flows from 'power corrupts' because it appears that those placed in power have - surprise - again abused their privileges.
Think of it as a warning beamed down from historical times to our present day that does not need more embellishment and is all the more powerful for its brevity: it is something so inherent in human nature that we need to be vigilant about it at all times, no matter whom we end up placing trust in.
I'm worried about both, and I can't say which I'm worried about more. What are the reasons you are concerned about one more than the other?
There is only one positive outcome of concentrations of power, and that is efficiency in execution. Which is extremely scary when combined with huge power.
This is really just the democracy discussion with different terms. It is well known that dictatorships are much more efficient at executing their plans. The inefficiency we voluntarily introduce when establishing and maintaining a democracy (and if you have ever been involved in democratic decisionmaking, the inefficiency can be really frustrating) is the price we pay to insure us against the efficient abuse of power as we have witnessed it countless times in human history.
I work in the greater D.C. area. Within a 150-200 mile radius, there are literally tens of thousands of developers working directly on surveillance. Probably even more. How do I know this? From random sampling. Go to any tech event, talk to any program manager at any government contractor. The work and money are in surveillance.
And, that's just government surveillance. All that tech is then spilling over into corporate surveillance. Location and behavioral tracking is big money. How do I know this? Because, sadly, that's how I have to make my money. The problem is that there's always another grunt like me willing to create the systems that enable this.
The solution: Use all of this surveillance tech and data to expose all of the VIPs. Publicly post where they are and where they've been, who they've been with, what they read, and what they buy. You do this and laws will be created pretty quickly.
So far, it seems pretty much like your belief that surveillance is not a problem for you is unfalsifiable, that you will believe that it is a problem for you only when the secret police is actually coming for you or maybe your family.
One thing which became apparent to me when I began to focus on this issue was the fact that there are countless services which provide services to other services, all of which have some degree of access to upstream customer data. For example, if you send logs to a hosted logging service, some of your customers' data is sent to them. If that service uses AWS, then data is sent to Amazon. And so on.
http://www.stackgeek.com/blog/kordless/post/a-code-of-trust
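One practical habit that follows from that observation: scrub obvious customer identifiers before log lines ever leave your process, since anything you ship to a hosted logging service is also, transitively, shipped to its own providers. A hypothetical sketch (the regexes are examples, not a complete PII filter):

```python
# Hypothetical sketch: redact obvious customer identifiers before log lines
# are handed to any third-party logging pipeline. The patterns are examples,
# not an exhaustive PII filter.
import logging
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

class RedactingFilter(logging.Filter):
    def filter(self, record: logging.LogRecord) -> bool:
        msg = record.getMessage()
        msg = EMAIL.sub("[email]", msg)
        msg = IPV4.sub("[ip]", msg)
        record.msg, record.args = msg, ()  # replace the formatted message
        return True

logger = logging.getLogger("app")
handler = logging.StreamHandler()  # stand-in for a hosted-logging handler
handler.addFilter(RedactingFilter())
logger.addHandler(handler)

logger.warning("login failed for alice@example.com from 203.0.113.7")
# emitted as: login failed for [email] from [ip]
```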
Arguing that efforts to make things better are pointless is a very dangerous thing to do, assuming we actually want things to be better. Cognitive dissonance is a powerful force, especially when there are startups to be built!
Everyone knows how the invasion of Iraq was a complete mistake. Has someone gone to jail?
The public is not going to shutdown anything they are wholly complicit in and benefit from. Which is why empires eventually fall.
This has happened in the past and the reaction from the individual people has been to 180 completely on their opinion of surveillance (there was a recent post with sources, but I don't have it handy). This could work.
For instance, I find a user control that prevents the user from changing focus whenever the input is invalid to be unethical, or at least severely impolite. It's the equivalent of grabbing someone's face while you're talking to them. Me: "The control you propose is hostile to the user." Customer: "Do it the way we want, or your company loses the contract."
As it turns out, the customer would love to grab someone's face, not just while they talk, but also as they yell, with a light rain of spittle falling gently onto the target's visage. That's because they assume everyone is a complete idiot, whose only salvation is absolute obedience to those officially certified as more capable. They fervently believe that you can order someone to not make mistakes. So it should be no surprise that my ethical objection was meaningless to them.
The people paying for software and hardware enabling Panopticon-style universal surveillance have a completely alien system of ethics, and more than enough money to ignore your personal morality. There will always be someone around in desperate enough financial straits that they will quash their own opinions and take the paycheck.
A cartel enforcer for software workers is the only way to significantly slow down technologies (you can't actually stop progress) that the majority of those workers find to be unethical. That enforcer has to be able to tell its members that they cannot do such work, no matter how well it pays, because otherwise, the buyers, for whom budget size is no obstacle, simply pay the higher price to those who need cash now more than self-respect later.
As long as there are mouths to feed and rent to be paid, the guy with deep pockets will be able to pay another to do his dirty work.
It isn't the ethical training that makes the difference in medicine, but the ethical enforcement. Doctors and lawyers can be decertified by their peers and elders, such that they cannot be rehired as a member of that profession. That means that an employer cannot demand unethical behavior, unless it is willing to compensate to the tune of all the money those people could theoretically make over all the remaining years of their careers.
I would hope that enough software workers could agree that it is unethical to casually collect and retain information from anyone without their fully informed consent, which is diligently confirmed, and revocable on demand. I further hope that we could agree that it is unethical to gather information to support any criminal investigation without reasonable suspicion that the target has actually committed a crime. Those people who believe that adding more hay to the stack makes the needles easier to find can form their own cartel.
I happen to believe that ethically-limited surveillance is more efficient and effective than the heavy-handed dragnet approach. I also think it is unethical to use an O(N^3) brute-force algorithm when an O(N log N) alternative is available. But most customers only care whether something works, and is delivered on time and under budget. They won't ever care about our opinions regarding quality, ethics, or best practices until after we are capable of making them pay dearly for not caring.
Yup. And that's the magic of digital content, software being a kind of it: it's infinitely copyable. It takes one guy to write a surveillance package and open-source it, or have their company sell it, and everyone can now use it.
It's not engineers who make the decision to use surveillance technology. Hell, for most of the work a software engineer does, most of the data coming from surveillance tech doesn't even matter.
Sure, it turns out using centralized web services has helped the government with things such as PRISM, but that doesn't mean we should blame people for those development practices rather than the government.
Prior to PRISM, pretty much any reasonable person would assume that the blobs you store in S3 aren't going to be looked at by anyone or, worst case, that metadata will be seen by AWS employees for debugging.
What we have done is make things a ton better for developers; we can build things more quickly and more easily, which empowers society/humanity. The fact that this has incidentally contributed to a surveillance society, with no intent on the developers' part and in a way you wouldn't reasonably expect, does not make the developers culpable.
It is true that customer data is trusted with a lot of services-of-services nowadays, but do you want to go back to the stone age where the only people who can store anything must run their own hardware with their own databases and so on?
The right thing to do here is to call for better use of encryption where possible and, for surveillance issues, to rein in the unreasonable government programs that make this practice result in such problems.
There are lots of bad things that the government could do. But it just hasn't happened. They've had mass surveillance technology in place for over a decade now. The world hasn't fallen apart, Hitler hasn't risen from the dead and everything is pretty much the same as it was before.
I guess we can check back in another ten years to see if your apocalyptic visions have come to pass yet.
I'd like to think that'd be the case, but consider one of the more recent privacy intrusions, "The Fappening": very little came of it, despite the wealthy, high-profile individuals involved. I realize they weren't the politically connected, but they were certainly what society considers "VIPs".
> Why is there a copy of your photos on Facebook's servers? If one of your friends wants to see one of your photos, why are they not getting it directly from you? Then you don't have to trust Facebook with a copy of it. So the solution for this half of the problem is, disintermediate the avoidable monopolies.
It's because decentralization like that is stupidly, stupidly inefficient. Not to mention that the assumption that your phone is actually online 24/7 is unrealistic, and that's before we notice that we're not on IPv6 yet, or that people also use cameras, or that they change their phones, go out of service range, or simply want to free up space on the SD card for something else.
So the fundamental reasons are a) efficiency, and b) availability. That's not to say things couldn't be improved with respect to privacy. I don't know that much about crypto yet (that's about to change, for work-related reasons), but I vaguely recall there are encryption schemes that would let only you and your friends access the data stored on third-party servers, and that would make the data unreadable to said third party.
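For what it's worth, the basic building block is just client-side encryption: the blob that reaches the third party is already ciphertext, and only whoever holds the key (you, and the friends you share it with out-of-band) can read it. Here's a minimal sketch in Python, assuming the `cryptography` package is available; this only illustrates the idea, not the cleverer schemes alluded to above (which can do things like per-friend access without sharing a single key):

    # Minimal sketch: the storage provider only ever sees ciphertext.
    # Anyone you give `key` to (out-of-band) can decrypt; the provider cannot.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # share with friends, never with the provider
    box = Fernet(key)

    ciphertext = box.encrypt(b"photo bytes here")   # this is what gets uploaded
    # ... later, on a friend's device that also has `key`:
    plaintext = Fernet(key).decrypt(ciphertext)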
Define 'random'...
http://www.huffingtonpost.com/peter-van-buren/parallel-const...
http://www.reuters.com/article/us-dea-sod-idUSBRE97409R20130...
> Something like "Private Citizen X criticized the government and embarrassing information about his life was revealed as a consequence."
https://theintercept.com/2014/02/18/snowden-docs-reveal-cove...
You mean like that?
> There are lots of bad things that the government could do.
Does, not could do.
> But it just hasn't happened.
It happens, but it just does not manage to cross your threshold for worry because you personally are not inconvenienced.
> They've had mass surveillance technology in place for over a decade now.
For longer than that, and it has been abused for longer than that too.
> The world hasn't fallen apart
It will not 'fall apart' because of this. But it will change because of this, and not for the better.
> Hitler hasn't risen from the dead and everything is pretty much the same as it was before.
Yes, we still have willfully blind people that would require things to get so bad that they would no longer be able to avert their eyes before they would consider maybe things have gone too far. But by then they would have indeed gone too far.
> I guess we can check back in another ten years to see if your apocalyptic visions have come to pass yet.
It will never be a single moment in time; we will just keep creeping up to it, like the frog in the pot of water.
What fascinates me is that there are people that are obviously reasonably intelligent that manage to actually see the pot, the stove and all that it implies and they still tell other frogs to jump in, the water is fine.
I'm less pessimistic about that. I'm a big fan of gun control laws, but I also think the one positive thing to come out of the ongoing Middle East conflicts is that a determined militia can be genuinely problematic.
> Note that there are always provisions in the law to protect the lawmakers from having the laws applied to them.
To my original point about asymmetry, this is what we should be devoting our energy fighting.
> simply outlaw it wholesale
Outlaw what wholesale? I'm personally of the opinion that the long term end state will fall more on the side of honesty (combined with increased acceptance) than secrecy.
“How do you know the chosen ones? ‘No greater love hath a man than he lay down his life for his brother.’ Not for millions, not for glory, not for fame. For one person. In the dark, where no one will ever know, or see.”
— Babylon 5, Season 2, episode 21, Comes the Inquisitor, 1995
(Also: "random citizen" or "private citizen"? A citizen who criticizes the government is barely a "random citizen".)
Any kind of abuse of power. The penalties for that should be severe. It's one of the few cases where I think that the penal system should be used as a means of discouragement rather than as one of education and rehabilitation.
EDIT:
But then again, fascination with "the other guys" is also a thing. See: the intellectual world of the West being in love with Soviet Union well into the Cold War.
http://slatestarcodex.com/2015/08/11/book-review-chronicles-...
https://www.asc.upenn.edu/news-events/publications/tradeoff-...
... the survey reveals most Americans do not believe that ‘data for discounts’ is a square deal.
... Rather than feeling able to make choices, Americans believe it is futile to manage what companies can learn about them. The study reveals that more than half do not want to lose control over their information but also believe this loss of control has already happened.
Laws will be created pretty quickly, but only to protect VIPs.
I beg to differ. Where you concentrate power, you have to expect abuse.
> It is true that customer data is trusted with a lot of services-of-services nowadays, but do you want to go back to the stone age where the only people who can store anything must run their own hardware with their own databases and so on?
That's a false dichotomy. Yes, I want to go back to people running their own hardware with their own databases. And to have that work as easily as your favourite cloud service. There isn't anything inherent in running your own hardware that requires that to be a major burden.
They made lots of promises that they never delivered on (or even planned to deliver on).
One of the more interesting ones:
I have hardware in my pocket that is hundreds of times more powerful than the first web servers I ever worked on. There is no technological reason why that same hardware couldn't be used. I'd love to have a PAN based around my phone (which is way more local to me than much of the "hardware with their own databases" I've ever worked on). Federation to Facebook/Google/Instagram/whatever the next big thing is would be amazing. And the reason it hasn't happened, even though powerful hardware is everywhere, isn't a lack of technology.
Also, the very positive tone of the historical article was refreshing. I know it's pure propaganda, but still, we could use some positive articles in the news every once in a while.
"The Earth belongs in usufruct to the living; the dead have neither powers nor rights over it." --Thomas Jefferson, to James Madison, Sep 6, 1789
For all the influence exerted by people like Mahatma Gandhi and Martin Luther King, Jr., they effected change with their lives, and not their deaths. One should not choose to die for a cause, or against one. Rather, live for your own principles, and teach them to those others who wish to learn. Those who sacrifice themselves, expecting no reward, grow no greater in my eyes. They become memory, and immediately begin to fade, except to the extent that they are renewed by those who still live.
What manner of scoundrel would I be to suggest that another sacrifice for my benefit, so that I may treasure the memory of it? What sort of fool would assent? That is the mentality of the beehive, where the workers die to protect their queen. In a society of equals, for anyone to die unnecessarily is a tragedy. For someone to choose to die, it is a horror.
Attributing some nobility to self-sacrifice is an ethic for hierarchies, meant to convince the lesser people, against their own interests, to hurtle headlong into situations where they may be killed. It makes pawns of people who might otherwise be greater. It is not fitting to convince anyone that they are so unworthy that the best way they might serve others is by throwing themselves into fires that need never have been lit.
Theory is not really relevant when the practical reality is monstrous. The five eyes are not redeemable.
Disagree. If you're Netflix wanting to distribute Jessica Jones then you want something like a CDN (although in that context BitTorrent is also "something like a CDN").
But think about wanting to share photos with your friends. There are only thirty people who actually want to see the photos. Twenty-five of them live in the same city as you, which makes direct connections to you about as efficient as a local CDN node, and the other five live in four different cities, so in all but one case there is nothing to be gained from caching in any of those places, because there will only ever be one copy requested. In that last case the CDN would save just one long-distance copy, and that's assuming we can't make P2P software smart enough to have the second person in Timbuktu get the photos from the first person there.
> we're not on IPv6 yet
This one is probably the main reason why this hasn't actually happened yet, but it's not like we don't know what to do -- how about we get on IPv6 already?
> or that people also use cameras
You seem to be implying there is some reason why a photo taken with a camera couldn't still be distributed using a mobile device (or plug server or PC or whatever you like).
> or that they change their phones
And then they can copy the stuff from one to the other.
> Not to mention that the assumption that your phone is actually on-line 24/7 is unrealistic
Availability is a different tack. OK, your phone doesn't have twelve nines of uptime, but it probably is actually online upwards of 90% of the time. And we know how to build reliable systems out of mostly-reliable pieces.
We're assuming that there is a piece of software on your device which already knows who your friends are. So now it just needs a check box that says "cache things for my friends if they cache things for me" and now your friends can get your photos from your other friends (or from their own device) even when your device is occasionally incommunicado.
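To put rough numbers on "mostly-reliable pieces" (a back-of-envelope sketch, assuming each device is independently online about 90% of the time):

    # Probability that at least one of k friend caches is reachable,
    # assuming each is independently online with probability p = 0.9.
    p = 0.9
    for k in (1, 2, 3, 4):
        print(k, 1 - (1 - p) ** k)   # ~0.9, 0.99, 0.999, 0.9999

Three or four friends caching for each other already gets you well into "good enough for photo sharing" territory.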
> or simply want to free up space on SD card for something else.
I think there's a law of physics that says your photos, to exist, have to exist somewhere. I suppose "I would rather give my private data to Facebook than buy an SD card big enough to hold it" is the sort of thing you have to decide for yourself.
I would argue the most important changes they effected were for themselves. You don't risk your health by helping someone who gets attacked to earn their gratitude, but to be able to look in the mirror. That's the only thing that gives enough energy to sustain certain things for years and decades. Rosa Parks, for example, didn't plan to end segregation; she was sick of putting up with it. Nothing more, nothing less. How great other people are in your eyes does not matter for what value their own acts of moral hygiene have to them, and people don't need to "expect" a reward for such things because the deed itself IS the reward. They already have it. And since you brought up MLK:
I say to you this morning, that if you have never found something so dear and so precious to you that you aren't willing to die for it then you aren't fit to live.
[..]
You may be 38 years old, as I happen to be. And one day, some great opportunity stands before you and calls you to stand up for some great principle, some great issue, some great cause. And you refuse to do it because you are afraid... You refuse to do it because you want to live longer... You're afraid that you will lose your job, or you are afraid that you will be criticized or that you will lose your popularity, or you're afraid someone will stab you, or shoot at you or bomb your house; so you refuse to take the stand.
Well, you may go on and live until you are 90, but you're just as dead at 38 as you would be at 90. And the cessation of breathing in your life is but the belated announcement of an earlier death of the spirit.
http://www.youtube.com/watch?v=pOjpaIO2seY&t=18m26s
> In a society of equals, for anyone to die unnecessarily is a tragedy. For someone to choose to die, it is a horror.
Here's a secret: everybody dies, either way. The only choice you have is how you live. From John J. Chapman's commencement address to the graduating class of Hobart College, 1900:
If you wish to be useful, never take a course that will silence you. Refuse to learn anything that implies collusion, whether it be a clerkship or a curacy, a legal fee or a post in a university. Retain the power of speech no matter what other power you may lose. If you can take this course, and in so far as you take it, you will bless this country. In so far as you depart from this course, you become dampers, mutes, and hooded executioners.
> It is not fitting to convince anyone to believe they are so unworthy that the best way they might serve others is by throwing themselves into fires that need never have been lit.
People who are great don't need to be convinced of anything. People who aren't are impossible to convince. And it's not "fitting" to justify stoking fires because otherwise others would do it, either. Then let those others do it? And hey, for all you know, they all might be doing it because otherwise you would do it.
And who is actually sacrificing? People who aren't sacrificing their ideals and their morals, or people who sacrifice them for some food and a few decades more?
This statement is logically in conflict with itself. It argues that diminished trust levels for data are rationalized by the savings in time and cost of running the infrastructure for the application. The conflict comes when you start assuming the data has acceptable trust requirements for a given customer. The fact is, you can't speak for my trust levels, which is exactly what is being discussed in the link.
I get to say what trust levels I want for my software and data. Not being able to use the software because I can't trust it is an unacceptable proposition, so I challenge our abilities to build something better than what we have today, and do so without rationalizing why we aren't building it.
Sure there is. Climate control, redundancy/backups, and power consumption/reliability, to name a few, are all concerns that we get to delegate to "the cloud," that are 100% "inherent in running your own hardware."
I applaud your usability argument, but there are most certainly inherent burdens to running your own hardware that don't exist for cloud services.
A 10 watt server doesn't need climate control.
> redundancy
Is mostly a matter of software.
> backups
Is also mostly a matter of software. With some simple peering mechanism, you can store backups on your friends' servers (and they on yours). Though a standardized pure backup storage API for cloud storage of encrypted backups at one (or more) of a multitude of providers might be a useful option to have.
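As a sketch of what such a "pure backup storage" interface could look like (the names are made up, not any existing provider's API), the provider only ever handles opaque, client-side-encrypted blobs:

    # Hypothetical minimal backup-storage interface: the provider stores and
    # returns opaque encrypted blobs by name, and never sees plaintext.
    from typing import Iterable, Protocol

    class BackupStore(Protocol):
        def put(self, name: str, encrypted_blob: bytes) -> None: ...
        def get(self, name: str) -> bytes: ...
        def list(self) -> Iterable[str]: ...
        def delete(self, name: str) -> None: ...

A friend's server, a commercial provider, or a disk in your closet could all implement the same four calls interchangeably.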
> power consumption
Is a matter of plugging a plug into a socket in the wall.
> reliability
Is also mostly a matter of software.
Now, I am not saying that running your own datacenter is no work, but running a server or two for your personal needs or for the needs of a small company should be possible to make almost a no-brainer.
There is no technical reason why you shouldn't be able to buy a bunch of off-the-shelf mini-servers for a hundred bucks or so apiece, peer them by connecting them with an ethernet or USB cable or whatever is appropriate, connect them to the internet wherever you like, and have them automatically replicate their data among each other. They could allow easy installation of additional services via a web interface, with automatic software upgrades, and let you rebuild the state of a broken server by connecting a new one and clicking a few buttons. There are many ways to solve the details, but my point is: cloud providers also don't employ one admin per machine; they automate the management of their machines to make things efficient. There isn't really any reason why much of the same automation (which is mostly software, after all) shouldn't be usable on decentralized servers in people's homes.
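As a rough illustration of the "mostly software" point, the replication piece can start out as something as dumb as periodically pushing the data directory to each peer. A toy sketch follows; the peer names and paths are made up, and a real appliance would add key management, conflict handling, and a proper scheduler:

    # Toy replication loop: mirror this node's data directory to each peer over SSH.
    import subprocess, time

    PEERS = ["peer1.local", "peer2.local"]   # the other mini-servers
    DATA_DIR = "/srv/data/"                  # trailing slash: sync directory contents

    while True:
        for peer in PEERS:
            subprocess.run(
                ["rsync", "-az", "-e", "ssh", DATA_DIR, f"{peer}:{DATA_DIR}"],
                check=False,                 # a peer being offline is expected, not fatal
            )
        time.sleep(300)                      # try again in five minutes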
Good companies use information they collect to provide better services. Bad companies use it to rip people off. The problem of bad companies doing bad things is independent of companies having information about people.
If the government is not corrupt, I have optimism in being able to get through a personal attack.
It's so important to have activism and structures in place to protect whistle blowers and others not comfortable with our current direction to take this difficult path. Respect!
Just because I mentioned specific individuals does not mean that I agree with them. I only acknowledge that they produced an effect that propagated beyond their own deaths through the actions of the devotees they acquired while living. I might also have mentioned prophets of various religions, though I may not follow any of them.
Skilled as I am at seeing the fnords, in the MLK address you quoted, under the obvious text, lies this subtext: Is my cause not great enough that you might be willing to die for it? If you are not, and have no greater cause to hold your loyalty, then you are more a walking corpse than a living man, and unworthy of my regard. It is very similar to "Crouch down and lick the hands which feed you. May your chains set lightly upon you, and may posterity forget that ye were our countrymen." It is a recruiting speech. And every time a young black person gets "the talk", it is contradicted. According to MLK, every time black kids submerge their will in a police encounter, and come away from it alive, but humiliated, they will be dead inside until their bodies finally catch up. According to me, they will live long enough to either vote in comprehensive reform or to organize and rebel from a dearth of it.
Nonviolent resistance depends in whole upon the oppressors' general unwillingness to murder nonviolent protesters. Willingness to die only works insofar as the opposition is unwilling to kill. Gandhi's protests worked only because British forces in India were unwilling to massacre Indians wholesale. MLK's protests worked only because the segregationists were unwilling to kill in public, before the typewriters and cameras of nationally-published journalists.
If you are willing to die, and the other is willing to kill you, you would be prudent to arrange your affairs in advance, such that other people are positioned to impose meaningful consequences as a result. Otherwise, you are gifting your enemy with a tiny victory.
If you quit a job in the military-industrial complex for which you have some ethical concerns, such as one which enables dragnet surveillance, what is the meaningful consequence? Every failing of the project in recent months is scapegoated to you. The contractor hires a replacement butt-in-seat. The work goes on. Your sacrifice yields nothing. No one rises in gratitude to pay your bills. When you mention in job interviews that you left due to ethical conflicts with the former employer, you never seem to be a good "cultural fit".
Why then would anyone choose to do that?
I'll take the food and the decades. I won't go willingly to my grave, if doing so wouldn't be more meaningful than what I believe I could accomplish with the entire remainder of my natural life. Sometimes, you can't avoid it, but you should always try to not die as you work towards your goals. Don't fear death, but don't ask it out on romantic dates, either.
> According to MLK, every time black kids submerge their will in a police encounter, and come away from it alive, but humiliated, they will be dead inside until their bodies finally catch up. According to me, they will live long enough to either vote in comprehensive reform or to organize and rebel from a dearth of it.
Right, so when does the rebellion come? Why would you ever rebel when "someone will do it anyway", as if that were some law of nature? According to you, the hypothetical black kid should snitch on others when threatened with a beating or arrest, and why wouldn't they -- if they don't snitch, someone else will, and the only difference would be their own life being worse. Leaflet #3 of the White Rose comes to mind: "Do not hide your cowardice under the cloak of cleverness!" And I think we'll have to agree to disagree.
> If you quit a job in the military-industrial complex for which you have some ethical concerns, such as one which enables dragnet surveillance, what is the meaningful consequence?
I already said what it is for me, and in my opinion: personal moral hygiene. The consequence is that you are no longer part of that. That is plenty meaningful to me. As Frankenstein said in Death Race (paraphrasing), "You can't save the world; you can maybe save a part of it: yourself." Well, I don't remember the exact quote, but that's how I feel about it. I don't even believe in something like a soul, but still, I would say saving your soul, retaining what little remains of our innocence, is the best anyone can achieve.
And as many found out, death doesn't always immediately follow making a stand. George Carlin found himself entertaining people he didn't like, the establishment, with cute things, and he pivoted. He had a long career, had a family, was heard, never sold out, never compromised. Noam Chomsky also has plenty of haters, and I'm sure plenty who would love to see him hurt, but he is still rocking on.
> When you mention in job interviews that you left due to ethical conflicts with the former employer, you never seem to be a good "cultural fit".
Then either don't mention it, or don't interview for jobs with assholes. Get another job, and help take the assholes down. Do whatever you want, of course, but I don't see the dilemma here. It's not that black and white, i.e. either you go along or you're screwed. Actually, plenty of people get screwed even though they're very obedient and have no flavour and no stance of their own. And as Lily Tomlin said, "The trouble with the rat race is, even if you win, you're still a rat." And you know, I don't quote this to put anyone else down; it's how I feel inside. Man, it's not just a feeling, it's a pretty solid thing. I had a lot of shit broken for me for trying to do the right thing, and had a lot of frustration and sadness for not just "popping soma" and going along, for questioning things. Yet I would not do it differently, given the chance to do it again. I might be smarter or more patient about some things, but in general I feel I got way more out of it than I lost. It's not just what it does to how I feel inside, it's also what it does to my perception, which is muddled, but less muddled than it would otherwise be. I see and speak with people who made and are making different decisions every day, and I don't envy a single one of them.
> Sometimes, you can't avoid it, but you should always try to not die as you work towards your goals.
Nobody (or hardly anybody) just keels over dead and thinks that advances any cause or does any good. It's usually "doing something or saying something, and then not stopping even when others threaten you". You can hardly say "don't fear death" after arguing that it's fine to fear quitting a job over ethical concerns, which is so much less than death.
It's easy for us to sit at our desks, churn out our work, and be mad. And there are things to be mad about, for sure. The wanton disregard for civil liberty and protection is simply irredeemable. And to be sure, I've been a fan of your country's very public responses on personal privacy over the last few years. I hope the US legislature can learn a thing or two.
But it's not the "five eyes". It's the entire world. Any country with an interest in protecting its sovereignty has some form of information-gathering operation.
When that operation gets big and exposes itself, folks get upset because, yeah, being spied on isn't a comfortable thing. Do some countries go about gathering this information more morally than others? Something tells me we'd have to be in the secret inner sanctums of the biggest opponents to really know, and I think the answer would be "a spade is a spade."
Does it help our countries protect themselves? I honestly don't know.
But I do know that "grey hatting" in the general development community doesn't garner this sort of bile and venom. I don't know why being a grey hat for a government should be treated differently.
Thanks, it was a difficult decision and took me a while to come to, but I knew I couldn't continue working there in good conscience.