zlacker

Why privacy is important, and having “nothing to hide” is irrelevant

submitted by syness+(OP) on 2016-01-06 01:48:14 | 697 points 312 comments
[view article] [source]

NOTE: showing posts with links only
◧◩◪
24. CaptSp+nf[view] [source] [discussion] 2016-01-06 05:26:50
>>grecy+zc
And, I think it's important to point out: This isn't just a scary story to make people side with you. This actually happened, albeit before the internet: https://en.wikipedia.org/wiki/Red_Scare
51. raminf+Ok[view] [source] 2016-01-06 07:20:20
>>syness+(OP)
The issue with the 'nothing to hide' argument is that it puts the burden of proof and determination of whether something is 'hide-worthy' on the target of the inquiry.

If you subscribe to the 'presumed innocent' premise of the law (https://en.wikipedia.org/wiki/Presumption_of_innocence) then the burden of proof is on the inquisitor.

Either you believe in presumed innocence or you don't. Pick one.

◧◩◪
58. vixen9+ll[view] [source] [discussion] 2016-01-06 07:37:21
>>enrage+Kf
Yes! Everybody should read https://www.schneier.com/blog/archives/2006/05/the_value_of_... in which Schneier quotes Cardinal Richelieu: "If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged."

As Schneier puts it, two proverbs sum it up: "Who watches the watchers?" and "Absolute power corrupts absolutely."

◧◩
68. laughi+Gm[view] [source] [discussion] 2016-01-06 08:07:49
>>tobbyb+Bl
Have you seen the Idle Words talk, "Haunted by Data"? http://idlewords.com/talks/haunted_by_data.htm

Pretty compelling talk, culminating in:

    I believe there should be a law that limits behavioral data
    collection to 90 days, not because I want to ruin Christmas
    for your children, but because I think it will give us all
    better data while clawing back some semblance of privacy.
◧◩◪◨
92. chongl+Eq[view] [source] [discussion] 2016-01-06 09:30:21
>>iheart+Cp
Very informative reply, thanks! How do we regulate data-collecting institutions internationally? Look at the EU's Data Protection Directive[0]. As extensive as it is, it's struggling in the wake of the failure of the Safe Harbour Decision[1].

[0] https://en.wikipedia.org/wiki/Data_Protection_Directive

[1] https://en.wikipedia.org/wiki/International_Safe_Harbor_Priv...

◧◩
109. ameliu+Yw[view] [source] [discussion] 2016-01-06 11:18:51
>>tobbyb+Bl
> We expect professionals to behave ethically. Doctors and companies working on genetics and cloning for instance are expected to behave ethically and have constraints placed on their work.

Yes, I believe we should have a "Hippocratic Oath" ([1]) for technology workers.

[1] https://en.wikipedia.org/wiki/Hippocratic_Oath

◧◩
111. jmarti+by[view] [source] [discussion] 2016-01-06 11:40:02
>>kmonad+uo
I'm reminded of an article from last year which is only tangentially related, but it has a perspective I think could be used to explain the severity to average Joes:

http://www.nytimes.com/2015/03/08/magazine/the-loser-edit-th...

112. mattlu+cy[view] [source] 2016-01-06 11:40:09
>>syness+(OP)
I forget why now, but I was reading this article[0] by Moxie Marlinspike, from 2013, just a few days ago.

Interesting how the same argument can take so long to take hold, and how long some truths need to be told before they gain the traction needed to make a change.

0: http://www.wired.com/2013/06/why-i-have-nothing-to-hide-is-t...

◧◩◪
113. jacque+ey[view] [source] [discussion] 2016-01-06 11:40:15
>>karmac+Is
> But don't discount people that legitimately disagree with you as being irresponsible.

Why not? You may disagree, but that doesn't mean you can't be flat-out wrong. Having an opinion does not automatically give that opinion equal weight when history has proven to us again and again that that particular opinion ends up making society either dangerous or at a minimum uncomfortable.

I'm sure there were border guards in former East Germany that were entirely convinced that their state was the greatest and that's why they had to keep people in at all costs, including shooting them if they persisted in believing otherwise and tried to simply leave. After all, that was best for them. But that particular opinion turned out to be very wrong in the long term.

People can rationalize the most absurd stuff to themselves and to others, especially when their paycheck depends on it, but that's not a requirement.

All those who try to pretend that some kind of 'reasonable disagreement' is possible about the erosion of privacy, and who directly and indirectly help to usher in the surveillance state, have quite possibly not thought as carefully, or considered these things with the degree of gravity required, as they claim they have. Having a mortgage to pay may factor in there somewhere too.

Usually this is a combination of being too young, too optimistic and in general living too sheltered a life to know what can happen to you, your family and your friends when the tide turns. And the tide always turns, nothing is forever.

> Simply, I and many others do not believe that any western government is going to use information gathered by tech companies to preempt threats to entrenched interests and the status quo.

I hope you're right but history is not on your side in this case.

> I've seen the same arguments made here for years, and none of it is convincing.

Yes, it isn't going to convince you any more than that border guard would be convinced that his job is a net negative to society. Every stream, no matter how reprehensible, will always have its fans and cheerleaders. And later on they will never remember that they had agency all along and were perfectly capable of making a different decision. Responsibility is weird that way.

> It's admirable that you are so certain in your beliefs.

It is not admirable that you are so certain in yours. May I suggest a couple of talks with some Holocaust survivors to get a better feel for what the true power of information can get you?

Or maybe the family members of some people who were killed while trying to flee the former Soviet Bloc?

Or maybe some first generation immigrants to the US or Canada or wherever you live to give you some eyewitness accounts of what it was like to live in those countries before the wall came down?

'It can't happen here' is an extremely naive point of view.

http://jacquesmattheij.com/if-you-have-nothing-to-hide

Agreed with your advocacy advice.

> The least you could do is to try to understand ours.

That's 'mine' not 'ours', you speak for yourself.

119. rplnt+Uz[view] [source] 2016-01-06 12:16:23
>>syness+(OP)
This is an interesting post on the topic from reddit:

https://www.reddit.com/r/changemyview/comments/1fv4r6/i_beli...

◧◩◪◨
131. puppet+WA[view] [source] [discussion] 2016-01-06 12:35:38
>>iheart+Cp
This web site is for you; it's insulting to programmers, calling them hacks.

( http://acm.org/about-acm/acm-code-of-ethics-and-professional... )

◧◩◪◨⬒⬓
142. snydly+nC[view] [source] [discussion] 2016-01-06 12:57:05
>>jacque+AA
> Potentially criminal groups: everybody.

While this may be true, certain crimes are seen as worse than others. And, as un-PC as it may sound, certain demographics are many times more likely to commit certain crimes.

Homicide: http://www.cdc.gov/mmwr/preview/mmwrhtml/mm6227a1.htm

Also, some government monitoring can be "for your own good":

Youth Risk Behavior Surveillance: http://www.cdc.gov/mmwr/preview/mmwrhtml/ss5704a1.htm

But, maybe the CDC is different than the NSA.

◧◩◪◨⬒
145. blub+yC[view] [source] [discussion] 2016-01-06 12:58:13
>>golerg+8z
Most people are not persons of interest and nothing particularly bad will happen to them if various entities have access to their private info. Still, they might have their identity stolen, get scammed (e.g. Dell) or pranked (e.g. swatting, disconnecting utilities) or have their house broken into if they have bad luck. They might pay a premium on insurance for having the wrong friends on FB or get fired for holding certain opinions. Might get mobbed by the internet, get harassed by salesmen or silly ads for herpes.

Let's assume for the sake of argument that the above events are unlikely, though. When a few actors have access to the information of tens of thousands to billions of people, this has an impact on a societal level. As jaquesm said, information is power, and when one has so much information and lots of money to boot, they can begin to covertly influence policy and behavior and harass and marginalize their opponents. And they can do that directly, or by using the information of a third party, like a doctor, lawyer, religious leader, or even someone insignificant who happens to be a relative, etc. Moreover, companies can be sold together with their databases, they can be forced to hand them over, or they can be hacked. A treasure trove of data held by an otherwise principled company might end up in the hands of an unsavory party.

Why is this a bad thing? History has shown again and again how such imbalances of power are abused. Here's a rather harmless example of data mining a mobile device + social network combined with social engineering to scam people out of money: http://toucharcade.com/2015/09/16/we-own-you-confessions-of-... If a game producer can do this, what are the pros doing?

◧◩◪
146. tombro+JC[view] [source] [discussion] 2016-01-06 13:00:53
>>karmac+at
I think your question was answered at a link provided elsewhere in this thread[0].

What you might or might not need to hide cannot be reliably determined in advance. It is not a constant, it is a variable and you don't get to pick which way it goes. Consider the plight of the gay Russian blogger using LiveJournal, which was later sold to a Russian company.

[0]https://news.ycombinator.com/item?id=10849248

◧◩◪
156. cryosh+ZE[view] [source] [discussion] 2016-01-06 13:35:42
>>karmac+Is
What about Palantir, then?

Very hard to suggest they aren't supporting the police state.

It's unquestionable that the tech sector is directly culpable for helping the cops and the politicians spy on us... to affirm otherwise is counterfactual. The moral high ground belongs to the people who don't collaborate with those who would rather have us dumb and controlled.

It's pretty hard to respect the pro-surveillance view because it seems flatly head-in-sand ignorant of reality time and time again. We have evidence of surveillance state wrongdoing in hand, and no successes to point to while simultaneously experiencing multiple terror attacks, and yet the pro-surveillance types are steadfast in their position, as though it's a religion.

The Snowden files showed us explicitly that disrupting political groups is actually done via GCHQ! This is very far from protecting the citizens, and is instead stifling them purposefully.

https://en.wikipedia.org/wiki/Joint_Threat_Research_Intellig...

I am actively discounting the opinions of people who do not understand this realized, currently unfolding threat to our democracy. An informed opinion doesn't sound like one passed via the government through the media.

157. blaze3+5F[view] [source] 2016-01-06 13:38:08
>>syness+(OP)
If you are interested in this topic, I highly recommend you read Tradition of Freedom by Bernanos (original title in French: La France contre les robots).

Written in 1944, it contains a specific passage where he argues against this "but I have nothing to hide!" argument (only criminals benefit from hiding, right?). He talks about how a simple citizen who never had trouble with the law should remain perfectly free to conceal his identity whenever he likes, for whatever reason, and laments how this very idea has already died.

The extract is available online [1] in French, the google translation [2] is not that good.

[1] http://www.books.fr/quand-bernanos-predisait-une-societe-sou... [2] https://translate.google.com/translate?hl=en&sl=auto&tl=en&u...

◧◩◪◨⬒⬓⬔
162. cryosh+WF[view] [source] [discussion] 2016-01-06 13:50:06
>>karmac+2E
Don't bring up the Eisenhower administration, it isn't relevant.

It's also quite foolish to try to evaluate people in a vacuum... would you extend the same privilege to a member of a criminal gang or jihadi group? No.

https://en.wikipedia.org/wiki/Joint_Threat_Research_Intellig...

https://theintercept.com/2014/02/24/jtrig-manipulation/

"Campaigns operated by JTRIG have broadly fallen into two categories; cyber attacks and propaganda efforts. The propaganda efforts (named "Online Covert Action"[4]) utilize "mass messaging" and the “pushing [of] stories” via the medium of Twitter, Flickr, Facebook and YouTube.[2] Online “false flag” operations are also used by JTRIG against targets.[2] JTRIG have also changed photographs on social media sites, as well as emailing and texting work colleagues and neighbours with "unsavory information" about the targeted individual.[2]"

https://en.wikipedia.org/wiki/COINTELPRO

https://en.wikipedia.org/wiki/SEXINT

https://en.wikipedia.org/wiki/Optic_Nerve_%28GCHQ%29

https://en.wikipedia.org/wiki/PRISM_%28surveillance_program%...

There's your evidence-- it's been here all along. These programs are targeted at US citizens, some with the explicit aim of discrediting them, blackmailing them, or propagandizing them. These are not the actions of a friendly nanny state but rather a malevolent surveillance state.

◧◩◪◨⬒⬓⬔
165. dkerst+OG[view] [source] [discussion] 2016-01-06 14:03:46
>>snydly+nC
Government monitoring "for your own good" is pretty scary. We already have situations where people are attacked by the government or its agents because they did something that only harms themselves; see [1] for example. Mass surveillance, if left unchecked, will eventually expand to serve whatever purposes the government wishes. Power is easy to grow incrementally (or, in the case of the NSA, to grow by simply ignoring the laws) and very difficult to shrink again. We shouldn't assume this won't be used against us sometime in the future, and who can truly say that they never did anything harmless-but-illegal (drugs? gambling? copyright infringement?)? As [1] shows, people have died for these "crimes".

[1] https://www.google.ie/search?q=sal+colusi&oq=sal+colusi&aqs=...

◧◩◪
205. kordle+MT[view] [source] [discussion] 2016-01-06 16:16:31
>>jerf+oG
Every time one of millions of developers commits code to a centralized service, they have a hand in exposing individuals' data to the surveillance society we live in today. Data can be exposed by being publicly available on the site itself, obfuscated through aggregation in the service's APIs, leaked through security holes in the implementation of said services, stolen by hackers looking for high-value targets with private data, or misappropriated by the company itself or its upstream providers. The idea that customers can somehow understand the scope of their exposure to privacy violations is laughable.

One thing that became apparent to me when I began to focus on this issue was the fact that there are countless services which provide services to other services, all of which have some degree of access to upstream customer data. For example, if you log to a hosted logging service, some of your customers' data is sent to them. If that service uses AWS, then the data is sent to Amazon. And so on.

http://www.stackgeek.com/blog/kordless/post/a-code-of-trust

Arguing that efforts to make things better are pointless is a very dangerous thing to do, assuming we actually want things to be better. Cognitive dissonance is a powerful force, especially when there are startups to be built!

216. CurtMo+FX[view] [source] 2016-01-06 16:48:55
>>syness+(OP)
Bravo!

I've been making the chilling effects argument for several years:

http://www.dbms2.com/2013/07/08/privacy-data-use-chilling-ef...

http://www.dbms2.com/2013/07/29/very-chilling-effects/

http://www.dbms2.com/2015/06/14/chilling-effects-revisited/

This article makes the argument with reasonable, appropriate breadth.

◧◩◪
217. CurtMo+jY[view] [source] [discussion] 2016-01-06 16:53:39
>>rhino3+Lm
That's a big part of why restrictions on governmental collection and retention of data will never suffice. (The other big part is terrorism fears. Whether or not you or I agree with the plan -- as a practical political matter, a lot of surveillance will be done to try to identify in-country baddies.)

http://www.dbms2.com/2010/07/04/fair-data-use/

◧◩◪◨⬒⬓⬔
221. jacque+h01[view] [source] [discussion] 2016-01-06 17:07:01
>>karmac+jW
> Something bad would have to happen to random citizens as the result of government surveillance.

Define 'random'...

http://www.huffingtonpost.com/peter-van-buren/parallel-const...

http://www.reuters.com/article/us-dea-sod-idUSBRE97409R20130...

> Something like "Private Citizen X criticized the government and embarrassing information about his life was revealed as a consequence."

https://theintercept.com/2014/02/18/snowden-docs-reveal-cove...

You mean like that?

> There are lots of bad things that the government could do.

Does, not could do.

> But it just hasn't happened.

It happens, but it just does not manage to cross your threshold for worry because you personally are not inconvenienced.

> They've had mass surveillance technology in place for over a decade now.

For longer than that, and it has been abused for longer than that too.

> The world hasn't fallen apart

It will not 'fall apart' because of this. But it will change because of this, and not for the better.

> Hitler hasn't risen from the dead and everything is pretty much the same as it was before.

Yes, we still have willfully blind people that would require things to get so bad that they would no longer be able to avert their eyes before they would consider maybe things have gone too far. But by then they would have indeed gone too far.

> I guess we can check back in another ten years to see if your apocalyptic visions have come to pass yet.

It will never be a moment in time, we will just simply keep on creeping up to it, just like the frog in the pot of water.

What fascinates me is that there are people that are obviously reasonably intelligent that manage to actually see the pot, the stove and all that it implies and they still tell other frogs to jump in, the water is fine.

◧◩◪◨⬒⬓⬔
235. TeMPOr+G71[view] [source] [discussion] 2016-01-06 17:58:27
>>nsns+TU
I wonder if it isn't because the darkest regimes, when they're just starting, show the most promise for progress and positive change. Didn't the Nazis offer the Germans their wealth and their honor back, at a time when they were most desperately in need of both?

EDIT:

But then again, fascination with "the other guys" is also a thing. See: the intellectual world of the West being in love with the Soviet Union well into the Cold War.

http://slatestarcodex.com/2015/08/11/book-review-chronicles-...

◧◩◪◨
240. pdkl95+3c1[view] [source] [discussion] 2016-01-06 18:27:45
>>chisha+8o
The Tradeoff Fallacy: How Marketers Are Misrepresenting American Consumers and Opening Them Up to Exploitation

https://www.asc.upenn.edu/news-events/publications/tradeoff-...

    ... the survey reveals most Americans do not believe that ‘data for discounts’
    is a square deal.

    ... Rather than feeling able to make choices, Americans believe it is futile to
    manage what companies can learn about them. The study reveals that more than half
    do not want to lose control over their information but also believe this loss of
    control has already happened.
◧◩◪◨⬒⬓⬔⧯▣▦
254. jacque+Ck1[view] [source] [discussion] 2016-01-06 19:29:34
>>TeMPOr+cf1
That was more of a temporary thing right up until the point that the Nazi party abolished large chunks of the civil rights granted by the German Constitution in 1933.

They made lots of promises that they never delivered on (or even planned to deliver on).

One of the more interesting ones:

http://www.bytwerk.com/gpa/vw.htm

271. secfir+iH1[view] [source] 2016-01-06 22:31:50
>>syness+(OP)
(Apologies for the blatant plug)

On this subject, if anyone is interested: we just launched a free, open source Android mobile app to help people manage the complex issues of digital and physical security. It's got simple lessons on everything from sending a secure email to dealing with a kidnapping.

https://play.google.com/store/apps/details?id=org.secfirst.u...

◧◩◪◨⬒⬓
272. Pavlov+eJ1[view] [source] [discussion] 2016-01-06 22:47:32
>>logfro+mt1
> For all the influence exerted by people like Mahatma Gandhi and Martin Luther King, Jr., they effected change with their lives, and not their deaths.

I would argue the most important changes they effected were for themselves. You don't risk your health by helping someone who gets attacked to earn their gratitude, but to be able to look in the mirror. That's the only thing that gives enough energy to sustain certain things for years and decades. And Rosa Parks, for example, didn't plan to end segregation; she was sick of putting up with it. Nothing more, nothing less. How great other people are in your eyes does not matter for what value their own acts of moral hygiene have to them, and people don't need to "expect" a reward for such things, because the deed itself IS the reward. They already have it. And since you brought up MLK:

I say to you this morning, that if you have never found something so dear and so precious to you that you aren't willing to die for it then you aren't fit to live.

[..]

You may be 38 years old, as I happen to be. And one day, some great opportunity stands before you and calls you to stand up for some great principle, some great issue, some great cause. And you refuse to do it because you are afraid... You refuse to do it because you want to live longer... You're afraid that you will lose your job, or you are afraid that you will be criticized or that you will lose your popularity, or you're afraid someone will stab you, or shoot at you or bomb your house; so you refuse to take the stand.

Well, you may go on and live until you are 90, but you're just as dead at 38 as you would be at 90. And the cessation of breathing in your life is but the belated announcement of an earlier death of the spirit.

http://www.youtube.com/watch?v=pOjpaIO2seY&t=18m26s

> In a society of equals, for anyone to die unnecessarily is a tragedy. For someone to choose to die, it is a horror.

Here's a secret: everybody dies, either way. The only choice you have is how you live. From John J. Chapman's commencement address to the graduating class of Hobart College, 1900:

If you wish to be useful, never take a course that will silence you. Refuse to learn anything that implies collusion, whether it be a clerkship or a curacy, a legal fee or a post in a university. Retain the power of speech no matter what other power you may lose. If you can take this course, and in so far as you take it, you will bless this country. In so far as you depart from this course, you become dampers, mutes, and hooded executioners.

> It is not fitting to convince anyone to believe they are so unworthy that the best way they might serve others is by throwing themselves into fires that need never have been lit.

People who are great don't need to be convinced of anything. People who aren't are impossible to convince. And it's not "fitting" to justify stoking fires because otherwise others would do it, either. Then let those others do it? And hey, for all you know, they all might be doing it because otherwise you would do it.

And who is actually sacrificing? People who aren't sacrificing their ideals and their morals, or people who sacrifice them for some food and a few decades more?

282. mirimi+f52[view] [source] 2016-01-07 03:53:35
>>syness+(OP)
OK, so now David Chaum is proposing PrivaTegrity.[0] It's "meant to be both more secure than existing online anonymity systems like Tor or I2P and also more efficient". But it includes a "carefully controlled backdoor that allows anyone doing something 'generally recognized as evil' to have their anonymity and privacy stripped altogether". Just exactly how the bloody hell can a backdoored design be styled as more secure than Tor and I2P?

There's no fool like an old fool, as they say. Sad :(

[0] http://www.wired.com/2016/01/david-chaum-father-of-online-an...

◧◩
312. joeyes+I29[view] [source] [discussion] 2016-01-11 19:26:25
>>kmonad+uo
I'm sure there are more direct examples, but I watched a video about Sesame Credit recently and it paints a rather dystopian picture of what's possible when you mix surveillance and gamification. https://www.youtube.com/watch?v=lHcTKWiZ8sI