zlacker

1. dsacco+(OP)[view] [source] 2016-01-06 04:43:22
I think your opinion is valid and should be fairly represented, but consider that your reasons for not caring about privacy may be flawed or inconsistent.

Assuming that you don't care about privacy because you're apathetic, do you also not care about free speech because you don't say anything controversial? Do you not care about your right to assemble because you don't protest anything? As an extreme example upon which to build a baseline, would you mind if a neighbor had an unrestricted view of you lounging in your underwear, taking a shower, or having sex?

Why do you not care about privacy? Do you feel that you don't need it because you have nothing to hide, or are you willing to sacrifice it for some greater good (e.g., fighting terrorism)? Are you merely indifferent, or do you aggressively oppose the concept?

replies(1): >>blitzp+24
2. blitzp+24[view] [source] 2016-01-06 05:50:38
>>dsacco+(OP)
First of all thank you for respecting my opinion. I appreciate it.

1.) Free speech is a completely different topic. Snowden's quote on this page makes no sense to me no matter how often I re-read it. If free speech didn't exist, I wouldn't be able to express my opinion about privacy :)

2.) Privacy means hiding the truth. Hiding what really happened. Hiding who you really are. I believe it is a flaw of the human personality that makes us want to hide information and eventually lie about it.

I don't care if Google or the government knows that I'm searching "[insert embarrassing keywords for you here]" or if Facebook knows my location, or if Twitter knows what I like based on the people I follow.

Who is the government? It's people. People like you and me. If people decide to make assumptions based on data they collected and the assumptions aren't correct it's their own fault for assuming something in the first place (because...you know...it's an assumption...it can be wrong).

I am not aggressively opposing the concept of privacy. I respect other people's opinions.

replies(8): >>NhanH+d4 >>Swizec+q4 >>ericdy+05 >>spdust+75 >>mutati+s5 >>9fQmdW+H5 >>mirimi+p8 >>mixmas+h9
3. NhanH+d4[view] [source] [discussion] 2016-01-06 05:55:08
>>blitzp+24
> I believe it is a flaw of the human personality that makes us want to hide information and eventually lie about it.

At the most basic level, we want to hide things because other people do not like them (which leads to reactions ranging from shaming to prosecution and stoning). Fundamentally, the only way for that not to happen is to have a completely homogeneous society, or for all humans to turn into saints. I will just assert the former to be bad and the latter to be impossible.

replies(1): >>sdoeri+u8
4. Swizec+q4[view] [source] [discussion] 2016-01-06 05:58:51
>>blitzp+24
This is all true and I agree wholeheartedly.

And when The People incorrectly decide, based on data, that you raped a 15-year-old, you will be in prison for the duration of the trial, you will be on the sex offender list forever, and you will be inconvenienced by anything requiring a background check. You, not The People.

Ideologically, I agree, privacy is a lame side-effect of how groups of people work. Pragmatically, please don't take it away.

replies(2): >>blitzp+95 >>rhino3+7b
5. ericdy+05[view] [source] [discussion] 2016-01-06 06:08:09
>>blitzp+24
> Who is the government? It's people. People like you and me. If people decide to make assumptions based on data they collected and the assumptions aren't correct it's their own fault for assuming something in the first place (because...you know...it's an assumption...it can be wrong).

What if the assumptions they make raise the premium on your health insurance because someone sells your data? People (or, more likely, algorithms) making wrong assumptions, even if it is their own fault, can affect you negatively.

replies(1): >>blitzp+i6
6. spdust+75[view] [source] [discussion] 2016-01-06 06:10:11
>>blitzp+24
> I believe it is a flaw of the human personality that makes us want to hide information and eventually lie about it.

Who said anything about lying being a part of a desire for privacy?

> I don't care if Google or the government knows that I'm searching "[insert embarrassing keywords for you here]"

Would you care if a prospective insurer knows you're (hypothetically) searching for "atrial fibrillation management" or "opiate addiction"? Or a prospective employer who knows you're (hypothetically) searching for "corporate firewall security exploits"? Or a prospective romantic partner who knows you're (hypothetically) searching for "genital rash"? Any of those searches could be legitimately borne of pure, unadulterated curiosity, but taken out of that context by people with whom you're hoping to establish some kind of relationship, they could easily doom that relationship before it begins. Hell, those searches may not even be made by you but by someone in your household, but if decisions are made and opinions are formed based on that information, you've suffered an unnecessary loss.

> Who is the government? It's people. People like you and me.

Indeed, people like you and me, except those people have the authority and/or power to incarcerate you, or to impinge on your rights in other (less direct, more insidious) ways. Privacy isn't about hiding the truth from those who have a need to know it; it's about controlling the context of that truth, or at the very least, having a say in any response that comes from the truth being discovered.

replies(2): >>blitzp+X6 >>cthalu+ec
7. blitzp+95[view] [source] [discussion] 2016-01-06 06:11:04
>>Swizec+q4
I see your point. If society ends up believing that assumptions based on collected data have suddenly turned into "facts", then, let me say it frankly, we are done for.

I believe that when this happens, Hacker News won't exist anymore, because the intelligence of human beings will be comparable to that of a fly.

Luckily, this hasn't happened yet, because I can still have intellectual discussions, even on the internet.

I like your separation of "ideologically" and "pragmatically". I agree, it's not a pragmatic approach.

replies(1): >>Swizec+A6
8. mutati+s5[view] [source] [discussion] 2016-01-06 06:17:38
>>blitzp+24
I understand that you believe privacy is hiding the truth. It appears you believe the only reason someone would hide any information is that it allows one to lie; thus you conclude that since lying is bad, privacy is also bad because it promotes lying.

If that chain of reasoning is accurate, then let's do a thought experiment. What if you personally hold a belief that is contrary to public opinion, in fact, let's say it's a crime to hold it, but you still believe it? And for some reason you mention it to someone and are outed for holding that belief. Do you think that, even though you disagree with society at large, you should be punished for it? Who is correct in this scenario? You? The people?

Privacy isn't just about lies. It's about having the space to have thoughts and develop concepts that may not be ready for public consumption. It's about the freedom to think about concepts or beliefs without State retribution for not holding the party line. It's not about withholding truth. It's about being able to control the information that you personally generate without fear of judgement from external parties.
replies(2): >>ghaff+a6 >>karmac+th
9. 9fQmdW+H5[view] [source] [discussion] 2016-01-06 06:19:52
>>blitzp+24
Privacy is not deceit. Privacy is the right to be left alone.
10. ghaff+a6[view] [source] [discussion] 2016-01-06 06:29:42
>>mutati+s5
Dave Eggers' The Circle is perhaps worth reading in this context. I don't actually think it's a very good book; it's mostly tolerable if read in the vein of a deliberately exaggerated "if this goes on" cautionary tale. But there are a number of speeches by one of the characters (Bailey?) on why radical transparency is good.
11. blitzp+i6[view] [source] [discussion] 2016-01-06 06:32:09
>>ericdy+05
Wouldn't you agree that the real solution is to fix a) the algorithms and b) the assumptions, rather than c) hide the data?
replies(1): >>ericdy+M6
12. Swizec+A6[view] [source] [discussion] 2016-01-06 06:40:22
>>blitzp+95
It's not even about "facts". Suspicion is enough, because "innocent until proven guilty" is true in theory, but the period between "suspected" and "proven innocent" can be very ... inconvenient.

And that's IF an internet or real-life lynch mob doesn't decide to go after you. If it does, then being proven innocent is the least of your concerns.

replies(1): >>sdoeri+H8
13. ericdy+M6[view] [source] [discussion] 2016-01-06 06:44:52
>>blitzp+i6
I can't fix someone else's algorithms and assumptions, but I can hide my own data. Even if I agree with your premise, if someone else is in control of the "real solution," then it's not a real solution for me, is it?

There's also the chance that those algorithms and assumptions are "correct" from a business standpoint (it would cost more to "fix" them than the monetary benefit of fixing them) even if they're not correct for consumers, meaning nobody that's actually in control of them has any motivation to fix them.

14. blitzp+X6[view] [source] [discussion] 2016-01-06 06:47:51
>>spdust+75
Recognize the real source of the problem.

Like you said, someone trying to get information about the topics you mentioned could simply be doing so out of curiosity. Now person A from the government says you are X. However, you are not X; you are Y.

Think again: what is the actual problem? The actual problem is not the data, which is 100% correct.

The actual problem is people's prejudices and assumptions. This is what we need to fix. If someone searches for topic Z, we should think very carefully about the consequences of drawing an assumption from that.

However, this view is very ideological. Your view of the current state is more practical. I do not disagree with your statements; I simply hope that we can address the real issue here in the future, even if it takes us centuries.

replies(3): >>Cakez0+On >>sravfe+qo >>dayon+ZY1
15. mirimi+p8[view] [source] [discussion] 2016-01-06 07:19:30
>>blitzp+24
> I don't care if Google or the government knows that I'm searching "[insert embarrassing keywords for you here]" ...

If you were living in western China, you might care, because you might end up an involuntary organ donor. Or if you were searching for gay porn in Saudi Arabia.

16. sdoeri+u8[view] [source] [discussion] 2016-01-06 07:22:46
>>NhanH+d4
I also feel that some form of privacy is needed to develop new ideas. To mull them over, to test them out, before everybody with their own interpretations gets a say. How will we as a society develop further if everybody is more or less homogenized by ubiquitous surveillance and self-censorship (or punishment for transgressions)?

We need safe (and private) spaces. At least in my view of the world, where I am on your side: I don't believe all people will turn into saints. Not even most people.

So by killing privacy and upping surveillance of everybody, we as a society will shoot ourselves in the foot and kill new ideas before they are even thought, I fear.

17. sdoeri+H8[view] [source] [discussion] 2016-01-06 07:29:12
>>Swizec+A6
There are enough examples (at least here in Germany) where people's lives were uprooted and destroyed precisely because of false accusations or false interpretations of "facts". Even after they were acquitted, lots of people distrusted them, bullied them and such, because the press had already told everybody what awful people they were.

And hey, if it is in the news, it has to be true - doesn't it?

We will never fix these idiots (myself totally included). Even if we do not believe these things, we will have them forever in the back of our heads when presented with someone's name: "maybe they did do the thing nonetheless, even if the court acquitted them".

This is just human nature. You cannot actively un-know something you heard, and it will sadly inform your inherent biases nonetheless - even if you intellectually know it to be untrue.

18. mixmas+h9[view] [source] [discussion] 2016-01-06 07:41:55
>>blitzp+24
> I personally do not care about privacy.

> Free speech is a completely different topic.

((James Madison rolling over in grave))

Oh, but freedom is not a different topic. These two freedoms are enshrined in the US Constitution after centuries of experience in the Old World.

I imagine that you are too young, or have been too lucky so far, to have had information used against you or your family. Yet history shows that it happens again and again, and will again.

19. rhino3+7b[view] [source] [discussion] 2016-01-06 08:29:28
>>Swizec+q4
How is privacy really a good solution to the problem of mistaken convictions?

The lack of privacy may very well reduce the number of false convictions. Sure, your looking up pics of teen boys might look suspicious, but the lack of privacy might catch the real criminal too.

If we had accurate gps for all people all of the time, it would probably reduce false conviction rates.

Plus, the way the system works now is that once you are a suspect, you really don't have privacy anymore. That's how the Constitution works. Once there is probable cause, the state will rifle through your stuff, ask your friends and family, etc.

On the mistaken conviction issue, I'd probably rather live in a privacy free state than a state with privacy. Assuming I was innocent.

Though I prefer privacy for other reasons.

20. cthalu+ec[view] [source] [discussion] 2016-01-06 08:52:28
>>spdust+75
> Any of those searches could be legitimately borne of pure, unadulterated curiosity, but taken out of that context by people with whom you're hoping to establish some kind of relationship, they could easily doom that relationship before it begins. Hell, those searches may not even be made by you but by someone in your household, but if decisions are made and opinions are formed based on that information, you've suffered an unnecessary loss.

I think the negative effects there are largely due to how private we are. If we were constantly confronting these things that seem embarrassing or concerning, we'd come to realize how normal they are.

It would require a complete shift in how we view privacy, one so large I doubt it would ever happen, but I think those consequences are ultimately a symptom of the current system, where we often keep things private for the sake of societal or cultural norms, sometimes to our personal detriment.

I'm not particularly arguing that either way is inherently right or wrong - but I do think the consequences you speak of are only meaningful in a world where a large measure of privacy, at least between most people in their day to day interactions, exists.

21. karmac+th[view] [source] [discussion] 2016-01-06 10:26:46
>>mutati+s5
The underlying assumption to your hypothetical is that thought crimes exist. I would say that someone in that situation doesn't have a privacy problem, they have a governance problem. Either a dictator has seized power or their fellow citizens have voted to make it illegal to express certain ideas. And in either case, encryption isn't going to be much of a solution. It'll only delay the inevitable. Someone who talks about illegal ideas is taking a big risk anyway.

It seems common that arguments for privacy trumping other values depend on bad behavior by state actors, in which case reforming the state by whatever means necessary would probably do more good than advocating for philosophical concepts.

replies(1): >>mutati+5Q
22. Cakez0+On[view] [source] [discussion] 2016-01-06 12:20:07
>>blitzp+X6
You might be searching for those things out of curiosity, but if (in the insurance hypothetical) statistically more people searching for "opiate withdrawal" are addicted to opiates, then it's going to affect your health insurance premiums regardless of your intentions.

Or more generally, you can't choose how people interpret data they gather about you and that can adversely affect you.

23. sravfe+qo[view] [source] [discussion] 2016-01-06 12:31:37
>>blitzp+X6
> The actual problem is people's prejudices and assumptions. This is what we need to fix.

Right, so the whole premise of your indifference or opposition to the privacy argument is that people should not have prejudices or (wrong) assumptions. Isn't that too idealistic? Ridding people of their prejudices and figuring out the right moral standard for behaviour will take many more generations, if it happens at all. Till then, till we figure out the right _prejudices_, till all of humanity naturally rises to the right moral standard, shouldn't we be wary of those bad agents who can abuse others by breaking into their private matters?

Your premise, in short, assumes an ideal world where no one troubles others over their private acts, which unfortunately isn't the case yet.

24. mutati+5Q[view] [source] [discussion] 2016-01-06 17:22:45
>>karmac+th
Fair. I was just trying to take the problem to the hypothetical edge of having no privacy at all, to a case where you do not even have enough privacy to share a thought without fear of retribution. I was also trying to align the idea with their understanding of freedom of speech: they do agree freedom of speech is OK, so if you can tie speech to thought, and then to privacy, maybe there is a logical connection that allows them to understand the need for privacy as a type of freedom.

The situation is very complex. Because privacy has been implicit in our daily lives for so long, it's really difficult to map out the ways removing it would reduce our personal freedom. If we want to remove privacy, then we need to make it impossible for anyone to keep anything private from anyone else.

If privacy isn't important, then we should all live in proverbial glass houses where everyone can see everyone else's lives. Why should we trust only the government with that power? Why not everyone?

25. dayon+ZY1[view] [source] [discussion] 2016-01-07 05:51:09
>>blitzp+X6
I forget the term for this, but you've poisoned the discussion by leading it to a dead end with an impossible goal: ridding humanity of prejudice and assumption. Since that isn't remotely possible, we might as well throw our hands up in the air. And forget about the data; it's not at fault here.

Like you, I was never struck by Snowden's freedom of speech line... until I read this article. It suddenly hit me: the reason I was missing his point is that I was framing it in terms of what's in it for me rather than what's in it for us. Someone who doesn't care about freedom of speech doesn't care because he doesn't see what's in it for him. But I doubt you'd argue against the benefits of the First Amendment.

Similarly, privacy is very important. You might not care (even though you really do), but defending privacy is about ensuring security. Privacy is important for all of us, just like freedom of speech is.

As for what the actual problem is: for the most part, it is ignorance and a failure to remedy it. We need more privacy and cyber-security advocates who can educate people on why they ought to care. It's like teaching people why it's important to lock their doors at night, or why they should put their letters into envelopes instead of just using postcards. It's why my mom had to drill into my brain the importance of not giving out my Social Security number willy-nilly. Are you so liberal with your SSN? You don't care about privacy, so would it bother you if Facebook or Google asked for it? After all, they just want to make sure you are who you say you are.

Things aren't obvious to us until they're obvious, and then it feels like common sense. DUH, lock your door! DUH, encrypt your messages!
