zlacker

[parent] [thread] 64 comments
1. dr_dsh+(OP)[view] [source] 2023-11-02 12:22:21
Why should it be illegal? I don’t understand the moral threat. Personally I feel that privacy gets too much airtime as a value — I see lots of other more direct issues (like political manipulation) that will remain an issue even with “strong privacy.”
replies(16): >>mcv+q >>hedora+H >>lewhoo+H1 >>fsflov+N1 >>poison+V1 >>tjoff+B2 >>mjburg+43 >>Clumsy+V5 >>signal+Ya >>nunez+eg >>falsab+ik >>equals+yl >>lm2846+Bn >>__Matr+yp >>swatco+qs >>tsukik+Hw
2. mcv+q[view] [source] 2023-11-02 12:24:12
>>dr_dsh+(OP)
Political manipulation is a lot easier if the manipulator knows everything about you.
replies(1): >>Nextgr+w1
3. hedora+H[view] [source] 2023-11-02 12:25:49
>>dr_dsh+(OP)
Modern political manipulation techniques rely on the same data as ad targeting.

Ad targeting infrastructure is expensive and hard to hide, so banning it would defeat many political manipulation attacks.

◧◩
4. Nextgr+w1[view] [source] [discussion] 2023-11-02 12:31:40
>>mcv+q
And doesn't even have to build & fund the infrastructure themselves.

Same for government surveillance - adtech/marketing is a boon to it because they don't even have to build/maintain their own surveillance infrastructure anymore.

replies(1): >>thfura+hb
5. lewhoo+H1[view] [source] 2023-11-02 12:32:39
>>dr_dsh+(OP)
Maybe not illegal but optional (opt in not opt out). More issues are of course at hand but strong privacy should be a step in the right direction regardless. Let's not forsake traffic violations just because there are killers on the loose.
6. fsflov+N1[view] [source] 2023-11-02 12:33:14
>>dr_dsh+(OP)
https://en.wikipedia.org/wiki/Nothing_to_hide_argument
replies(1): >>sensan+06
7. poison+V1[view] [source] 2023-11-02 12:33:47
>>dr_dsh+(OP)
Wait until you read about Cambridge Analytica.
replies(4): >>shadow+M9 >>cm2012+ft >>briand+rz >>arrows+LE
8. tjoff+B2[view] [source] 2023-11-02 12:36:56
>>dr_dsh+(OP)
Why should X be allowed to track me against my will?
9. mjburg+43[view] [source] 2023-11-02 12:39:23
>>dr_dsh+(OP)
The ability to aggregate personal information of large numbers of people is a form of political power. Facebook can, if it so wishes, provide a list of all gay people in an area; all supporters of Gaza or of Israel; all people who have recently commented on an article about drugs.

The very ability to provide that list previously required an expensive secret police; today it does not.

This is an extremely dangerous ability for anyone to have -- human rights (such as that to privacy) were won against oppression. They aren't optional, they're the system by which we prevent those previous eras from reoccurring.

This is why I'm suspicious of there being a meaningful sense of 'consent' here -- if enough people consent, then the tracking agency acquires a novel political power over everyone. This is why the infrastructure of tracking itself is a concern.

replies(2): >>lefsta+t5 >>dr_dsh+ND1
◧◩
10. lefsta+t5[view] [source] [discussion] 2023-11-02 12:54:14
>>mjburg+43
Yes, as can Apple, Google, every cable company, every telecom, the credit card companies, Grindr, health clinics
replies(6): >>mjburg+K7 >>staunt+18 >>kibwen+N9 >>renega+Xc >>nunez+wg >>Noboru+0r
11. Clumsy+V5[view] [source] 2023-11-02 12:57:24
>>dr_dsh+(OP)
> Why should it be illegal? I don’t understand the moral threat

Okay, then it should be legal for me to use Facebook's/Google's 'Intellectual Property' however I want.

Why should it be legal for them to steal my data, but illegal for me to use theirs?

replies(1): >>pb7+Vh
◧◩
12. sensan+06[view] [source] [discussion] 2023-11-02 12:58:03
>>fsflov+N1
https://www.cs.ru.nl/~jhh/pub/secsem/solove-nothing-to-hide....
replies(1): >>teddyh+09
◧◩◪
13. mjburg+K7[view] [source] [discussion] 2023-11-02 13:06:59
>>lefsta+t5
Hence the GDPR's distinction between data essential for operating the basic service vs. broader collection.

I take your point that some level of this power exists, necessarily, within internet-based companies with online users.

But I think there's a big difference between, say, signing up to Grindr, where you submit a basic form with limited information and (presumably) can retain some minimal anonymity in how you use the app --- and a system whereby the history of all your actions across your online life (banking, social media, dating apps, etc.) is collectable by a centralised agency.

With laws like the GDPR, broad datasets have become a liability for companies like telecoms, banks, etc. They don't want them. Accidentally forming 'rich user profiles' based on non-anonymous data is a legal liability.

This is exactly the incentive structure needed, rather than having companies with an existential profit motive to build mass surveillance systems.

As far as whether a relational database that takes user data from a form is different from a whole system of streaming live event databases with massive streams of user monitoring across websites --- well, I think it wouldn't be hard to write a law against the latter.

These are political, moral, legal and technical distinctions that can be drawn.

◧◩◪
14. staunt+18[view] [source] [discussion] 2023-11-02 13:08:14
>>lefsta+t5
Your point being?
replies(1): >>goodpo+nq
◧◩◪
15. teddyh+09[view] [source] [discussion] 2023-11-02 13:13:26
>>sensan+06
The Eternal Value of Privacy, by Bruce Schneier in 2006: <https://www.wired.com/2006/05/the-eternal-value-of-privacy/>
◧◩
16. shadow+M9[view] [source] [discussion] 2023-11-02 13:17:47
>>poison+V1
Cambridge Analytica basically didn't work. For all the data it collected, it amounted to trying to build political influence by reading tea leaves.

The strongest decider in the outcome of the 2016 election is that the Trump campaign spent something like a factor of three more on advertising across the board than the Clinton campaign. Cambridge Analytica was more representative of the notion that the Republicans were willing to spend money on anything that might work than on the efficacy of that specific approach.

At the end of the day, that election came down to a combination of sexism in the voting base (Clinton's gender had a demonstrable effect on turnout among non-voters to vote against her; Americans don't want to admit it but in their hearts they're still pretty sexist) and good old fashioned, well understood rules of how spending on ads can move an election by a percentage point or two. The Democratic party as a whole believed Trump to be so unelectable that they pulled money from the presidential campaign to push it into campaigns down ticket in an attempt to win a massive political coup by controlling the House, Senate, and Presidency at the same time; they underestimated the political position of their opponents and it backfired spectacularly.

replies(2): >>crtasm+Lg >>mcpack+k41
◧◩◪
17. kibwen+N9[view] [source] [discussion] 2023-11-02 13:17:49
>>lefsta+t5
Yes, and the handling of all that personal data should be strictly regulated. Ideally, companies would treat all of it as toxic waste and dispose of it as soon as possible.
replies(1): >>thmsth+4O
18. signal+Ya[view] [source] 2023-11-02 13:24:05
>>dr_dsh+(OP)
The actual risk is data brokers, who aggregate this data and can use it for anything.

Using it to sell you shampoo isn’t terrible (it can be super annoying though). The problem is using that data to, e.g., figure out who might be in the market for pregnancy-related products. Or, ominously, who has stopped buying pregnancy-related products early.

Or correlating interest in something they browse with voting intentions. Or interest in political action. There’s a lot of dodgy things you can do with that data. And little of this is being shared with *informed consent*.

Ads are NOT the problem. I love browsing through ads in magazines I buy. If online ads worked like outdoor ads or magazine ads, I suspect a lot fewer people would have a problem.

◧◩◪
19. thfura+hb[view] [source] [discussion] 2023-11-02 13:25:40
>>Nextgr+w1
Not only that, but they can now just buy information that they wouldn't legally be able to collect themselves.
◧◩◪
20. renega+Xc[view] [source] [discussion] 2023-11-02 13:33:57
>>lefsta+t5
A credit card company does not have a microphone near your bed; they do not know which posts you like or don't, they do not have all your mail, and they don't necessarily know where you drive and how often.

Keeping all of that data under one company umbrella is vulnerable: a target for hackers, and an easy target for governments.

Your post is not correct.

replies(4): >>corned+Vj >>garden+ak >>s3p+cr >>IX-103+zS
21. nunez+eg[view] [source] 2023-11-02 13:51:39
>>dr_dsh+(OP)
Because I don't want some private entity using my habits and preferences (which I didn't ask them to collect, btw, and, no, throwing a five-page manifesto typed in 8-pt font at me and making me agree to it or else doesn't count) against me and manipulating how I think

it's more than the ads. imagine if hacker news used ML to determine what articles you see on the front page based on whatever ad campaigns they think will result in a click from you. that would suck, right?

that's what these platforms do though, and that's not okay

◧◩◪
22. nunez+wg[view] [source] [discussion] 2023-11-02 13:52:20
>>lefsta+t5
Apple, Google, and Grindr aside, those industries are heavily regulated to prevent exactly this
replies(1): >>j16sdi+Jq
◧◩◪
23. crtasm+Lg[view] [source] [discussion] 2023-11-02 13:53:08
>>shadow+M9
What's your view on their effect on elections in other countries?

https://en.wikipedia.org/wiki/Cambridge_Analytica#Elections

replies(3): >>shadow+hk >>hef198+Xk >>cm2012+xt
◧◩
24. pb7+Vh[view] [source] [discussion] 2023-11-02 14:00:17
>>Clumsy+V5
They’re not stealing it, you’re giving it to them in exchange for using the products for free.
replies(3): >>dotnet+Wj >>Santal+uz >>Clumsy+Fa1
◧◩◪◨
25. corned+Vj[view] [source] [discussion] 2023-11-02 14:09:12
>>renega+Xc
> and not necessarily know where you drive and how often.

They do, though. They know where you refueled/charged your car and which hotels you've booked. Not only that, but they also know if you donate money to your local mosque/synagogue, spend just a bit too much at a liquor store, etc.

replies(2): >>Anders+Wk >>gpvos+6r
◧◩◪
26. dotnet+Wj[view] [source] [discussion] 2023-11-02 14:09:13
>>pb7+Vh
This is not true, as can be seen from Facebook maintaining 'shadow profiles' on people who don't have actual Facebook accounts but can in any way be connected to them (eg through third party data).
replies(1): >>xigoi+5s
◧◩◪◨
27. garden+ak[view] [source] [discussion] 2023-11-02 14:09:59
>>renega+Xc
While I am all for privacy, which companies enable the microphone next to your bed and collect personal data with it?
replies(3): >>sidlls+im >>justin+Zp >>renega+2s
◧◩◪◨
28. shadow+hk[view] [source] [discussion] 2023-11-02 14:10:36
>>crtasm+Lg
I don't have nearly enough visibility on the political process of other countries to comment. I can only extrapolate by observing that Cambridge Analytica's core business value (harvesting personal data, using it to build a political profile, and microtargeting ads based on that profile) was functionally no better than just "spending more on advertising" in the American 2016 race. So I put burden of proof on those who claim otherwise to show that CA had meaningful influence in any other race.

It would be more surprising to me if they were uniquely unable to build a working psychological profile of an American voter, versus any other voter, than the simpler scenario that their entire concept was technological snake oil.

29. falsab+ik[view] [source] 2023-11-02 14:10:37
>>dr_dsh+(OP)
You assume the people stealing your private details to use against you are going to use them unaltered and in context. With enough bad faith and some manipulation, or just a bit of a twist, anything can be weaponized against you.

If private data is stored, there's already a chance of it getting out. You may get lucky, you may not, but for someone that hates you enough, any random detail can be a weapon. Even stuff that doesn't depend on your actions, like religion, country of origin or even medical details. People you associate with, even at a superficial level, can make you guilty by association. And let's not get into stuff like porn habits...

Political manipulation can be made real easy if they got dirt on you, too.

◧◩◪◨⬒
30. Anders+Wk[view] [source] [discussion] 2023-11-02 14:13:41
>>corned+Vj
You can still opt to take money out of an ATM and pay by cash to avoid being tracked by the card company. Meta does not offer the same option
replies(1): >>4RealF+qt
◧◩◪◨
31. hef198+Xk[view] [source] [discussion] 2023-11-02 14:13:48
>>crtasm+Lg
There are other countries? /s
32. equals+yl[view] [source] 2023-11-02 14:17:04
>>dr_dsh+(OP)
The moral threat is that your private data is processed and transformed into influence. And it has been the case over and over again that that influence is wielded without accountability or care for those being targeted. The methods of applying that influence are sometimes sophisticated, sometimes crude, but almost always effective. The only protection an individual has against this is privacy. Yes, there are other forms of influence, but targeted campaigns that feed on data that _should_ be private are vastly more toxic.
◧◩◪◨⬒
33. sidlls+im[view] [source] [discussion] 2023-11-02 14:20:54
>>garden+ak
At least TikTok definitely uses data from microphones somehow, without any (explicit) consent.
replies(1): >>garden+mH
34. lm2846+Bn[view] [source] 2023-11-02 14:27:08
>>dr_dsh+(OP)
> like political manipulation

Take 5 minutes to imagine how political troll campaigns are targeting their audience...

35. __Matr+yp[view] [source] 2023-11-02 14:36:16
>>dr_dsh+(OP)
How does political manipulation look in a world where the platforms can't tell two users apart? Without the ability to tell a different story to each user, you're left trying to sway the masses all together. That's not manipulation, that's democracy.
◧◩◪◨⬒
36. justin+Zp[view] [source] [discussion] 2023-11-02 14:39:29
>>garden+ak
Google Nest? Amazon Alexa? Apple Siri?
◧◩◪◨
37. goodpo+nq[view] [source] [discussion] 2023-11-02 14:41:30
>>staunt+18
Just whataboutism, it seems.
◧◩◪◨
38. j16sdi+Jq[view] [source] [discussion] 2023-11-02 14:43:05
>>nunez+wg
They are heavily regulated, but it's hard to audit or prove in court
◧◩◪
39. Noboru+0r[view] [source] [discussion] 2023-11-02 14:44:18
>>lefsta+t5
Right, which is why many of those companies are also commonly criticised on privacy grounds.
◧◩◪◨⬒
40. gpvos+6r[view] [source] [discussion] 2023-11-02 14:44:28
>>corned+Vj
Well, then we should re-engineer things so that they don't, or regulate them heavily.
◧◩◪◨
41. s3p+cr[view] [source] [discussion] 2023-11-02 14:45:04
>>renega+Xc
>Keeping all of the data under one company umbrella

This part of your post is also not correct. What company knows everything about you? There are insurance companies, credit card companies, social media companies... they all have a substantial amount of info about you, but they don't all collude to aggregate it.

◧◩◪◨⬒
42. renega+2s[view] [source] [discussion] 2023-11-02 14:48:31
>>garden+ak
Every voice-assistant-enabled company stores voice audio. Some by accident, some to "make" the voice assistant better, some to do other stuff with it. Your voice is stored for an undisclosed time on their servers; it does not have to be stored for long. Unknown contractors look at the footage from your iRobot or Tesla.

No regulator will find proof of anything, though, as regular employees will not have access to such crucial data. Regulators can also be fooled by the maze of interfaces and servers.

There is also an incentive for governments to "not see" any wrongdoing by the companies, if they profit from the surveillance system.

The ad business is like the palantír in Lord of the Rings: you do not know who is watching on the other side.

All you have is some "vague" promise from corporations that your data is properly removed.

◧◩◪◨
43. xigoi+5s[view] [source] [discussion] 2023-11-02 14:48:45
>>dotnet+Wj
Also, Facebook encourages users to tag people in photos even if they don't have a Facebook account.
44. swatco+qs[view] [source] 2023-11-02 14:50:07
>>dr_dsh+(OP)
> The moral threat

Forget fussy debates about morality.

There is a practical threat to society when a few nation-sized corporations operate pipelines of data collection and profile aggregation on every online citizen of the world.

Those profiles represent a massive amount of power, and that power is being allowed to accumulate in opaque organizations that have no explicit commitment to public benefit and extremely little accountability. That power is not yet being weaponized, but it doesn’t evaporate just because nobody’s using it for leverage or control yet.

The responsible, long-term, practical way to ensure that legitimate governments and the people that constitute them continue to have the power to shape their own society is to make sure that these techniques for accumulating power are dismantled and the already-accumulated power is dissipated.

Yes, we will lose some novelties and baubles in our online life when they can’t track us anymore. Yes, investing new power into government so that it can counter corporate profile-accumulation is dangerous as well.

But the greater danger of inaction against these corporations is that they are already only lightly-accountable and are on the verge of escape from accountability forever if they gain enough power. Modern governments, meanwhile, are comparatively slow and dumb and can still be steered as their dangers become manifest.

◧◩
45. cm2012+ft[view] [source] [discussion] 2023-11-02 14:53:53
>>poison+V1
FYI - referring to Cambridge Analytica like it has any meaningful relationship to privacy, ad tech or election results is the silliest thing you can say to someone who has any understanding of the subject. Mentioning Cambridge Analytica is like a canary in a coal mine that says, "this person has no actual understanding of the issues".
◧◩◪◨⬒⬓
46. 4RealF+qt[view] [source] [discussion] 2023-11-02 14:54:37
>>Anders+Wk
Do you have to use Meta? I'm not arguing for or against privacy here, just trying to point out that you still have options. It might be a pain to contact loved ones, check in on friends, etc., but so is using an ATM.
replies(2): >>phone8+oX >>waveBi+4Z
◧◩◪◨
47. cm2012+xt[view] [source] [discussion] 2023-11-02 14:55:03
>>crtasm+Lg
As a professional advertiser who has worked in political ads, I can tell you it had no effect on any of the results of these campaigns. A bunch of bad people hired them after the Trump campaign because they bought the fluff too.
replies(1): >>crtasm+T2b
48. tsukik+Hw[view] [source] 2023-11-02 15:06:24
>>dr_dsh+(OP)
Cool! I trust you won't object if I put a webcam in your bedroom, then?
◧◩
49. briand+rz[view] [source] [discussion] 2023-11-02 15:15:48
>>poison+V1
Wait until you read about tactics Obama used before Cambridge..

https://www.investors.com/politics/editorials/facebook-data-...

◧◩◪
50. Santal+uz[view] [source] [discussion] 2023-11-02 15:15:54
>>pb7+Vh
It seems dishonest to frame it as a market transaction when it clearly isn't. There is no explicit agreement, and most users probably don't have the slightest clue what data they're giving up.
◧◩
51. arrows+LE[view] [source] [discussion] 2023-11-02 15:37:11
>>poison+V1
Wait until you learn that Cambridge Analytica had no discernible effect on the 2016 US presidential election or the Brexit referendum and this entire "scandal" was bullshit: https://truthonthemarket.com/2019/08/27/7-things-netflixs-th...
◧◩◪◨⬒⬓
52. garden+mH[view] [source] [discussion] 2023-11-02 15:46:35
>>sidlls+im
So they've hacked the permissions on the phone?
replies(1): >>sidlls+Ai2
◧◩◪◨
53. thmsth+4O[view] [source] [discussion] 2023-11-02 16:07:59
>>kibwen+N9
This, exactly this. A big part of what got us into this mess is that data is very very cheap to collect, store and process with modern computing. And there is basically no other cost or downside to dealing with the data. This has led to a gold rush where every company became obsessed with data, thinking that any piece of data was valuable and could be monetized eventually.

If however there were strict liabilities for data leaks or privacy breaches, businesses would collect just the bare minimum data and get rid of it as soon as it is not strictly needed.

◧◩◪◨
54. IX-103+zS[view] [source] [discussion] 2023-11-02 16:22:49
>>renega+Xc
Do you have evidence that these companies are using the microphones in their devices for tracking? Or that they are using "all your emails" for ad targeting? If not please quit kicking that strawman.

There are real privacy issues here, but this kind of paranoia distracts us from mitigating the actual threats and has us jumping at shadows.

◧◩◪◨⬒⬓⬔
55. phone8+oX[view] [source] [discussion] 2023-11-02 16:38:02
>>4RealF+qt
"Simply create your own social media company and get all the people you want to keep in contact with to move to it"
replies(1): >>4RealF+oq4
◧◩◪◨⬒⬓⬔
56. waveBi+4Z[view] [source] [discussion] 2023-11-02 16:43:39
>>4RealF+qt
Shadow profiling doesn't care whether you directly use Meta products. You are information about your friends, and another data point for trends.
◧◩◪
57. mcpack+k41[view] [source] [discussion] 2023-11-02 17:02:42
>>shadow+M9
> The strongest decider in the outcome of the 2016 election is that the Trump campaign spent something like a factor of three more on advertising across the board than the Clinton campaign.

Can you cite your campaign spending numbers? Wikipedia says the opposite: https://en.wikipedia.org/wiki/2016_United_States_presidentia... I'm searching for a source that says what you claim and can't find any: https://www.google.com/search?q=trump+campaign+advertising+o... https://www.google.com/search?q=trump+outspent+hillary https://www.google.com/search?q=trump+advertising+spending+v... https://www.google.com/search?q=hillary+spent+less+than+trum...

Is my google-fu shit? Maybe. Regardless...

> The Democratic party as a whole believed Trump to be so unelectable that they pulled money from the presidential campaign

The root cause was their arrogance. Hillary was barely campaigning at all. It would not have cost her much of anything to call into the major news channels every day^, but instead Hillary was effectively incommunicado for much of 2016. It's as if she thought campaigning was beneath her.

Also, I don't know how you can break apart Americans disliking women in general from Americans disliking Hillary in particular. She personally had been a popular target for derision for more than 20 years before her 2016 campaign. The DNC may have considered Trump unelectable, but they were burying their heads in the sand w.r.t. Hillary's own unelectability problem. Which goes back to the arrogance thing...

At least they've figured it out now. Nobody seriously talked about her for 2020 and nobody is seriously suggesting her for 2024.

^ Most of Trump's 'advertising' was given to him for free in this manner, maybe you're assigning some arbitrary dollar value to this news coverage to say he spent more?

replies(1): >>shadow+Re1
◧◩◪
58. Clumsy+Fa1[view] [source] [discussion] 2023-11-02 17:25:33
>>pb7+Vh
Google ads track you, and Facebook tracking pixels keep profiles on you even if you never used any of their services. It’s literal theft. It’s just allowed for corporations, but if I do it, it’s harassment and stalking.
◧◩◪◨
59. shadow+Re1[view] [source] [discussion] 2023-11-02 17:39:37
>>mcpack+k41
Ah, thank you for calling me out on that; I completely misremembered the anecdote.

It wasn't total spend; it was online campaign spend. "Chaos Monkeys" cites a Bloomberg report on an internal Facebook memo that indicates the Trump campaign ran six million different ads on FB during the campaign and the Clinton campaign ran 1/100th of that amount. So targeted ads were involved, but the targeting approach was very traditional: pay a bunch of advertisers a lot of money to hand-tune ads, see how they perform, re-tune, rinse, repeat. The spend on Cambridge Analytica as a ratio and the effect it had on the total process were both minimal; CA didn't prove to be the "voter whisperer" that the owners made them out to be, and in the long run, the fact that they exfiltrated a bunch of private content from Facebook's datastores isn't as interesting as how the Trump campaign took advantage of the data in Facebook's datastores using the tools Facebook legitimately provides.

One feature the campaign did (according to the author of Chaos Monkeys) find useful was "Lookalike Audiences," which is nothing fancier than crawling the social media graph and expanding an initial targeted ad along friend networks (i.e. if an ad seems to be resonating with you, Facebook's own algorithm, if the advertiser has enabled the feature, will try pushing the ad to your friends and so on). In that sense, the data Facebook collected facilitated a Trump victory, though it wasn't anything more dangerous than the social graph itself... And I don't think the EU is proposing we ban social media or collecting networks of friends at this time.

... though maybe they should? You can do a lot of damage with the information people voluntarily share about who they associate with, if you collect enough of it.
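(The "Lookalike Audiences" mechanism as described above is, at bottom, just graph expansion along friendship edges. A toy sketch of that idea -- the graph and names are made up, and this is obviously not Facebook's actual implementation:)

```python
# Toy sketch of the lookalike-audience idea: start from a seed
# audience and expand outward along the friend graph, up to a
# fixed number of hops. All data here is hypothetical.
from collections import deque

def expand_audience(friend_graph, seed_users, max_hops=2):
    """BFS expansion of a seed audience along friendship edges."""
    audience = set(seed_users)
    frontier = deque((u, 0) for u in seed_users)
    while frontier:
        user, hops = frontier.popleft()
        if hops >= max_hops:
            continue  # stop expanding past the hop limit
        for friend in friend_graph.get(user, ()):
            if friend not in audience:
                audience.add(friend)
                frontier.append((friend, hops + 1))
    return audience

graph = {
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
    "carol": ["alice"],
    "dave": ["bob", "erin"],
}
# One hop out from alice reaches her direct friends only.
print(sorted(expand_audience(graph, {"alice"}, max_hops=1)))
```

(In the real product the expansion is presumably gated on ad performance rather than blind hops, but the point stands: the only input needed is the social graph itself.)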

> I don't know how you can break apart Americans disliking women in general and Americans disliking Hillary particularly

A good and fair question. So it turns out one of the largest blocs of votes in the 2016 election was various flavors of Christianity, and they generally chose to vote for a known womanizer and divorcé (with Protestants and Catholics, in particular, voting for Trump by a wide margin over Clinton). This would be considered curious behavior, except scraping the surface only a tiny amount reveals that they are almost 100% unified against the concept of women in a leadership position; some have structural taboos against it, and to some it is an existential threat in the category "God will strike us down for our hubris" because it goes against their notion of a cosmic order. It's ugly and I wish it were not so, but I think most political pundits wildly under-estimated that effect because Clinton was the first woman to be nominated by one of the two major parties in America, so their prediction models had no data on what effect it would have. I agree that the fact she already had a political service history that could be criticized (vs. her opponent with no such service) was also a factor, but I don't think it was as large a factor as the voters who turned out with fear of actual divine retribution in their hearts due to their religious beliefs.

◧◩
60. dr_dsh+ND1[view] [source] [discussion] 2023-11-02 19:33:22
>>mjburg+43
You can frame it as tracking, but the fact is that the aggregation of data about people happens almost without intent. In order to provide services, you need a unique ID, and people want to be sharing and posting information. Facebook might be oriented around personalized ads, but even Friendster was based on an enormous amount of shared personal data. When we use tools like Facebook and Instagram and others, we want to provide our data. When I use ChatGPT, I want to provide my data.

I think there are very good economic reasons why companies don’t dox their customers. They treat data cautiously even in the absence of regulation— since it would be a loss of business value to lose customer trust.

When we call for “privacy” — what does it mean when we want to share our data? Ok, one might say that you don’t want 3rd party sites tracking etc etc. That’s fine. You don’t want data sold. That’s fine. But if we make a big fuss about privacy in a world where we want to share so much personal information, I think we cloud the issues. We want a lot more than privacy, obviously, when we are so willing to give it up. I want those other desires made more clear and not lumped in as privacy. I think the GDPR just trains people to click “accept.”

Do you see my concern?

◧◩◪◨⬒⬓⬔
61. sidlls+Ai2[view] [source] [discussion] 2023-11-02 22:37:01
>>garden+mH
I don’t know how, I haven’t looked at or reverse engineered the code. What I do know is that topics I discuss with friends and family suddenly appear in the form of ads and user generated content in my FYP, when none of us has done a search or engaged in other online activity associated with them. Sometimes literally while we’re talking.
replies(1): >>garden+5m2
◧◩◪◨⬒⬓⬔⧯
62. garden+5m2[view] [source] [discussion] 2023-11-02 22:55:15
>>sidlls+Ai2
Isn't this just because they have so much data on you already?

e.g. what are three 30-year-olds in Toronto talking about this Saturday? Likely the Drake concert, winter tires and Xmas-related things...

replies(1): >>sidlls+8Wb
◧◩◪◨⬒⬓⬔⧯
63. 4RealF+oq4[view] [source] [discussion] 2023-11-03 14:40:14
>>phone8+oX
When did I say anything like that? You can call or txt to check in on friends.
◧◩◪◨⬒
64. crtasm+T2b[view] [source] [discussion] 2023-11-05 19:54:02
>>cm2012+xt
Noting that CA and their parent company SCL were working on elections for years before the Trump campaign.
◧◩◪◨⬒⬓⬔⧯▣
65. sidlls+8Wb[view] [source] [discussion] 2023-11-06 03:13:59
>>garden+5m2
Even if your scenario were accurate, it's unlikely their data processing algorithms would pin the conversation down to within the hour of it happening, given that the schedules I keep are not fixed or typical in any way. Also, in my case for some of these, it's exceptionally unlikely given the individuals involved (a 20-something guy and a 40-something guy with odd senses of humor and esoteric interests in a romantic relationship).
[go to top]