zlacker

[return to "EU data regulator bans personalised advertising on Facebook and Instagram"]
1. mjburg+kc[view] [source] 2023-11-02 11:58:07
>>pbrw+(OP)
Comments here so far focus on personalised ads as the issue -- but that's a symptom of what's being banned, which is the mass collection of personal data.

Personalised ads are beside the point. The issue is how they are personalised, namely by building a rich profile of user behaviour based on non-consensual tracking.

It isn't even clear that there's a meaningful sense of 'consent' to what modern ad companies (i.e., Google, Facebook, Amazon, and increasingly Microsoft) do. There is both an individual harm and a massive collective harm in the infrastructure of behavioural tracking that these companies have built.

This infrastructure should be, largely, illegal. The technology to end any form of privacy is presently deployed only for ads, but should not be deployed anywhere at all.

2. dr_dsh+kg[view] [source] 2023-11-02 12:22:21
>>mjburg+kc
Why should it be illegal? I don’t understand the moral threat. Personally I feel that privacy gets too much airtime as a value — I see lots of other more direct issues (like political manipulation) that will remain an issue even with “strong privacy.”
3. mjburg+oj[view] [source] 2023-11-02 12:39:23
>>dr_dsh+kg
The ability to aggregate the personal information of large numbers of people is a form of political power. Facebook can, if it so wishes, produce a list of all gay people in an area; all supporters of Gaza or of Israel; all people who have recently commented on an article about drugs.

The very ability to provide that list previously required an expensive secret police; today it does not.

This is an extremely dangerous ability for anyone to have -- human rights (such as the right to privacy) were won against oppression. They aren't optional; they're the system by which we prevent those previous eras from recurring.

This is why I'm suspicious of there being a meaningful sense of 'consent' here -- if enough people consent, the tracking agency acquires a novel political power over everyone, including those who didn't. This is why the infrastructure of tracking is itself the concern.

4. dr_dsh+7U1[view] [source] 2023-11-02 19:33:22
>>mjburg+oj
You can frame it as tracking, but the fact is that the aggregation of data about people happens almost without intent. In order to provide services, you need unique IDs— and people want to be sharing and posting information. Facebook might be oriented around personalized ads, but even Friendster was built on an enormous amount of shared personal data. When we use tools like Facebook and Instagram and others, we want to provide our data. When I use ChatGPT, I want to provide my data.

I think there are very good economic reasons why companies don’t dox their customers. They treat data cautiously even in the absence of regulation— since losing customer trust would be a loss of business value.

When we call for “privacy,” what does it mean in a world where we want to share our data? Ok, one might say you don’t want third-party sites tracking you, etc. That’s fine. You don’t want your data sold. That’s fine. But if we make a big fuss about privacy in a world where we want to share so much personal information, I think we cloud the issues. We clearly want a lot more than privacy when we are so willing to give it up. I want those other desires made explicit, not lumped in as privacy. I think the GDPR just trains people to click “accept.”

Do you see my concern?
