Personalised ads are beside the point. The issue is how they are personalised, namely by building a rich profile of user behaviour based on non-consensual tracking.
It isn't even clear that there's a meaningful sense of 'consent' to what modern ad companies (i.e., Google, Facebook, Amazon, and increasingly Microsoft) do. There is both an individual harm and a massive collective harm in the infrastructure of behavioural tracking that these companies have built.
This infrastructure should largely be illegal. The technology to end any form of privacy is presently deployed only for ads, but it should not be deployed anywhere at all.
The very ability to compile such a profile of anyone's behaviour previously required an expensive secret police; today it does not.
This is an extremely dangerous ability for anyone to have -- human rights (such as the right to privacy) were won against oppression. They aren't optional; they're the system by which we prevent those previous eras from recurring.
This is why I'm suspicious that there's a meaningful sense of 'consent' here -- if enough people consent, the tracking agency acquires a novel political power over everyone. This is why the infrastructure of tracking itself is a concern.
I think there are very good economic reasons why companies don't dox their customers. They treat data cautiously even in the absence of regulation, since losing customer trust would be a loss of business value.
When we call for "privacy" -- what does it mean when we also want to share our data? OK, one might say you don't want third-party sites tracking you. That's fine. You don't want data sold. That's fine. But if we make a big fuss about privacy in a world where we want to share so much personal information, I think we cloud the issues. We clearly want a lot more than privacy, given how willing we are to give it up. I want those other desires made explicit and not lumped in as privacy. I think the GDPR just trains people to click "accept."
Do you see my concern?