zlacker

[return to "The US government is buying troves of data about Americans"]
1. fullsp+3A2 2023-06-13 13:59:59
>>benwer+(OP)
Every single person working in the adtech industry is complicit in this.

Joseph Cox’s reporting on the geolocation/tracking shit the US Gov buys up really highlights the direct link between consumer tracking (to sell them shit) and government intrusion into privacy.

2. bentle+SL2 2023-06-13 14:51:07
>>fullsp+3A2
You don’t even have to work explicitly in adtech.

I was an early employee at Disqus (YC S07). I helped build its 3rd-party commenting plugin, because I had a blog and believed in distributed discussion. I genuinely believe the company did too (the founders, the employees, etc.). But eventually the rubber hit the road, and Disqus was sold years later to Zeta Global [1], a “data-driven marketing company”.

As long as you have a database in the cloud with a non-trivial amount of user data, you don’t really have control over what becomes of it.

[1] https://techcrunch.com/2017/12/05/zeta-global-acquires-comme...

3. godels+4X3 2023-06-13 19:27:35
>>bentle+SL2
> You don’t even have to work explicitly in adtech.

I agree, but I don't want anyone to think blame should be shared equally (not that I think you were suggesting this, just that others could interpret it that way). I also don't think that being complicit in an act equates to sharing blame: there is accountable complicity and unaccountable complicity. That's exactly why it's important to think about how our work can be abused. It always will be abused, and you'll never anticipate all the creative ways it can be, but that doesn't mean you shouldn't try. Intentional ignorance is accountable complicity.

Everything here is on a spectrum: complicity, blame, abuse, ignorance, etc. We need to be careful about how we discretize the bins, and make sure we don't collapse them into a binary. There's unwilling and unwitting complicity, and that should be accounted for. How much control you have matters a lot, as you said. But recognition matters too: if you realize a tool will be, or is being, abused, then your complicity is more explicit and more accountable.

Things are easy to see post hoc, and we can always ask "what if." I hope that, given what you wrote, you don't blame yourself. But now you, and others, have better tools to predict potential abuse. After all, the road to hell is paved with good intentions (the road to heaven is paved with the same stuff, which is why the two are hard to tell apart).
