I was an early employee at Disqus (YC S07). I helped build its third-party commenting plugin, because I had a blog and believed in distributed discussion. I genuinely believe the company did too (the founders, the employees, etc.). But eventually the rubber meets the road, and Disqus was sold years later to Zeta Global [1], a “data-driven marketing company”.
As long as you have a database in the cloud with a non-trivial amount of user data, you don’t really have control over what becomes of it.
[1] https://techcrunch.com/2017/12/05/zeta-global-acquires-comme...
sure you do, as long as you remain in charge. once you sell it, of course you don't really have control. duh. but to say that just because data exists you can't decide what to do or not do with it is absurd.
The parent comment said “every single person in adtech is complicit”.
Most employees do not possess that level of agency about what happens to their work.
> As long as you have a database in the cloud with a non-trivial amount of user data, you don’t really have control over what becomes of it.
Who I work for and what I do is where the agency lies. If I want to better influence what happens to my work, I can make sure my work doesn't carry this abuse incentive.
But in practice, almost everyone with one of these databases holding a significant amount of data is working for an entity with shareholders and creditors. It's much harder to stay in control forever in that world, especially if your company is not perpetually successful. Companies decline or fold all the time. Then they get sold off.
I agree, but I don't want anyone to think blame should be shared equally (not that I think you were suggesting this, but others could interpret it this way). I also don't think that being complicit in an act equates to sharing equal blame. There is accountable complicity and unaccountable complicity.

This is why it is important to think about how our work can be abused. It always will be abused, and you'll never be able to figure out all the creative ways it can be. But that doesn't mean you shouldn't try. Intentional ignorance is accountable complicity. Everything is on a spectrum: complicity, blame, abuse, ignorance, etc. We need to be careful in how we discretize the bins, and make sure we don't binarize. There is unwilling and unwitting complicity, and that should be accounted for. A lot does depend on how much control you have, as you said. But I also think recognition matters: if you realize that the tool is being or will be abused, then the complicity is more explicit and accountable.
Things are easy to see post hoc, and we can always think "what if." I hope that, given what you wrote, you don't blame yourself. But now you, and others, have better tools to predict potential abuse. After all, the road to hell is paved with good intentions (the road to heaven is paved with the same stuff, which is why it is hard to distinguish the two).
there's a lot more to having startup culture than getting investment daddies.