zlacker

1. bentle+(OP)[view] [source] 2023-06-13 14:51:07
You don’t even have to work explicitly in adtech.

I was an early employee at Disqus (YC S07). I helped build its third-party commenting plugin, because I had a blog and believed in distributed discussion. I genuinely believe the company did too (the founders, the employees, etc.). But eventually the rubber meets the road: years later, Disqus was sold to Zeta Global [1], a “data-driven marketing company”.

As long as you have a database in the cloud with a non-trivial amount of user data, you don’t really have control over what becomes of it.

[1] https://techcrunch.com/2017/12/05/zeta-global-acquires-comme...

replies(2): >>dylan6+Jo >>godels+cb1
2. dylan6+Jo[view] [source] 2023-06-13 16:25:06
>>bentle+(OP)
>you don’t really have control over what becomes of it.

Sure you do, as long as you remain in charge. Once you sell it, of course you don't really have control. Duh. But saying that just because data exists you can't decide what to do or not do with it is absurd.

replies(4): >>bentle+7v >>boplic+av >>majorm+yD >>Veserv+YF
3. bentle+7v[view] [source] [discussion] 2023-06-13 16:46:58
>>dylan6+Jo
If you are the majority shareholder and that never changes, then sure.

The parent comment said “every single person in adtech is complicit”.

Most employees do not possess that level of agency over what happens to their work.

replies(1): >>lcnPyl+7z
4. boplic+av[view] [source] [discussion] 2023-06-13 16:47:18
>>dylan6+Jo
Just to be clear: startup culture is literally about giving up control in exchange for a cash infusion and a chance to Make It Big.
replies(1): >>dylan6+9V
5. lcnPyl+7z[view] [source] [discussion] 2023-06-13 17:03:27
>>bentle+7v
I think your conclusion is countered by something else you wrote:

> As long as you have a database in the cloud with a non-trivial amount of user data, you don’t really have control over what becomes of it.

Who I work for and what I do is where the agency lies. If I want more influence over what happens to my work, I can make sure my work doesn't carry this abuse incentive.

6. majorm+yD[view] [source] [discussion] 2023-06-13 17:22:52
>>dylan6+Jo
I mean, taken to this extreme: you're gonna die sometime, and you don't have control after that. You could try to set up a trust, etc., but on a long enough timeline, everything changes control.

But in practice, almost everyone holding a database with a significant amount of user data is working for an entity with shareholders and creditors. It's much harder to stay in control forever in that world, especially if your company is not perpetually successful. Companies decline or fold all the time, and then they get sold off.

7. Veserv+YF[view] [source] [discussion] 2023-06-13 17:33:00
>>dylan6+Jo
Even if you do not remain in charge, you could institute serious, irrevocable, voluntary liquidated-damages clauses paid out to your customers for any misuse of their data. This would bind any future actions with serious financial penalties. In addition, IANAL, but I think that if the government wished to appropriate the data, there is a good chance it would be required to pay the penalty for making you violate your contract.
8. dylan6+9V[view] [source] [discussion] 2023-06-13 18:26:33
>>boplic+av
That's a pretty cynical take on startup culture, but I guess it deserves it
replies(1): >>Tremen+871
9. Tremen+871[view] [source] [discussion] 2023-06-13 19:11:14
>>dylan6+9V
It's the only truthful take. If you don't take capital, you don't really qualify as part of the "startup culture"; you're just running a small business.
replies(1): >>dylan6+hm1
10. godels+cb1[view] [source] 2023-06-13 19:27:35
>>bentle+(OP)
> You don’t even have to work explicitly in adtech.

I agree, but I don't want anyone to think blame should be shared equally (not that I think you were suggesting this, but others could interpret it that way). I also don't think that being complicit in an act equates to sharing blame: there is accountable complicity and unaccountable complicity. That's why it's important to think about how our work can be abused. It always will be abused, and you'll never be able to anticipate all the creative ways it can be, but that doesn't mean you shouldn't try. Intentional ignorance is accountable complicity.

Everything here is on a spectrum: complicity, blame, abuse, ignorance. We need to be careful about how we discretize the bins and make sure we don't binarize. There is unwilling and unwitting complicity, and that should be accounted for. As you said, a lot depends on how much control you have. But I also think recognition matters: if you realize that the tool will be, or is being, abused, then the complicity is more explicit and accountable.

Things are easy to see post hoc, and we can always think "what if." I hope that, given what you wrote, you don't blame yourself. But now you, and others, have better tools to predict potential abuse. After all, the road to hell is paved with good intentions (the road to heaven is paved with the same stuff, which is why the two are hard to distinguish).

11. dylan6+hm1[view] [source] [discussion] 2023-06-13 20:11:55
>>Tremen+871
I don't buy this at all.

There's a lot more to startup culture than getting investment daddies.
