zlacker

[return to "Hack Club: A story in three acts (a.k.a., the shit sandwich)"]
1. blende+Sb[view] [source] 2025-11-13 13:01:00
>>alexkr+(OP)
Wow! Just wow! Just as I think the situation cannot get any worse, the OP reveals even worse things going on. I know the UX of this blog and the lack of capitalization are going to turn many people off, but I urge you to power through and read the whole OP anyway.

Use reader mode, block JavaScript, or do whatever it takes. Give the author a break: they're a teenager. What kind of websites were you making as a teenager? I'm sure one of those dark-background websites with MARQUEEs and BLINKs and glaring contrast colors! So give them a break. Behind the annoying UX is an article about serious and appalling privacy and security issues.

Like read this:

> i raised this with chris, who's a full-time staff member (not a teenager), and he insisted that exposing physical addresses and sensitive info was "just a vuln" not a breach. said he's "never heard the term 'data breach' used that way" and... also relied on chatgpt instead of actual legal advice.

Actually, this Chris guy has a point. I wouldn't call it a breach either; it's a PII exposure, though a serious one. So I don't 100% agree with the OP, but the cavalier attitude towards security coming from the staff of a legitimate organization is appalling.

It's just mind-boggling that an organization handling PII has such appalling privacy and security lapses, yet remains arrogantly indignant about it, making bold claims about laws they don't understand. Why? Because ChatGPT told them so? The cherry on top is that they employ teenagers to answer legal questions! Not kidding! Just read the OP! Unbelievable!

2. SigmaE+f12[view] [source] 2025-11-13 22:20:37
>>blende+Sb
Hello, Chris here!

Nobody—certainly not any adult staff—at Hack Club relied on ChatGPT for legal advice. Nor do we employ teenagers to answer legal questions; we have actual legal counsel for that! Or in my personal case I ask my wife, who is a law professor, and then she asks ChatGPT (just kidding).

There is too much nonsense in this post to rebut line by line, and these conversations have all been had to death within Hack Club (we put a lot of time into transparently and publicly discussing our programs, problems, and decisions). Here's the short version of this saga:

- The author found a serious vuln in one of our programs introduced by a junior engineer

- We take vulns seriously—especially the serious ones! It was fixed immediately by a senior engineer upon report (within a day?)

- The author insisted that their test of the vuln to access their own address was a data breach, therefore obligating us to notify all 5,000 participants of this "breach" as per GDPR

- We judged this to be prima facie incorrect. A lawyer has since confirmed that judgment.

- It is, in fact, bad practice to notify users about every vulnerability. If this were the norm, you would be inundated with notices from practically every software product you interact with. Almost all of those notices would be non-actionable by the user, and they would drown out the few breach notices that are actionable. There is a good reason the GDPR does not demand notice for vulnerabilities; mass notices are reserved for incidents where a meaningful amount of user data is known to have been exfiltrated.

- The author was ultimately banned from the community not for their opinions on this matter, but because of a long streak of unrelated conduct issues that culminated in a spree of saying horribly abusive things to multiple other members of the community.

- They have been pursuing a grudge against the organization ever since. They are not a reliable narrator; this post is a fantasy version of events that casts them as a martyred hero.

Hack Club is an oddly-shaped organization with operations that often raise very real security concerns, but these are wrapped up in a complex web of tradeoffs that are very much still evolving as we refine and expand our core infrastructure. We are not Google, and it is a mistake to import reasoning from that kind of environment when analyzing our security/threat model.

Nonetheless, privacy and security are things we think about and invest in extensively. In the past year we have started an organization-wide bounty system, moved all PII storage into a central "identity vault", and consulted extensively with a very fancy lawyer who specializes in corporate compliance with the growing raft of online privacy laws around the world.

The good news is, according to that lawyer we already do almost everything we need to do to be compliant; we just need to publish a privacy policy! We are actively iterating on a mostly-finished draft of that document with our counsel, but it is taking time because, well, this stuff is very complicated. We serve or have served teenagers in almost every country, and the GDPR is just the most prominent of many privacy laws now on the books worldwide.

3. Throwa+Vd2[view] [source] 2025-11-13 23:48:12
>>SigmaE+f12
So was kids' data exposed or no?
4. SigmaE+Ij2[view] [source] 2025-11-14 00:45:11
>>Throwa+Vd2
The short answer is no.