zlacker

[parent] [thread] 45 comments
1. meowfa+(OP)[view] [source] 2020-04-21 19:11:53
In my opinion, there's no moral issue with doing this. Fighting fraud and other kinds of cybercrime is an endless cat-and-mouse game. Although these techniques have very bad associations, you simply do need to use fingerprinting and supercookies/"zombie cookies"/"evercookies" if you want even a fighting chance.

I think if it's being solely used for such security purposes, isn't shared with or sold to anyone else, and is carefully safeguarded, then it's okay. The main risk I see from it is mission creep leading to it eventually being used for other purposes, like advertising or tracking for "market research" reasons. I don't personally think it's likely Stripe would do this, though.
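
For anyone who hasn't seen it concretely, here's a rough, hypothetical sketch of what "fingerprinting" means in this context (illustrative only, not Stripe's code, and far simpler than real implementations): a handful of high-entropy browser signals get combined and hashed into a stable identifier.

    // Hypothetical sketch: real anti-fraud fingerprints use many more
    // signals (canvas/WebGL rendering, installed fonts, audio stack, etc.).
    async function sketchFingerprint() {
      const signals = [
        navigator.userAgent,
        navigator.language,
        screen.width + 'x' + screen.height + 'x' + screen.colorDepth,
        Intl.DateTimeFormat().resolvedOptions().timeZone,
        String(navigator.hardwareConcurrency),
      ].join('|');
      // Hash the combined signals so the identifier stays stable across
      // visits without shipping the raw values around.
      const digest = await crypto.subtle.digest(
        'SHA-256', new TextEncoder().encode(signals));
      return Array.from(new Uint8Array(digest))
        .map(b => b.toString(16).padStart(2, '0'))
        .join('');
    }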

replies(5): >>mtlync+96 >>mook+07 >>servic+aa >>lucb1e+xu >>meowfa+9U
2. mtlync+96[view] [source] 2020-04-21 19:53:51
>>meowfa+(OP)
> I think if it's being solely used for such security purposes, isn't shared with or sold to anyone else, and is carefully safeguarded, then it's okay. The main risk I see from it is mission creep leading to it eventually being used for other purposes, like advertising or tracking for "market research" reasons. I don't personally think it's likely Stripe would do this, though.

Is this view conditional on the type of data Stripe is currently collecting, or would it apply to any data Stripe collects? Would it still hold if Stripe began recording every keystroke in the app and hooking every XHR request to my backend server, sending itself copies?

I agree that Stripe has a sensible reason for using this data. If I started seeing a high rate of chargebacks, I'd consider enabling Stripe on more parts of my site so that Stripe could consume user behavior earlier on to detect fraud.

My issue is that if there's no agreement about what data Stripe is allowed to collect and what their retention policies are, then the implicit agreement is that Stripe can just collect anything it has access to and hold it forever.

As JavaScript running on my page, Stripe.js has access to basically all user data in my app. There are certain types of user data I would not be comfortable sharing with Stripe, even if it improved fraud detection, so I'd like there to be clear limits on what they're gathering.
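
To make that concrete, here's a hypothetical sketch of what any script included site-wide could do if it wanted to (to be clear: an illustration of the access a third-party script has, not something Stripe.js is known to do):

    // A third-party <script> runs with the page's privileges. It can wrap
    // window.fetch so it sees every request the page makes:
    const realFetch = window.fetch;
    window.fetch = function (...args) {
      // A malicious or over-eager script could forward this anywhere; here
      // it just logs the URL to show what's visible to it.
      console.log('third-party script saw a request to', String(args[0]));
      return realFetch.apply(this, args);
    };

    // It can also read anything in the DOM, e.g. whatever the user has
    // typed into any form field on the page:
    const typedValues = Array.from(document.querySelectorAll('input, textarea'))
      .map(el => el.value);
    console.log('form contents visible to the script:', typedValues);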

replies(1): >>meowfa+67
3. mook+07[view] [source] 2020-04-21 20:00:35
>>meowfa+(OP)
In my opinion, there _is_ a moral issue. Not in that they collect this information for fraud prevention; that seems like a reasonable use for that data. It's in not having informed consent, in not having a clear document describing what is collected and when it is purged. And that document would need to be consumer-facing (since it's not the vendor's behaviour being tracked).

Responding after being caught is… good, but not as good as not needing to be caught.

replies(3): >>pc+s7 >>meowfa+f8 >>jimmas+f9
◧◩
4. meowfa+67[view] [source] [discussion] 2020-04-21 20:01:33
>>mtlync+96
Yes, I would say it's conditional. They should be more explicit about what data they're collecting from users. Opaque enough to not reveal all of the exact techniques, but clear enough so site owners can make an informed decision. (Someone dedicated and experienced enough could probably reverse engineer Stripe.js and figure out everything it's doing if they really wanted, but they're also probably updating it regularly.)
replies(1): >>adamby+yh
◧◩
5. pc+s7[view] [source] [discussion] 2020-04-21 20:04:27
>>mook+07
This is a fair call-out. We have actually worked pretty hard to ensure that our Privacy[1] and Cookies[2] policies are clear and easy to read, rather than filled with endless boilerplate jargon. But we still made a mistake by not having a similarly clear document covering Stripe.js fraud prevention in particular.

[1] https://stripe.com/privacy

[2] https://stripe.com/cookies-policy/legal

replies(1): >>neltne+jq
◧◩
6. meowfa+f8[view] [source] [discussion] 2020-04-21 20:09:53
>>mook+07
That's true. They should give clearer and more explicit information so site owners can make an informed decision, including the difference in what's collected if the script is included on just the checkout page(s) vs. on every page.
◧◩
7. jimmas+f9[view] [source] [discussion] 2020-04-21 20:15:29
>>mook+07
I am so sick of informed consent and cookie and GDPR etc. popups and banners and forms and checkboxes. I could not care less and neither could most people out there. This crap is ruining the internet for no tangible benefit to the inexplicable thunderous applause of people on tech websites. It didn't hurt anyone when Sears collected rewards data for advertising and it never hurt anyone when web companies used data from user interaction. A simple static webpage is going to end up impossible for anyone but a megacorp to run legally if we keep going down this nonsensical path.

Imagine I mailed you an unsolicited letter and you were legally required to burn it and never say or benefit from what was inside just because I said so. That's the insanity of these "privacy" laws.

replies(2): >>litera+Pc >>meowfa+Ld
8. servic+aa[view] [source] 2020-04-21 20:21:17
>>meowfa+(OP)
Why are 'evercookies' necessary?

My browser is set up to record no history or cookies.

It can be annoying to always have to dismiss the same popups you've dismissed before, but I've never had any issues with online payments or unnecessary captchas, including with Stripe.

What am I missing?

replies(2): >>meowfa+vd >>brongo+2s
◧◩◪
9. litera+Pc[view] [source] [discussion] 2020-04-21 20:42:44
>>jimmas+f9
Or you could just not collect information you don't need? You don't have to ask consent if you just don't do it, you know. The pop-ups are annoying because the website owners want you to just click through. Ever seen one of those where you have to uncheck every single box? Yep, those violate the GDPR. The default setting should be no advertising or other bullshit data, and opt-in if you want it. Which no one ever does. Hence the violations. Get mad at the manipulative ad companies, not the people who for once produced an OK piece of regulation.
replies(2): >>renewi+lf >>jimmas+pg
◧◩
10. meowfa+vd[view] [source] [discussion] 2020-04-21 20:48:01
>>servic+aa
Because fraudsters' browsers/clients/scripts are also set up to record no history or cookies, and otherwise evade detection/categorization as much as possible. Somewhat ironically, in order for them to accurately distinguish between privacy-conscious users like yourself and actual criminals, and to block criminals from making a purchase while not incorrectly blocking you, they need to collect additional data.
replies(2): >>yjftsj+cq >>servic+TJ
◧◩◪
11. meowfa+Ld[view] [source] [discussion] 2020-04-21 20:49:53
>>jimmas+f9
I agree with you in general, but this is a big step up. This is essentially the most invasive, intrusive technology that can possibly be deployed on the web - because fraudsters (and other cybercriminals) use the most tricky, dynamic evasion techniques.

And this is regarding website owners adding a script that may run on every page of their site; the consent is for the website owners who are using Stripe and deciding how/if to add their script to their pages.

replies(1): >>matz1+Ns
◧◩◪◨
12. renewi+lf[view] [source] [discussion] 2020-04-21 20:59:03
>>litera+Pc
He's not the guy who is collecting data. He's the guy whose data is being collected. And I agree with him. True choice is not imposing this cost on everyone. Let me set it in my browser. Then I'll consent to practically everything and you can consent to nothing. And since it's set at your user agent you can synchronize that across devices easily.

If I never see another damned cookie popup I'd be thrilled.

replies(2): >>meowfa+og >>Nextgr+SI
◧◩◪◨⬒
13. meowfa+og[view] [source] [discussion] 2020-04-21 21:05:41
>>renewi+lf
The cookie law is just insane to me. GDPR, or at least the parts that are commonly talked about, seems a lot more reasonable: a user should be able to request what data is being collected about them, and should be able to request a full account deletion, including deletion of all data collected from or about them (perhaps minus technical things that are very difficult to purge, like raw web server access logs).
replies(1): >>renewi+Zg
◧◩◪◨
14. jimmas+pg[view] [source] [discussion] 2020-04-21 21:05:42
>>litera+Pc
How do you expect the web to be funded without this advertiser data? People won't pay for every single website.
replies(2): >>adamby+7j >>mkolod+qj
◧◩◪◨⬒⬓
15. renewi+Zg[view] [source] [discussion] 2020-04-21 21:10:03
>>meowfa+og
> a user should be able to request what data is being collected about them, and should be able to request a full account deletion, including deletion of all data collected from or about them (perhaps minus technical things that are very difficult to purge, like raw web server access logs)

I think I'd find it very easy to like this. Honestly, these aspects of GDPR are great. Things I don't like:

* Not allowed to do "no service without data"

* Consent must be opt-in

Bloody exasperating as a user. If only they'd let me set it in my user agent. But the browser guys just sit there like fools pontificating on third-party cookies instead of innovating for once and putting the opt-in / opt-out in the browser.

replies(2): >>mcpeep+Eo >>icebra+ix
◧◩◪
16. adamby+yh[view] [source] [discussion] 2020-04-21 21:14:37
>>meowfa+67
There's no need to rely on "security by obscurity". Stripe.js is just a thin client-side library everybody can analyse, so they might as well be fully transparent when it comes to data collection. I don't see the point of trying to obfuscate things, especially since the actual fraud-detection model works on the backend anyway.
replies(1): >>meowfa+Vk
◧◩◪◨⬒
17. adamby+7j[view] [source] [discussion] 2020-04-21 21:25:44
>>jimmas+pg
Interesting question, since the web used to exist and work just fine before online advertising. I'm not saying that we should go back in time, but claiming that ads are a requirement for the web to exist is a slight overstatement.
replies(1): >>joshua+4S2
◧◩◪◨⬒
18. mkolod+qj[view] [source] [discussion] 2020-04-21 21:29:12
>>jimmas+pg
The same way businesses have always been funded - by selling things people think are worth buying.
replies(3): >>jimmas+2l >>rafi_k+Ll >>nl+RA
◧◩◪◨
19. meowfa+Vk[view] [source] [discussion] 2020-04-21 21:39:40
>>adamby+yh
With these kinds of adversarial things, I think it's a mix of frontend and backend.

It's a library everyone can technically analyze, yes, but by 1) using ever-changing obfuscation that requires a lot of work to RE, and 2) constantly changing the client-side logic itself, it makes the work of the adversaries a lot harder and more tedious, and means either fewer of them will consistently succeed, or more of them will be forced to become more centralized around solutions/services that've successfully solved it, which means Stripe can focus-fire their efforts a bit more.

Of course there's also a lot going on on the backend that'll never be seen, but the adversary is trying to mimic a legitimate user as much as they can, so if the JavaScript is totally unobfuscated and stays the same for a while, it's a lot easier for them to consistently trace exactly what data is being sent and compare it against what their system or altered browser is sending.

It's cat-and-mouse across many dimensions. In such adversarial games, obscurity actually can and often does add some security. "Security by obscurity is no security at all" isn't exactly a fallacy, but it is a fallacy to apply it universally and with a very liberal definition of "security". It's generally meant for things that are more formal or provable, like an encryption or hashing algorithm or other cryptography. It's still totally reasonable to use obscurity as a minor practical measure. I'd agree with this part of https://en.wikipedia.org/wiki/Security_through_obscurity: "Knowledge of how the system is built differs from concealment and camouflage. The efficacy of obscurity in operations security depends on whether the obscurity lives on top of other good security practices, or if it is being used alone. When used as an independent layer, obscurity is considered a valid security tool."

For example, configuring your web server to not display its version on headers or pages is "security by obscurity", and certainly will not save you if you're running a vulnerable version, but may buy you some time if a 0-day comes out for your version and people search Shodan for the vulnerable version numbers - your site won't appear in the list. These kinds of obscurity measures of course never guarantee security and should be the very last line of defense in front of true security measures, but they can still potentially help you a little.

In the "malware vs. anti-virus" and "game cheat vs. game cheat detection software" fights that play out every day, both sides of each heavily obfuscate their code and the actions they perform. No, this never ensures it won't be fully reverse engineered. And the developers all know that. Given enough time and dedication, it'll eventually happen. But it requires more time and effort, and each time it's altered, it requires a re-investment of that time and effort.

Obfuscation and obscurity are arguably the defining feature and "value proposition" of each of those four types of software. A lot of that remains totally hidden on the backend (e.g. a botnet C2 web server only responding with malware binaries if it analyzes the connection and believes it really is a regular infected computer and not a security researcher or sandbox), but a lot is also present in the client.

replies(2): >>adamby+Ss >>Comput+RN
◧◩◪◨⬒⬓
20. jimmas+2l[view] [source] [discussion] 2020-04-21 21:40:26
>>mkolod+qj
Reddit, for example, has nothing to sell me directly unless it went subscription-based, which is a nonstarter. There's no other model for sites like that besides maybe browser-based crypto mining.
replies(3): >>supert+wq >>adamby+2r >>meowfa+K92
◧◩◪◨⬒⬓
21. rafi_k+Ll[view] [source] [discussion] 2020-04-21 21:45:48
>>mkolod+qj
That's not going to work for plenty of services. Most people (if not everyone) are not going to pay for search, social networking, instant messaging, maps, mail, etc.
replies(2): >>supert+Hq >>Silhou+St
◧◩◪◨⬒⬓⬔
22. mcpeep+Eo[view] [source] [discussion] 2020-04-21 22:04:10
>>renewi+Zg
Is this not what the DNT (Do Not Track) header was attempting to achieve before it was essentially abandoned (after being implemented in all major browsers)? Genuinely curious what sort of user agent approach you're looking for.
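
(For reference, the DNT signal is still exposed by some browsers; a site that wanted to honour a user-agent-level preference could, in principle, check it before loading any tracking code. A minimal sketch, with values and support varying by browser:)

    const dnt = navigator.doNotTrack || window.doNotTrack;
    if (dnt === '1') {
      console.log('user agent signalled an opt-out; skip tracking scripts');
    } else {
      // only in this branch would analytics/advertising code be loaded
    }
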
replies(1): >>renewi+Yv
◧◩◪
23. yjftsj+cq[view] [source] [discussion] 2020-04-21 22:17:00
>>meowfa+vd
> Because fraudsters' browsers/clients/scripts are also set up to record no history or cookies, and otherwise evade detection/categorization as much as possible

Ah, right, bad guys use privacy-enhancing tech, so we'd better undermine it, even if it screws over legitimate users. You know what fraudsters also tend to use? Chrome. Let's block that, shall we?

◧◩◪
24. neltne+jq[view] [source] [discussion] 2020-04-21 22:18:08
>>pc+s7
Could you explain in plain language how this is different from, or the same as, what a credit card company does?

My outsider understanding was that credit card companies happily sell your purchase history, or at least aggregate it for marketing, in addition to using a model of your purchase history to predict whether a purchase is fraudulent.

replies(1): >>varenc+hB
◧◩◪◨⬒⬓⬔
25. supert+wq[view] [source] [discussion] 2020-04-21 22:20:25
>>jimmas+2l
It wouldn't be a non-starter if no other site could do the same thing without also charging for a subscription. Services like Facebook, Reddit, and Instagram all provide a service that many people find valuable. Let people pay for it.
◧◩◪◨⬒⬓⬔
26. supert+Hq[view] [source] [discussion] 2020-04-21 22:22:01
>>rafi_k+Ll
Why would they not? If someone wants to be able to use a social network, do you really think they wouldn't pay $5/month for something they use as much as, if not more than, Netflix? You can't do it now because other services can undercut you by relying on advertising, but there's no reason it couldn't be the standard.
◧◩◪◨⬒⬓⬔
27. adamby+2r[view] [source] [discussion] 2020-04-21 22:24:43
>>jimmas+2l
Reddit can sell you virtual coins: https://www.reddit.com/premium
◧◩
28. brongo+2s[view] [source] [discussion] 2020-04-21 22:32:32
>>servic+aa
Fastmail account recovery keeps an "evercookie" which is "first time account X successfully logged in from this device" which allows us to identify that you're using a device with a long history with the account when trying to recover your account after it was stolen. Obviously we don't want to re-authenticate somebody who first logged in yesterday, because that's probably the thief - but if your computer has been used successfully to log in for the past few years, then it's more likely that the recovery attempt is coming from you (obviously, that's still just one of many things we're checking for).
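
For illustration, the shape of that check might look roughly like this (a hypothetical Express-style sketch, not Fastmail's actual implementation; it assumes the cookie-parser middleware, and a real version would sign the cookie value so it can't be forged):

    const YEAR_MS = 365 * 24 * 60 * 60 * 1000;

    // On every successful login, remember when this device first logged
    // in to this account.
    function rememberDevice(req, res, accountId) {
      const name = 'first_login_' + accountId;
      if (!req.cookies[name]) {
        res.cookie(name, String(Date.now()), {
          maxAge: 10 * YEAR_MS, httpOnly: true, secure: true, sameSite: 'lax',
        });
      }
    }

    // During account recovery, a device first seen years ago is a much
    // stronger signal than one that appeared yesterday (likely the thief).
    function deviceHistorySignal(req, accountId) {
      const first = Number(req.cookies['first_login_' + accountId]);
      if (!first) return 'unknown_device';
      return Date.now() - first > YEAR_MS ? 'long_history' : 'recent_device';
    }
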
replies(1): >>dylz+672
◧◩◪◨
29. matz1+Ns[view] [source] [discussion] 2020-04-21 22:39:33
>>meowfa+Ld
In the end it's simply preference: I'm fine with it, you are not. It then boils down to who can force the other to follow.
◧◩◪◨⬒
30. adamby+Ss[view] [source] [discussion] 2020-04-21 22:40:27
>>meowfa+Vk
Thanks for a thoughtful reply (upvoted), but have you looked at the library in question? The code is minified, but there is not much obfuscation going on: https://js.stripe.com/v3/

Most of your examples are quite low-level, but it's much harder to keep things hidden within the constraints of the browser sandbox when you have to interface with standard APIs which can be easily instrumented.
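
As a quick illustration of what "instrumented" means here (a generic dev-console technique, nothing specific to Stripe.js): you can wrap the standard APIs a fingerprinting script would have to go through and log every access, stack trace included.

    // Log any canvas read-back, a common fingerprinting primitive.
    const realToDataURL = HTMLCanvasElement.prototype.toDataURL;
    HTMLCanvasElement.prototype.toDataURL = function (...args) {
      console.log('canvas read via toDataURL\n', new Error().stack);
      return realToDataURL.apply(this, args);
    };

    // Same idea for property reads, e.g. navigator.plugins.
    const pluginsDesc =
      Object.getOwnPropertyDescriptor(Navigator.prototype, 'plugins');
    Object.defineProperty(Navigator.prototype, 'plugins', {
      get() {
        console.log('navigator.plugins was read\n', new Error().stack);
        return pluginsDesc.get.call(this);
      },
    });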

replies(1): >>meowfa+JT
◧◩◪◨⬒⬓⬔
31. Silhou+St[view] [source] [discussion] 2020-04-21 22:52:04
>>rafi_k+Ll
That seems like quite a big assumption. Younger generations today think nothing of spending $xx/month on their phone/data plans and another $x/month on each of Netflix/Spotify/etc. It's not hard to imagine the same people paying real money for social networking sites they value. Search could obviously still do advertising even without any personal data mining, since it knows exactly what you're interested in at that particular moment. Useful informational sites could run ads without the privacy invasion and tracking as well, since they also are aimed at specific target audiences. Plenty more sites would continue to run without a (direct) goal of revenue generation anyway; I see no ads on the free-to-use discussion forum that we're all reading right now.

This idea that the only viable business model on the web is spyware-backed advertising is baloney, and it always has been. There is little reason to assume the Web is a better place because the likes of Google and Facebook have led us down this path, nor that anything of value would be lost if they were prohibited from continuing in the same way.

32. lucb1e+xu[view] [source] 2020-04-21 22:57:20
>>meowfa+(OP)
A blanket statement like "we need privacy invasion to have any chance against fraud; it cannot be done without, period", with no argument for why fraud requires this, isn't very constructive.

For example, in my experience: user pays, website gets money, website releases product. It's the user that could be defrauded, not the website. I've never heard of fraud issues from the website owners' perspective in the Netherlands, where credit cards are just not the main way to pay online. Fraudulent Paypal chargebacks, sure, but iDeal.nl or a regular SEPA transfer just doesn't have chargebacks. It would appear that there is a way to solve this without tracking.

◧◩◪◨⬒⬓⬔⧯
33. renewi+Yv[view] [source] [discussion] 2020-04-21 23:07:10
>>mcpeep+Eo
Actually, I've changed my mind. I think people fall into one of two camps: the advertiser+publisher camp, who don't want this in the browser chrome because it would make it too easy to fully opt out, and the browser makers, who don't want it there because they actually just want the advertisers to die out. What I'm asking for is not a stable equilibrium in any way, so it's a pointless thought experiment.
◧◩◪◨⬒⬓⬔
34. icebra+ix[view] [source] [discussion] 2020-04-21 23:22:10
>>renewi+Zg
How is the first exasperating you as a user?
replies(1): >>renewi+jA
◧◩◪◨⬒⬓⬔⧯
35. renewi+jA[view] [source] [discussion] 2020-04-21 23:51:07
>>icebra+ix
Pretty sure I'd get an option of "X more free articles if you give us your data to sell". Not getting that is annoying, because I was fine with giving away my data for articles.
◧◩◪◨⬒⬓
36. nl+RA[view] [source] [discussion] 2020-04-21 23:56:40
>>mkolod+qj
Media businesses have been funded by advertising for hundreds of years (since the start of regular newspapers in the 1600s at least)[1]. Many internet businesses are more like media businesses than shops.

[1] https://en.wikipedia.org/wiki/History_of_advertising#16th%E2...

◧◩◪◨
37. varenc+hB[view] [source] [discussion] 2020-04-22 00:01:03
>>neltne+jq
Stripe’s very readable privacy policy makes a clear statement on this:

> Stripe does not sell or rent Personal Data to marketers or unaffiliated third parties. We share your Personal Data with trusted entities, as outlined below.

From that and my reading of the rest, I think the answer is clearly no. Also, I doubt the data on consumer purchases from Stripe-integrated websites is even that valuable to begin with, at least compared to Stripe's margins.

◧◩◪◨⬒
38. Nextgr+SI[view] [source] [discussion] 2020-04-22 01:24:14
>>renewi+lf
The problem is that imposing the unsafe choice (i.e. tracking being on by default) puts people who'd rather opt out at risk (because their data is being leaked), while the current situation is merely an annoyance to people who are happy to opt in.

As far as the cookie popups go, the majority of them are not actually GDPR compliant. Tracking should be off by default and consent should be freely given, which means it should be just as easy to opt in as it is to opt out. If it's more difficult to say no than yes, then the consent is invalid, and they might as well just do away with the prompt completely since they're breaking the regulation either way.

◧◩◪
39. servic+TJ[view] [source] [discussion] 2020-04-22 01:37:24
>>meowfa+vd
Right, but I'm saying my setup is no different from the 'fraudsters' you describe, yet I have a seamless shopping experience online.

If I'm able to shop online without issues, why does everyone else 'need' an evercookie?

I'm sure it's helpful; it's the idea that it's necessary that I take issue with.

replies(1): >>floati+B71
◧◩◪◨⬒
40. Comput+RN[view] [source] [discussion] 2020-04-22 02:24:25
>>meowfa+Vk
I’ve implemented my own Stripe checkout for a native application in just a couple of hours, using their REST API. There’s nothing stopping anyone else from doing the same: it’s literally how you used to integrate with payment gateways before Stripe came along. No one gave you a JS library to use on your website.
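
For a sense of what that looks like, a basic server-side call against Stripe's REST API is roughly this shape (a sketch based on their public docs, using Node 18+'s built-in fetch; treat the details as illustrative rather than a drop-in integration):

    // Create a PaymentIntent with a plain form-encoded POST.
    async function createPaymentIntent() {
      const body = new URLSearchParams({
        amount: '1999',        // smallest currency unit, i.e. $19.99
        currency: 'usd',
      });
      const res = await fetch('https://api.stripe.com/v1/payment_intents', {
        method: 'POST',
        headers: {
          'Authorization': 'Bearer ' + process.env.STRIPE_SECRET_KEY,
          'Content-Type': 'application/x-www-form-urlencoded',
        },
        body,
      });
      return res.json();       // includes id, client_secret, status, ...
    }
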
◧◩◪◨⬒⬓
41. meowfa+JT[view] [source] [discussion] 2020-04-22 03:22:46
>>adamby+Ss
Yeah, theirs is far less obfuscated than most fraud/bot detection libraries I've seen. I believe almost all of the JS code I've seen from companies that primarily do fraud detection and web security is pretty heavily obfuscated. Here, it looks like Stripe.js is doing much more than just the fraud stuff - this is their client library for everything, including payment handling.

I haven't analyzed it and can't say this with any certainty, but my guess is that you're probably right: they're focusing primarily on backend analysis and ML comparing activity across a massive array of customers. This is different from smaller security firms who have a lot less data due to fewer customers, and a kind of sampling bias of customers who are particularly worried about or inundated by fraud.

They may be less interested in suspicious activity or fingerprinting at the device level and more interested in it at the payment and personal information level (which is suggested by articles like https://stripe.com/radar/guide).

Pure, uninformed speculation, but it's possible that if they get deeper into anti-fraud in the future (perhaps if fraudsters get smarter about this higher layer of evasion), they might supplement the data science / finance / payment oriented stuff with more lower-level device and browser analysis, in which case I wouldn't be surprised if they eventually separate out some of the anti-fraud/security parts into an obfuscated portion. (Or, more likely, have Stripe.js load that portion dynamically. Maybe they're already doing this, even? Dunno.)

42. meowfa+9U[view] [source] 2020-04-22 03:26:01
>>meowfa+(OP)
Addendum: I have no clue if they actually are using fingerprinting or supercookies. I just know many anti-fraud service providers do.
◧◩◪◨
43. floati+B71[view] [source] [discussion] 2020-04-22 05:44:50
>>servic+TJ
If you don't commit fraud, the only two issues you'll see are that:

1) a small subset of sites will refuse to complete the transaction, as their anti-fraud thresholds are set to deny likely-fraudulent browsers such as yours; and,

2) you will be much more easily fingerprinted and tracked online due to your specific combination of extremely uncommon non-default settings in your browser (which may well mitigate #1 if you're originating from a residential IP address).

If you purchase high-value audio gear or clothing or gift cards (basically, high-value things that can be resold on eBay immediately), you may find your transaction held for review while someone phones you to prove that you're you, but for everyday Amazon-style purchasing it's unlikely to matter at all.

◧◩◪
44. dylz+672[view] [source] [discussion] 2020-04-22 15:16:51
>>brongo+2s
Do they really? "Evercookie" generally has a specific definition, where the application attempts to persist that chunk of data in heavy-handed, abusive, malware-like ways, repopulating it with the same token on removal when possible; it's usually used in fingerprinting contexts - it isn't just a normal HTTP cookie with the expiration date set years out.
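
To illustrate the distinction, a bare-bones version of that repopulation behaviour looks something like this (real evercookie implementations use far more stores: IndexedDB, cache/ETag tricks, Flash LSOs back in the day, and so on):

    // The same token is written to several storage mechanisms...
    function persistEverywhere(token) {
      document.cookie = 'evertoken=' + token + '; max-age=' + (10 * 365 * 86400);
      localStorage.setItem('evertoken', token);
      window.name = token;   // window.name survives same-tab navigations
    }

    // ...and on each visit, whichever copies the user deleted are quietly
    // restored from the ones that survived.
    function recoverToken() {
      const fromCookie = (document.cookie.match(/(?:^|; )evertoken=([^;]+)/) || [])[1];
      const token = fromCookie || localStorage.getItem('evertoken')
        || window.name || crypto.randomUUID();
      persistEverywhere(token);
      return token;
    }
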
◧◩◪◨⬒⬓⬔
45. meowfa+K92[view] [source] [discussion] 2020-04-22 15:30:37
>>jimmas+2l
I wouldn't be surprised if a high percentage of reddit users use something like uBlock. I think universal ad blockers are going to slowly become more ubiquitous over time, too.

People have been trying to find ways to skip TV commercials for decades. It's going to be the same with ads. When it comes to our own personal devices, advertisers can't really win in the end. They're going to have to stick to things like billboards and other things put up in cities, but even those are being protested and banned in many places.

In theory, what about reddit can't be decentralized? All it stores is text and URLs to other content. There isn't all that much actual processing or computation going on, as far as I know, besides some rank calculation stuff. Am I wrong about this?

In that case, it comes down to figuring out how to pay the developers and some kind of election process for admins. But with a site with hundreds of millions of monthly active users, surely they'd be able to figure something out. Like each user who donates $10 or more gets a little perk.

And even without decentralization, micropayments and premium perks are already a much more promising model. Lots of people are buying reddit's silver/gold/platinum and a bunch of other awards. Tinder is free by default and manages to make loads of money without showing any ads. I don't think ads are going to be a sustainable model 10, 20, 50 years from now. I think service providers are just going to have to figure out ways to provide value to users in exchange for money, like most "meatspace" companies do.

◧◩◪◨⬒⬓
46. joshua+4S2[view] [source] [discussion] 2020-04-22 19:37:25
>>adamby+7j
> Interesting question, since the web used to exist and work just fine before online advertising.

For what definition of "work"? There were static informational pages and... not much else. Content that requires upkeep requires revenue, which usually means either ads or access fees.

[go to top]