
3 comments
1. Charle+(OP) 2022-08-14 23:55:36
> CSAM [scanning] was a blatant attempt at breaching user privacy, with classification against hidden third-party criteria.

I feel like there's a good amount of FUD about this, so for anyone who might not know: essentially all major online file hosts match uploads against hashes of known CSAM images, regardless of the client OS you're using. In Apple's case specifically, matching happens only on images you've uploaded to iCloud Photos.¹

¹ https://www.apple.com/child-safety/pdf/Expanded_Protections_...
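To make "matching" concrete, here's a minimal sketch of what a host-side check amounts to; this is my own illustration, not Apple's actual pipeline. Real systems use perceptual hashes (e.g. PhotoDNA, or Apple's NeuralHash) that survive resizing and re-encoding; the SHA-256 call and the database entries below are stand-ins.

    import hashlib

    # Hypothetical database of hashes of known CSAM images, supplied by a
    # clearinghouse such as NCMEC. These entries are placeholders.
    KNOWN_HASHES = {"placeholder-hash-1", "placeholder-hash-2"}

    def image_hash(data: bytes) -> str:
        # Stand-in for a perceptual hash; real systems don't use SHA-256,
        # since it breaks on any re-encoding of the image.
        return hashlib.sha256(data).hexdigest()

    def matches_known_image(uploaded: bytes) -> bool:
        # The host compares against a fixed list of known images; nothing
        # here classifies or interprets novel image content.
        return image_hash(uploaded) in KNOWN_HASHES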

replies(1): >>nbzso+v
2. nbzso+v 2022-08-15 00:00:55
>>Charle+(OP)
I trust the experts on this.

Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life.

"That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change."

https://www.eff.org/deeplinks/2021/08/apples-plan-think-diff...
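The structural point the EFF is making is easy to see in code. A toy sketch (my own names, nothing from Apple's design): the matching machinery is content-agnostic, so the scanning scope is set entirely by whoever controls the hash list.

    def scan(library, hash_db, perceptual_hash):
        # Nothing in this loop knows what the hashes represent; pointing it
        # at a different hash_db (the "slightest change") requires no code
        # change at all.
        return [img for img in library if perceptual_hash(img) in hash_db]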

replies(2): >>lttlrc+O8 >>simonh+9J
3. lttlrc+O8 2022-08-15 01:26:26
>>nbzso+v
The point is that Apple was already reacting to external pressure. They don't make up the rules. They were attempting to be open about it, and to give users an option to avoid it (don't use iCloud).

It's great that you've found an alternative that suits you, but I think it's disingenuous to argue that Apple is the culprit.

Talk to your representative.

4. simonh+9J 2022-08-15 08:25:52
>>nbzso+v
My next-door neighbour owning a sledgehammer is a "fully built system just waiting" to bash my door down, but I don't lose any sleep over it.