1. shusso+(OP) 2019-07-24 12:00:27
Thanks for the links. I'm still a little confused about how differential privacy applies to non-aggregated fields: can differentially private algorithms also be used to mask/anonymise them?
2. majos+d1 2019-07-24 12:11:11
>>shusso+(OP)
You could, but if your statistic is a function of a single person's data, differential privacy forces you to add enough noise to mask that person's entire contribution, which destroys almost all of the statistic's utility.
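
To make that concrete, here's a toy sketch (mine, not from the thread) of the standard Laplace mechanism applied to a "statistic" that is just one person's value. The salary figure, range, and epsilon are illustrative assumptions; the point is that the noise scale matches the whole signal.

    import numpy as np

    rng = np.random.default_rng(0)

    # One person's value; the "statistic" is just that value.
    true_value = 52_000.0    # hypothetical salary
    sensitivity = 100_000.0  # assume salaries lie in [0, 100k]; changing one
                             # person can move the statistic by the full range
    epsilon = 1.0            # a typical privacy budget

    # Laplace mechanism: add Laplace noise with scale = sensitivity / epsilon.
    noisy_value = true_value + rng.laplace(scale=sensitivity / epsilon)
    print(noisy_value)  # swings over roughly +/- 100k, swamping the true value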

It's possible to learn something by aggregating a bunch of those individually-privatized statistics. Randomized response [1] is a canonical example. More generally, local differential privacy is a stronger privacy model where users privatize their own data before releasing it for (arbitrary) analysis. As you might expect, the stronger privacy guarantee means worse utility, sometimes much worse [2].
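
As a toy sketch of randomized response (mine, not from [1]; the 30% base rate and fair coins are illustrative), each respondent randomizes their own answer, and the analyst debiases the aggregate:

    import numpy as np

    rng = np.random.default_rng(0)

    def randomized_response(truth: bool) -> bool:
        # First fair coin: heads -> answer honestly.
        if rng.random() < 0.5:
            return bool(truth)
        # Tails -> answer with a second fair coin, independent of the truth.
        # This variant is ln(3)-differentially private for each individual.
        return rng.random() < 0.5

    # Hypothetical population: 30% hold the sensitive attribute.
    n = 100_000
    truths = rng.random(n) < 0.30
    reports = np.array([randomized_response(t) for t in truths])

    # Each report is truthful with probability 3/4, so
    # E[report] = (3/4) * p + (1/4) * (1 - p) = 1/4 + p/2,
    # and p_hat = 2 * mean(report) - 1/2 is an unbiased estimate of p.
    p_hat = 2 * reports.mean() - 0.5
    print(p_hat)  # close to 0.30, even though each single report is deniable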

[1] https://en.wikipedia.org/wiki/Randomized_response
