On the information leakage of differentially-private mechanisms
Abstract
Differential privacy aims to protect the privacy of participants in
statistical databases. Roughly, a mechanism satisfies differential privacy if
the presence or value of a single individual in the database does not
significantly change the likelihood of obtaining a certain answer to any
statistical query posed by a data analyst. Differentially-private mechanisms are
often oblivious: first the query is processed on the database to produce a true
answer, and then this answer is adequately randomized before being reported to
the data analyst. Ideally, a mechanism should minimize leakage, i.e., obfuscate
the link between reported answers and individuals' data as much as possible,
while maximizing utility, i.e., report answers that are as close as possible to
the true ones. These two goals, however, are in conflict with each other, thus
imposing a trade-off between privacy and utility.
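To make the oblivious setting concrete, here is a minimal sketch (not taken from the paper) of an oblivious mechanism for a counting query of sensitivity 1, using the standard Laplace mechanism; the function name, the noise scale 1/ε, and the toy databases are illustrative assumptions only.

```python
import numpy as np

def oblivious_mechanism(database, epsilon):
    """Illustrative oblivious epsilon-DP mechanism for a counting query
    (sensitivity 1): first compute the true answer on the database, then
    randomize it with Laplace noise of scale 1/epsilon before reporting."""
    true_answer = sum(database)  # query: how many records have value 1
    return true_answer + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# Neighbouring databases differ in a single individual's record.
print(oblivious_mechanism([1, 0, 1, 1], epsilon=0.5))
print(oblivious_mechanism([1, 0, 1, 0], epsilon=0.5))
```

Reporting the noisy answer instead of the true one is exactly what creates the privacy/utility trade-off discussed above: a larger noise scale hides the individual's record better but degrades the accuracy of the reported answer.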
In this paper we use quantitative information flow principles to analyze leakage
and utility in oblivious differentially-private mechanisms. We introduce a
technique that exploits graph symmetries of the adjacency relation on databases
to derive bounds on the min-entropy leakage of the mechanism. We consider a
notion of utility based on identity gain functions, which is closely related to
min-entropy leakage, and we derive bounds for it. Finally, given some graph
symmetries, we provide a mechanism that maximizes utility while preserving the
required level of differential privacy.
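As a rough illustration of the quantitative-information-flow viewpoint, min-entropy leakage under a uniform prior can be computed directly from the mechanism's channel matrix as the log of the sum of the column maxima; this is a standard formula from the QIF literature, not a bound specific to this paper, and the toy channel below is an assumed example.

```python
import numpy as np

def min_entropy_leakage(channel: np.ndarray) -> float:
    """Min-entropy leakage (in bits) of a channel matrix C[x, y] = P(y | x),
    under a uniform prior on the secret x: log2 of the sum, over observable
    answers y, of the column maxima max_x C[x, y]."""
    return float(np.log2(channel.max(axis=0).sum()))

# Toy channel: 2 secret databases (rows), 3 observable answers (columns).
C = np.array([[0.6, 0.3, 0.1],
              [0.1, 0.3, 0.6]])
print(min_entropy_leakage(C))  # log2(0.6 + 0.3 + 0.6) ≈ 0.585 bits
```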