Preprints, Working Papers, ... Year: 2022

Differential Privacy has Bounded Impact on Fairness in Classification

Abstract

We theoretically study the impact of differential privacy on fairness in classification. We prove that, given a class of models, popular group fairness measures are pointwise Lipschitz-continuous with respect to the parameters of the model. This result is a consequence of a more general statement on accuracy conditioned on an arbitrary event (such as membership in a sensitive group), which may be of independent interest. We use the aforementioned Lipschitz property to prove a high-probability bound showing that, given enough examples, the fairness level of private models is close to that of their non-private counterparts.
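
As a schematic illustration of the statement above (the symbols F, theta, and L(theta) are introduced here for exposition and are not taken verbatim from the paper): writing F for a group fairness measure and theta for the parameters of a model in the given class, a pointwise Lipschitz property has the form

\[
|F(\theta') - F(\theta)| \;\le\; L(\theta)\,\|\theta' - \theta\|,
\]

so if a differentially private training procedure outputs parameters \(\theta_{\mathrm{priv}}\) that, with high probability, lie within distance \(\delta\) of their non-private counterpart \(\theta\), then \(|F(\theta_{\mathrm{priv}}) - F(\theta)| \le L(\theta)\,\delta\) with the same probability. This is, in outline, how a Lipschitz property of this kind can yield the high-probability bound announced in the abstract.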
Main file: paper.pdf (839.21 KB). Origin: files produced by the author(s).

Dates and versions

hal-03902203, version 1 (27-01-2023)
hal-03902203, version 2 (18-09-2023)

Identifiers

HAL Id: hal-03902203

Cite

Paul Mangold, Michaël Perrot, Aurélien Bellet, Marc Tommasi. Differential Privacy has Bounded Impact on Fairness in Classification. 2022. ⟨hal-03902203v1⟩