Journal Articles · AI and Ethics · Year: 2023

A statistical approach to detect disparity prone features in a group fairness setting

Abstract

The use of machine learning models in decision support systems with high societal impact has raised concerns about unfair (disparate) results for different groups of people. When evaluating such unfair decisions, one generally relies on predefined groups determined by a set of features considered sensitive. However, such an approach is subjective and does not guarantee that these features are the only ones to be considered sensitive, nor that they entail unfair (disparate) outcomes. In this paper, we propose a preprocessing step to address the task of automatically recognizing disparity-prone features that does not require a trained model to verify unfair results. Our proposal is based on the Hilbert-Schmidt independence criterion, which measures the statistical dependence of variable distributions. We hypothesize that if the dependence between the label vector and a candidate sensitive feature is high, then the information provided by this feature will entail disparate performance measures between groups. Our empirical results support our hypothesis and show that several features considered sensitive in the literature do not necessarily entail disparate (unfair) results.
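As a rough illustration of the idea described above, the sketch below computes an empirical Hilbert-Schmidt independence criterion (HSIC) score between each candidate feature and the label vector and uses it to rank features by dependence. The Gaussian kernels, median-bandwidth heuristic, and biased trace estimator are assumptions made for illustration; they are not necessarily the exact estimator or kernels used in the paper.

```python
import numpy as np

def rbf_kernel(x, sigma=None):
    """Pairwise Gaussian (RBF) kernel matrix for n samples (1-D or 2-D input)."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    if sigma is None:
        # Median heuristic for the bandwidth (an assumed, common default).
        positive = sq_dists[sq_dists > 0]
        sigma = np.sqrt(0.5 * np.median(positive)) if positive.size else 1.0
    return np.exp(-sq_dists / (2 * sigma ** 2))

def hsic(x, y):
    """Biased empirical HSIC estimate: trace(K H L H) / (n - 1)^2."""
    n = len(x)
    K = rbf_kernel(x)
    L = rbf_kernel(y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return float(np.trace(K @ H @ L @ H)) / (n - 1) ** 2

# Toy usage: a feature strongly dependent on the labels scores much higher
# than an independent noise feature, flagging it as disparity prone.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=200)
feature_dependent = labels + 0.1 * rng.standard_normal(200)
feature_noise = rng.standard_normal(200)
print(hsic(feature_dependent, labels), hsic(feature_noise, labels))
```

In such a screening step, candidate sensitive features whose HSIC score with the labels exceeds a chosen threshold would be flagged for further fairness analysis, without having to train a model first.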
Main file: 2305.06994.pdf (629.47 KB). Origin: Files produced by the author(s)

Dates and versions

hal-04096649, version 1 (13-05-2023)

Licence

Attribution

Identifiers

HAL Id: hal-04096649
DOI: 10.48550/arXiv.2305.06994

Cite

Guilherme Dean Pelegrina, Miguel Couceiro, Leonardo Tomazeli Duarte. A statistical approach to detect disparity prone features in a group fairness setting. AI and Ethics, 2023, ⟨10.48550/arXiv.2305.06994⟩. ⟨hal-04096649⟩