FAPFID: A Fairness-Aware Approach for Protected Features and Imbalanced Data
Journal article in Transactions on Large-Scale Data- and Knowledge-Centered Systems, 2023


Abstract

The use of automated decision-making based on machine learning algorithms has raised concerns about potential discrimination against minority groups defined by protected features such as gender or race. In particular, in areas with high social impact such as justice, job search, or healthcare, it has been observed that using protected features in machine learning algorithms can lead to unfair decisions that favor one (privileged) group over another (unprivileged) group. To improve fairness in decision-making with regard to protected features, many machine learning approaches focus either on discarding the protected features or on maintaining overall accuracy for both unprivileged and privileged groups. However, these approaches are of limited effectiveness when the protected features are useful for the learning model or when the data are imbalanced. To overcome these limitations, we propose FAPFID, a fairness-aware strategy based on the use of balanced and stable clusters. We divide the input data into stable clusters (subgroups) while ensuring that privileged and unprivileged groups are fairly represented in each cluster. Experiments on three real-world, biased datasets demonstrate that FAPFID outperforms state-of-the-art fairness-aware methods in both performance and fairness scores.
Main file: fapfid_tldks.pdf (408.53 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03995398 , version 1 (18-02-2023)

Licence

Copyright (All rights reserved)

Identifiers

Cite

Ginel Dorleon, Imen Megdiche, Nathalie Bricon-Souf, Olivier Teste. FAPFID: A Fairness-Aware Approach for Protected Features and Imbalanced Data. Transactions on Large-Scale Data- and Knowledge-Centered Systems, 2023, Lecture Notes in Computer Science book series (LNCS), 13840 (TLDKS), pp.107-125. ⟨10.1007/978-3-662-66863-4_5⟩. ⟨hal-03995398⟩

