no code implementations • 5 Feb 2024 • Vitalii Emelianov, Michaël Perrot
We theoretically study how differential privacy interacts with both individual and group fairness in binary linear classification.
no code implementations • 10 Dec 2021 • Vitalii Emelianov, Nicolas Gast, Krishna P. Gummadi, Patrick Loiseau
In the second setting (with known variances), imposing the $\gamma$-rule decreases the utility, but we prove a bound on the utility loss due to the fairness mechanism.
no code implementations • 24 Jun 2020 • Vitalii Emelianov, Nicolas Gast, Krishna P. Gummadi, Patrick Loiseau
We then compare the utility obtained by imposing a fairness mechanism that we term the $\gamma$-rule (it includes demographic parity and the four-fifths rule as special cases) to that of a group-oblivious selection algorithm, which picks the candidates with the highest estimated quality independently of their group.
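The two selection rules above can be sketched in a few lines. This is an illustrative implementation under assumed definitions, not the paper's algorithm: the $\gamma$-rule is taken to require that the smaller group's selection rate be at least $\gamma$ times the larger group's (so $\gamma = 0.8$ mimics the four-fifths rule and $\gamma = 1$ demographic parity), and the fair selection is found by brute-force enumeration of per-group quotas for two groups.

```python
import numpy as np

def group_oblivious_select(scores, k):
    """Pick the k candidates with the highest estimated quality,
    independently of their group (the group-oblivious baseline)."""
    return np.argsort(scores)[::-1][:k]

def gamma_rule_select(scores, groups, k, gamma):
    """Illustrative sketch for two groups (labels 0 and 1): enumerate
    quota splits (s0, s1) with s0 + s1 = k, keep those whose selection
    rates satisfy min(rate) / max(rate) >= gamma, and return the split
    with the highest total estimated quality, filling each quota with
    that group's top-scoring candidates."""
    scores = np.asarray(scores, dtype=float)
    groups = np.asarray(groups)
    ids0 = np.where(groups == 0)[0]
    ids1 = np.where(groups == 1)[0]
    # sort each group's candidate indices by decreasing score
    ids0 = ids0[np.argsort(scores[ids0])[::-1]]
    ids1 = ids1[np.argsort(scores[ids1])[::-1]]
    best, best_util = None, -np.inf
    for s0 in range(max(0, k - len(ids1)), min(k, len(ids0)) + 1):
        s1 = k - s0
        r_lo, r_hi = sorted([s0 / len(ids0), s1 / len(ids1)])
        if r_hi > 0 and r_lo / r_hi < gamma:
            continue  # this split violates the gamma-rule
        chosen = np.concatenate([ids0[:s0], ids1[:s1]])
        util = scores[chosen].sum()
        if util > best_util:
            best, best_util = chosen, util
    return best
```

For example, with scores `[0.9, 0.8, 0.7, 0.3, 0.2, 0.1]`, groups `[0, 0, 0, 1, 1, 1]`, and `k = 2`, the group-oblivious rule selects the two top-scoring candidates from group 0, whereas `gamma = 1` forces one selection per group, illustrating the utility loss imposed by the fairness mechanism.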
1 code implementation • 15 Jun 2019 • Vitalii Emelianov, George Arvanitakis, Nicolas Gast, Krishna Gummadi, Patrick Loiseau
In particular, our experiments show that the price of local fairness is generally smaller when the sensitive attribute is observed at the first stage, whereas globally fair selections are more locally fair when the sensitive attribute is observed at the second stage. In both cases, it is therefore often possible to find a selection that has a small price of local fairness and is close to locally fair.