Dimensions of Diversity in Human Perceptions of Algorithmic Fairness

2 May 2020 · Nina Grgić-Hlača, Gabriel Lima, Adrian Weller, Elissa M. Redmiles

A growing number of oversight boards and regulatory bodies seek to monitor and govern algorithms that make decisions about people's lives. Prior work has explored how people believe algorithmic decisions should be made, but there is little understanding of how individual factors like sociodemographics or direct experience with a decision-making scenario may affect people's ethical views. We take a step toward filling this gap by exploring how people's perceptions of one aspect of procedural algorithmic fairness (the fairness of using particular features in an algorithmic decision) relate to their (i) demographics (age, education, gender, race, political views) and (ii) personal experiences with the algorithmic decision-making scenario. We find that political views and personal experience with the algorithmic decision context significantly influence perceptions about the fairness of using different features for bail decision-making. Drawing on our results, we discuss the implications for stakeholder engagement and algorithmic oversight, including the need to consider multiple dimensions of diversity in composing oversight and regulatory bodies.
