1 code implementation • 22 Dec 2023 • Iris Dominguez-Catena, Daniel Paternain, Mikel Galar
Often, these biases can be traced back to the data used for training, where large uncurated datasets have become the norm.
1 code implementation • 28 Mar 2023 • Iris Dominguez-Catena, Daniel Paternain, Mikel Galar
One of the most prominent types of demographic bias is the statistical imbalance in the representation of demographic groups in the datasets.
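Such representational imbalance can be quantified directly from the group labels of a dataset. A minimal sketch, assuming a simple imbalance ratio (most- over least-represented group) rather than the paper's exact metric:

```python
from collections import Counter

def representation_imbalance(groups):
    """Imbalance ratio: size of the most-represented demographic group
    divided by the size of the least-represented one (1.0 = balanced)."""
    counts = Counter(groups)
    return max(counts.values()) / min(counts.values())

# Toy dataset: one demographic group label per sample (hypothetical groups).
labels = ["A"] * 60 + ["B"] * 30 + ["C"] * 10
print(representation_imbalance(labels))  # → 6.0
```

Values far above 1.0 flag groups that are under-represented relative to the majority.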
no code implementations • 11 Oct 2022 • Iris Dominguez-Catena, Daniel Paternain, Mikel Galar
Our findings support the need for a thorough bias analysis of public datasets in problems like FER, where a global balance of demographic representation can still hide other types of bias that harm certain demographic groups.
Facial Expression Recognition (FER)
1 code implementation • 20 May 2022 • Iris Dominguez-Catena, Daniel Paternain, Mikel Galar
Of the three metrics proposed, two focus on the representational and stereotypical bias in the dataset, and the third on the residual bias in the trained model.
Facial Expression Recognition (FER) +1
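Stereotypical bias, unlike a purely global imbalance, concerns the association between demographic groups and class labels. A common way to measure such associations is normalized pointwise mutual information (NPMI); the sketch below uses it for illustration and is not necessarily the metric the papers define:

```python
import math

def npmi(pairs, group, label):
    """Normalized pointwise mutual information between a demographic
    group and a class label: 0 = independence, positive = the group is
    over-represented in that label, negative = under-represented."""
    n = len(pairs)
    p_g = sum(1 for g, _ in pairs if g == group) / n
    p_l = sum(1 for _, l in pairs if l == label) / n
    p_gl = sum(1 for g, l in pairs if (g, l) == (group, label)) / n
    if p_gl == 0:
        return -1.0  # never co-occur: maximal negative association
    return math.log(p_gl / (p_g * p_l)) / -math.log(p_gl)

# Toy (group, expression) pairs: globally balanced groups, but group "F"
# is strongly associated with "happy" — hidden stereotypical bias.
data = [("F", "happy")] * 40 + [("F", "angry")] * 10 + \
       [("M", "happy")] * 10 + [("M", "angry")] * 40
print(round(npmi(data, "F", "happy"), 3))  # → 0.513
```

This illustrates the point made above: each group makes up half of the toy dataset, so global representation looks balanced, yet the group-label association is strongly skewed.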