Unfair Exposure of Artists in Music Recommendation

25 Mar 2020  ·  Himan Abdollahpouri, Robin Burke, Masoud Mansoury

Fairness in machine learning has received considerable attention, and fairness in recommender systems in particular has been studied to ensure that recommendations meet certain criteria with respect to sensitive features such as race or gender. However, recommender systems are often multi-stakeholder environments in which fairness towards all stakeholders must be considered. It is well known that recommendation algorithms suffer from popularity bias: a few popular items are over-recommended, while the majority of other items receive disproportionately little attention. This bias has mainly been investigated from the users' perspective, i.e., how it skews the final recommendations towards popular items in general. In this paper, however, we investigate the impact of popularity bias in recommendation algorithms on the providers of the items (i.e., the entities behind the recommended items). Using a music dataset for our experiments, we show that, due to biases in the algorithms, groups of artists with different degrees of popularity are systematically and consistently treated differently from one another.
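
To illustrate how exposure disparities across artist popularity groups can be quantified, the following Python sketch (our own simplified example, not the paper's methodology; the toy data, the head/mid/tail labels, and the tercile split are assumptions) compares the share of recommendation slots each popularity group receives with its share of attention in user listening profiles.

```python
# Illustrative sketch only: compare recommendation exposure of artist
# popularity groups against their share of user listening attention.
# The toy data, group labels, and tercile split are assumptions.
from collections import Counter

# Hypothetical listening profiles: user -> artists played.
profiles = {
    "u1": ["a1", "a1", "a2", "a5"],
    "u2": ["a1", "a3", "a4"],
    "u3": ["a2", "a1", "a1", "a6"],
}

# Hypothetical top-k recommendation lists from some recommender.
recommendations = {
    "u1": ["a1", "a2", "a3"],
    "u2": ["a1", "a2", "a5"],
    "u3": ["a1", "a2", "a4"],
}

def popularity_groups(profiles, n_groups=3):
    """Split artists into popularity tiers (head/mid/tail) by play count."""
    counts = Counter(a for plays in profiles.values() for a in plays)
    ranked = [a for a, _ in counts.most_common()]
    size = max(1, len(ranked) // n_groups)
    labels = ["head", "mid", "tail"]
    return {a: labels[min(i // size, n_groups - 1)] for i, a in enumerate(ranked)}

def exposure_share(lists, groups):
    """Fraction of list slots occupied by each popularity group."""
    slots = Counter(groups.get(a, "tail") for items in lists.values() for a in items)
    total = sum(slots.values())
    return {g: slots[g] / total for g in ("head", "mid", "tail")}

groups = popularity_groups(profiles)
profile_share = exposure_share(profiles, groups)
rec_share = exposure_share(recommendations, groups)

for g in ("head", "mid", "tail"):
    print(f"{g}: profile share={profile_share[g]:.2f}, "
          f"recommendation share={rec_share[g]:.2f}")
```

A gap between the two shares (e.g., the head group taking a larger fraction of recommendation slots than of listening attention) is one simple signal of the kind of unequal treatment of artist groups the paper studies.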
