Active Learning for Top-$K$ Rank Aggregation from Noisy Comparisons

ICML 2017  ·  Soheil Mohajer, Changho Suh, Adel Elmahdy

We explore an active top-$K$ ranking problem based on pairwise comparisons that may be collected sequentially, according to our own design choices. We consider two settings: (1) top-$K$ sorting, in which the goal is to recover the top-$K$ items in order out of $n$ items; and (2) top-$K$ partitioning, where only the set of top-$K$ items is desired. Under a fairly general model that subsumes several well-known models as special cases (e.g., the Strong Stochastic Transitivity model, the BTL model, and the uniform noise model), we characterize upper bounds on the sample size required for top-$K$ sorting as well as for top-$K$ partitioning. As a consequence, we demonstrate that active ranking can offer significant multiplicative gains in sample complexity over passive ranking. Depending on the underlying stochastic noise model, this gain ranges from roughly $\frac{\log n}{\log \log n}$ to $\frac{n^2 \log n}{\log \log n}$. We also present a single algorithm that applies to both settings.
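
To make the setting concrete, below is a minimal Python sketch of active top-$K$ selection from noisy pairwise comparisons under a BTL-style noise model. This is not the paper's algorithm: the `duel` routine, the anytime Hoeffding-style confidence radius, the sequential knockout tournament, and all parameter values are illustrative assumptions, intended only to show how an adaptive scheme spends comparisons on a pair only while its outcome remains ambiguous, in contrast to a fixed passive budget.

```python
import math
import random

def btl_compare(w_i, w_j, rng):
    """Simulate one noisy comparison under a Bradley-Terry-Luce (BTL) model:
    item i beats item j with probability w_i / (w_i + w_j)."""
    return rng.random() < w_i / (w_i + w_j)

def duel(i, j, weights, rng, delta=0.05, max_rounds=10_000):
    """Adaptively compare items i and j until a Hoeffding-style anytime
    confidence interval around the empirical win rate excludes 1/2,
    or the per-pair budget runs out. Returns (winner, comparisons_used)."""
    wins, t = 0, 0
    while t < max_rounds:
        t += 1
        wins += btl_compare(weights[i], weights[j], rng)
        p_hat = wins / t
        radius = math.sqrt(math.log(2 * t * t / delta) / (2 * t))
        if p_hat - radius > 0.5:
            return i, t
        if p_hat + radius < 0.5:
            return j, t
    return (i if wins >= t / 2 else j), t  # fall back to the empirical majority

def active_top_k(weights, k, rng, delta=0.05):
    """Return (top_k_in_order, total_comparisons): repeatedly identify the
    current best item via a sequential knockout tournament of adaptive duels."""
    remaining = list(range(len(weights)))
    ranking, total = [], 0
    for _ in range(k):
        champ = remaining[0]
        for challenger in remaining[1:]:
            champ, used = duel(champ, challenger, weights, rng, delta)
            total += used
        ranking.append(champ)
        remaining.remove(champ)
    return ranking, total

if __name__ == "__main__":
    rng = random.Random(0)
    weights = [2.0 ** s for s in range(10, 0, -1)]  # item 0 is the strongest
    top3, cost = active_top_k(weights, k=3, rng=rng)
    print("estimated top-3:", top3, "using", cost, "comparisons")
```

The key design choice this toy example illustrates is adaptivity: each pair receives just enough comparisons to resolve it at the desired confidence, so easy pairs (with win probabilities far from 1/2) are settled cheaply, which is the source of the multiplicative sample-complexity gains over passive ranking discussed in the abstract.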
