Active Learning for Binary Classification with Abstention

1 Jun 2019 · Shubhanshu Shekhar, Mohammad Ghavamzadeh, Tara Javidi

We construct and analyze active learning algorithms for the problem of binary classification with abstention. We consider three abstention settings: \emph{fixed-cost} abstention and two variants of \emph{bounded-rate} abstention, and for each of them propose an active learning algorithm. All the proposed algorithms can work in the most commonly used active learning models, i.e., \emph{membership-query}, \emph{pool-based}, and \emph{stream-based} sampling. We obtain upper bounds on the excess risk of our algorithms in a general non-parametric framework and establish their minimax near-optimality by deriving matching lower bounds. Since our algorithms rely on knowledge of certain smoothness parameters of the regression function, we then describe a new strategy to adapt to these unknown parameters in a data-driven manner. Since the worst-case computational complexity of our proposed algorithms increases exponentially with the dimension of the input space, we conclude the paper with a computationally efficient variant of our algorithm whose computational complexity grows only polynomially with the dimension for a smaller, but still rich, class of learning problems.
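To make the fixed-cost setting concrete, the sketch below shows the standard Chow-style abstention rule: given an estimate of the regression function eta_hat(x) ≈ P(Y=1|X=x) and an abstention cost lambda_cost in (0, 1/2), the classifier abstains whenever the estimate is too close to 1/2. This is only an illustration of the problem setting, not the paper's algorithm; the names eta_hat, lambda_cost, and ABSTAIN, as well as the toy stream-based querying loop, are hypothetical choices made here for the example.

```python
import numpy as np

ABSTAIN = -1  # sentinel value for the "reject" / abstain decision


def predict_with_abstention(eta_hat, x, lambda_cost=0.2):
    """Return 1, 0, or ABSTAIN for input x, given an estimate eta_hat(x) of P(Y=1|X=x)."""
    eta = eta_hat(x)
    # Chow's rule: abstain when the estimated posterior is within 1/2 - lambda_cost
    # of 1/2, i.e., when the expected misclassification cost exceeds the abstention cost.
    if abs(eta - 0.5) <= 0.5 - lambda_cost:
        return ABSTAIN
    return int(eta > 0.5)


if __name__ == "__main__":
    # Toy stream-based usage: in active learning with abstention, labels are
    # typically queried only where the current estimate is uncertain (roughly,
    # where the rule above would abstain). The confidence bands, refinement, and
    # stopping rules of the paper's algorithms are omitted here.
    rng = np.random.default_rng(0)
    eta_hat = lambda x: 1.0 / (1.0 + np.exp(-4.0 * (x - 0.5)))  # toy regression estimate
    for x in rng.uniform(size=5):
        decision = predict_with_abstention(eta_hat, x)
        print(f"x={x:.2f} -> {'abstain' if decision == ABSTAIN else decision}")
```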
