Neyman-Pearson Multi-class Classification via Cost-sensitive Learning

8 Nov 2021 · Ye Tian, Yang Feng

Most existing classification methods aim to minimize the overall misclassification error rate. In many applications, however, different types of errors have different consequences. Two popular paradigms have been developed to account for this asymmetry: the Neyman-Pearson (NP) paradigm and the cost-sensitive (CS) paradigm. Unlike the CS paradigm, the NP paradigm does not require costs to be specified. Most previous work on the NP paradigm has focused on the binary case. In this work, we study the multi-class NP problem by connecting it to the CS problem and propose two algorithms. We extend the NP oracle inequalities and NP consistency from the binary case to the multi-class case and show that our two algorithms enjoy these properties under certain conditions. Simulation and real-data studies demonstrate the effectiveness of our algorithms. To our knowledge, this is the first work to solve the multi-class NP problem via cost-sensitive learning techniques with theoretical guarantees. The proposed algorithms are implemented in the R package npcs on CRAN.
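
The abstract does not write out the optimization problem it studies. As a rough illustration of the NP-to-CS connection it describes, the LaTeX sketch below states a generic multi-class NP program and a cost-sensitive (Lagrangian-type) surrogate. The index set $\mathcal{A}$, error targets $\alpha_k$, and costs $\lambda_k$ are illustrative notation assumed here, not necessarily the paper's own formulation.

\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}

% A generic multi-class NP program (illustrative notation, not taken from the paper):
% \phi is a classifier, \mathcal{A} indexes the classes whose conditional error rates
% must be controlled, and \alpha_k are user-chosen error levels.
\begin{align*}
  \min_{\phi}\;\; & \sum_{k \notin \mathcal{A}} \mathbb{P}\bigl(\phi(X) \neq k \mid Y = k\bigr) \\
  \text{s.t.}\;\; & \mathbb{P}\bigl(\phi(X) \neq k \mid Y = k\bigr) \le \alpha_k,
  \qquad k \in \mathcal{A}.
\end{align*}

% A cost-sensitive surrogate replaces the hard constraints with costs \lambda_k \ge 0:
\begin{align*}
  \min_{\phi}\;\;
  \sum_{k \notin \mathcal{A}} \mathbb{P}\bigl(\phi(X) \neq k \mid Y = k\bigr)
  + \sum_{k \in \mathcal{A}} \lambda_k
    \Bigl[\, \mathbb{P}\bigl(\phi(X) \neq k \mid Y = k\bigr) - \alpha_k \Bigr].
\end{align*}

\end{document}

Under this reading, each fixed choice of the costs $\lambda_k$ turns the inner minimization into an ordinary cost-sensitive classification problem, which is the hook that allows standard cost-sensitive learners to be reused for the NP objective; how the costs are chosen and what guarantees result is what the paper's two algorithms and their NP oracle inequalities address.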
