Second-Order Asymptotically Optimal Statistical Classification

3 Jun 2018 · Lin Zhou, Vincent Y. F. Tan, Mehul Motani

Motivated by real-world machine learning applications, we analyze approximations to the non-asymptotic fundamental limits of statistical classification. In the binary version of this problem, given two training sequences generated according to two {\em unknown} distributions $P_1$ and $P_2$, one is tasked with classifying a test sequence that is known to be generated according to either $P_1$ or $P_2$. This problem can be thought of as an analogue of binary hypothesis testing, except that in the present setting the generating distributions are unknown. Due to finite-sample considerations, we study the second-order asymptotic (dispersion-type) tradeoff between the type-I and type-II error probabilities for tests that ensure (i) the type-I error probability decays exponentially fast for {\em all} pairs of distributions and (ii) the type-II error probability is non-vanishing for a {\em particular} pair of distributions. We generalize our results to the classification of multiple hypotheses with a rejection option.
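Since the abstract describes the classification rule only implicitly, below is a minimal sketch, assuming a finite alphabet and NumPy, of the type-based rule classically attributed to Gutman that this line of work analyzes: form the empirical distributions (types) of the first training sequence and the test sequence, and compare a generalized Jensen-Shannon divergence between them to a threshold. The names `empirical`, `gjs`, and `classify` and the threshold `lam` are illustrative choices, not the paper's notation; `lam` plays the role of the universal type-I error exponent in item (i) of the abstract, and the second training sequence enters only in the multi-hypothesis and rejection variants.

```python
import numpy as np

def empirical(seq, alphabet_size):
    """Empirical distribution (type) of a sequence over {0, ..., alphabet_size-1}."""
    counts = np.bincount(seq, minlength=alphabet_size)
    return counts / len(seq)

def kl(p, q):
    """KL divergence D(p || q) in nats; terms with p[i] == 0 contribute 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def gjs(p, q, alpha):
    """Generalized Jensen-Shannon divergence:
    GJS(p, q, alpha) = alpha * D(p || m) + D(q || m),
    where m = (alpha * p + q) / (1 + alpha)."""
    m = (alpha * p + q) / (1.0 + alpha)
    return alpha * kl(p, m) + kl(q, m)

def classify(x1, y, alphabet_size, lam):
    """Gutman-style binary rule (illustrative sketch, not the paper's exact
    prescription): declare class 1 iff GJS(T_x1, T_y, alpha) <= lam, where
    alpha = len(x1) / len(y) is the training-to-test length ratio."""
    alpha = len(x1) / len(y)
    t1 = empirical(x1, alphabet_size)
    ty = empirical(y, alphabet_size)
    return 1 if gjs(t1, ty, alpha) <= lam else 2

if __name__ == "__main__":
    # Hypothetical demo: P1 and P2 are chosen for illustration only.
    rng = np.random.default_rng(0)
    P1 = np.array([0.7, 0.2, 0.1])
    P2 = np.array([0.2, 0.3, 0.5])
    x1 = rng.choice(3, size=2000, p=P1)  # training sequence from P1
    y = rng.choice(3, size=1000, p=P1)   # test sequence, truly from P1
    print(classify(x1, y, alphabet_size=3, lam=0.05))  # expected output: 1
```

When the test sequence is truly generated by $P_1$, the two types concentrate around the same distribution and the GJS statistic is close to zero, so the rule declares class 1; the second-order (dispersion) analysis in the paper quantifies how the achievable threshold behaves at finite sample sizes.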
