Beyond Trees: Classification with Sparse Pairwise Dependencies

Several classification methods assume that the underlying distributions follow tree-structured graphical models. Indeed, trees capture statistical dependencies between pairs of variables, which may be crucial to attaining low classification errors. The resulting classifier is linear in the log-transformed univariate and bivariate densities that correspond to the tree edges. In practice, however, observed data may not be well approximated by trees. Yet, motivated by the importance of pairwise dependencies for accurate classification, here we propose to approximate the optimal decision boundary by a sparse linear combination of the univariate and bivariate log-transformed densities. Our proposed approach is semi-parametric in nature: we non-parametrically estimate the univariate and bivariate densities, remove pairs of variables that are nearly independent using the Hilbert-Schmidt independence criterion, and finally construct a linear SVM on the retained log-transformed densities. We demonstrate using both synthetic and real data that our resulting classifier, denoted SLB (Sparse Log-Bivariate density), is competitive with popular classification methods.

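The abstract outlines a three-step pipeline: non-parametric estimation of univariate and bivariate densities, HSIC-based screening of variable pairs, and a linear SVM trained on the retained log-densities. Below is a minimal Python sketch of such a pipeline built from NumPy and scikit-learn; the function names (hsic, log_density_features, fit_slb), the RBF-kernel HSIC estimator, the fixed bandwidths, and the screening threshold are illustrative assumptions, not the authors' implementation, and details such as per-class versus pooled density estimation are left unspecified here.

```python
# Sketch of an SLB-style pipeline (assumed names and parameters, not the paper's code).
import numpy as np
from itertools import combinations
from sklearn.neighbors import KernelDensity
from sklearn.svm import LinearSVC
from sklearn.metrics.pairwise import rbf_kernel


def hsic(x, y, gamma=1.0):
    """Biased empirical HSIC between two 1-D samples, using RBF kernels."""
    n = len(x)
    K = rbf_kernel(x.reshape(-1, 1), gamma=gamma)
    L = rbf_kernel(y.reshape(-1, 1), gamma=gamma)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2


def log_density_features(X, X_train, pairs, bandwidth=0.5):
    """Univariate and retained bivariate log-densities (KDE) evaluated at X."""
    feats = []
    for j in range(X_train.shape[1]):            # univariate terms
        kde = KernelDensity(bandwidth=bandwidth).fit(X_train[:, [j]])
        feats.append(kde.score_samples(X[:, [j]]))
    for j, k in pairs:                           # bivariate terms for kept pairs
        kde = KernelDensity(bandwidth=bandwidth).fit(X_train[:, [j, k]])
        feats.append(kde.score_samples(X[:, [j, k]]))
    return np.column_stack(feats)


def fit_slb(X, y, hsic_threshold=1e-3):
    # Screen out variable pairs that are nearly independent (small HSIC).
    pairs = [(j, k) for j, k in combinations(range(X.shape[1]), 2)
             if hsic(X[:, j], X[:, k]) > hsic_threshold]
    Z = log_density_features(X, X, pairs)
    clf = LinearSVC().fit(Z, y)                  # linear SVM on log-densities
    return clf, pairs
```

At test time, the same feature map would be applied to new points with the training-set density estimates, e.g. clf.predict(log_density_features(X_test, X_train, pairs)).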