The Performance Analysis of Generalized Margin Maximizers on Separable Data

ICML 2020 · Fariborz Salehi, Ehsan Abbasi, Babak Hassibi

Logistic models are commonly used for binary classification tasks. The success of such models has often been attributed to their connection to maximum-likelihood estimators. It has been shown that SGD algorithms, when applied to the logistic loss on separable data, converge to the max-margin classifier (a.k.a. hard-margin SVM). The performance of the hard-margin SVM has recently been analyzed in (Montanari et al., 2019; Deng et al., 2019). Inspired by these results, in this paper we present and study a more general setting in which the underlying parameters of the logistic model possess certain structures (sparse, block-sparse, low-rank, etc.) and introduce a more general framework, referred to as the “Generalized Margin Maximizer” (GMM). While classical max-margin classifiers minimize the 2-norm of the parameter vector subject to linearly separating the data, GMM minimizes an arbitrary convex function of the parameter vector. We provide a precise analysis of the generalization error of such methods and show improvement over the max-margin method, which does not take the structure of the model into account. We also show that mirror descent algorithms, with a properly tuned step size, can be exploited to achieve GMM classifiers. Our theoretical results are validated by extensive simulation results across a range of parameter values, problem instances, and model structures.
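The GMM described in the abstract replaces the 2-norm objective of the hard-margin SVM with an arbitrary convex function f, i.e., it solves min_w f(w) subject to y_i <x_i, w> >= 1 for all i. The following is a minimal sketch, not the paper's code, of this convex program using cvxpy on synthetic separable data with a sparse ground-truth parameter; the l1-norm is used here only as one illustrative choice of f for sparse structure, and all variable names and problem sizes are assumptions for the example.

```python
# Minimal illustrative sketch of a Generalized Margin Maximizer (GMM):
#     min_w  f(w)   s.t.   y_i * <x_i, w> >= 1  for all i,
# where f(w) = ||w||_2 recovers the classical hard-margin SVM and other
# convex f (here the l1-norm) can exploit structure in the parameters.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, p, k = 80, 200, 10                       # samples, dimension, sparsity (assumed values)
w_star = np.zeros(p)
w_star[:k] = rng.normal(size=k)             # sparse ground-truth parameter
X = rng.normal(size=(n, p))
y = np.sign(X @ w_star)                     # noiseless labels => data are linearly separable

def gmm_classifier(X, y, f):
    """Solve min f(w) subject to y_i * x_i^T w >= 1 for all i."""
    w = cp.Variable(X.shape[1])
    constraints = [cp.multiply(y, X @ w) >= 1]
    cp.Problem(cp.Minimize(f(w)), constraints).solve()
    return w.value

w_svm = gmm_classifier(X, y, lambda w: cp.norm(w, 2))   # classical max-margin classifier
w_l1  = gmm_classifier(X, y, lambda w: cp.norm(w, 1))   # GMM with l1-norm objective

# Compare alignment of each estimate with the true parameter direction.
corr = lambda a, b: abs(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
print("corr(SVM,    w*) =", round(corr(w_svm, w_star), 3))
print("corr(L1-GMM, w*) =", round(corr(w_l1, w_star), 3))
```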
