Search Results for author: Stephen A. Vavasis

Found 4 papers, 0 papers with code

A termination criterion for stochastic gradient descent for binary classification

no code implementations · 23 Mar 2020 · Sina Baghal, Courtney Paquette, Stephen A. Vavasis

We propose a new, simple, and computationally inexpensive termination test for constant step-size stochastic gradient descent (SGD) applied to binary classification with the logistic and hinge losses and homogeneous linear predictors.

Binary Classification · Classification · +1
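
As a point of reference for the setting the abstract describes, here is a minimal numpy sketch of constant step-size SGD on the logistic loss with a homogeneous linear predictor, together with a placeholder stopping check (stop once the iterate separates the training set on several consecutive checks). The stopping rule, function name, and parameters are illustrative assumptions only; they are not the termination test proposed in the paper.

```python
import numpy as np

def sgd_logistic_with_stop(X, y, step=0.5, check_every=100, clean_rounds=5,
                           max_iter=100_000, seed=0):
    """Constant step-size SGD on the logistic loss, homogeneous predictor w.
    The stopping check below is a placeholder, not the paper's test."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    clean = 0
    for t in range(1, max_iter + 1):
        i = rng.integers(n)
        margin = y[i] * (X[i] @ w)
        # SGD step on log(1 + exp(-margin)): gradient is -y_i * x_i / (1 + exp(margin))
        w += step * y[i] * X[i] / (1.0 + np.exp(margin))
        if t % check_every == 0:
            # Placeholder criterion: the whole training set is separated
            # on several consecutive checks.
            clean = clean + 1 if np.all(y * (X @ w) > 0) else 0
            if clean >= clean_rounds:
                return w, t
    return w, max_iter
```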

On the Complexity of Robust PCA and $\ell_1$-norm Low-Rank Matrix Approximation

no code implementations · 30 Sep 2015 · Nicolas Gillis, Stephen A. Vavasis

The low-rank matrix approximation problem with respect to the component-wise $\ell_1$-norm ($\ell_1$-LRA), which is closely related to robust principal component analysis (PCA), has become a very popular tool in data mining and machine learning.
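
The paper itself is a complexity study of $\ell_1$-LRA rather than a new algorithm, but the objective $\min_{U,V} \|M - UV\|_1$ (entry-wise) is easy to make concrete. The rank-one alternating heuristic below uses the fact that, with $v$ fixed, each optimal $u_i$ is a weighted median of $M_{ij}/v_j$ with weights $|v_j|$; it is a standard illustrative heuristic sketched under that assumption, not a method from the paper.

```python
import numpy as np

def weighted_median(values, weights):
    """Minimizer of sum_k weights[k] * |values[k] - x| over x."""
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    return v[np.searchsorted(cum, 0.5 * cum[-1])]

def rank1_l1_approx(M, iters=50, seed=0):
    """Alternating weighted-median heuristic for min_{u,v} ||M - u v^T||_1
    (entry-wise l1 norm). Degenerate all-zero factors are not handled."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    v = rng.standard_normal(n)
    u = np.zeros(m)
    for _ in range(iters):
        nz = np.abs(v) > 1e-12
        for i in range(m):           # with v fixed, u_i is a weighted median
            u[i] = weighted_median(M[i, nz] / v[nz], np.abs(v[nz]))
        nz = np.abs(u) > 1e-12
        for j in range(n):           # with u fixed, v_j is a weighted median
            v[j] = weighted_median(M[nz, j] / u[nz], np.abs(u[nz]))
    return u, v
```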

Semidefinite Programming Based Preconditioning for More Robust Near-Separable Nonnegative Matrix Factorization

no code implementations · 8 Oct 2013 · Nicolas Gillis, Stephen A. Vavasis

Nonnegative matrix factorization (NMF) under the separability assumption can provably be solved efficiently, even in the presence of noise, and has been shown to be a powerful technique in document classification and hyperspectral unmixing.

Document Classification · Hyperspectral Unmixing · +1
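
A rough sketch of the preconditioning idea as the abstract describes it: reduce the data to r dimensions, solve a semidefinite program for the minimum-volume origin-centered ellipsoid enclosing the reduced columns, and rescale the data by a square root of the resulting matrix before running a separable-NMF algorithm. The cvxpy dependency, the truncated-SVD reduction, and all names are assumptions made here for illustration; this is not the paper's exact procedure.

```python
import numpy as np
import cvxpy as cp  # assumed dependency, with an SDP-capable solver installed

def min_volume_precondition(M, r):
    """Sketch: rank-r reduce M, find A maximizing log det(A) subject to
    p_i^T A p_i <= 1 for every reduced column p_i (minimum-volume ellipsoid),
    and return A^{1/2} applied to the reduced data."""
    U, _, _ = np.linalg.svd(M, full_matrices=False)
    P = U[:, :r].T @ M                      # r x n reduced data (assumed step)
    A = cp.Variable((r, r), PSD=True)
    constraints = [cp.quad_form(P[:, i], A) <= 1 for i in range(P.shape[1])]
    cp.Problem(cp.Maximize(cp.log_det(A)), constraints).solve()
    eigval, eigvec = np.linalg.eigh(A.value)
    A_half = eigvec @ np.diag(np.sqrt(np.maximum(eigval, 0.0))) @ eigvec.T
    return A_half @ P                       # preconditioned data for an SPA-type method
```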

Fast and Robust Recursive Algorithms for Separable Nonnegative Matrix Factorization

no code implementations · 6 Aug 2012 · Nicolas Gillis, Stephen A. Vavasis

In this paper, we study the nonnegative matrix factorization problem under the separability assumption (that is, there exists a cone, spanned by a small subset of the columns of the input nonnegative data matrix, that contains all of the columns), which is equivalent to the hyperspectral unmixing problem under the linear mixing model and the pure-pixel assumption.

Hyperspectral Unmixing
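
The recursive family the abstract refers to can be illustrated with a successive-projection-style selection rule: repeatedly take the column of largest norm as an extreme column of the cone, then project the remaining columns onto its orthogonal complement. The short numpy sketch below shows that selection loop only; the paper's precise variants and robustness guarantees go beyond it.

```python
import numpy as np

def successive_projection(M, r):
    """Pick r 'extreme' columns: at each step take the column of largest norm,
    then project all columns onto the orthogonal complement of the pick."""
    R = np.asarray(M, dtype=float).copy()
    picked = []
    for _ in range(r):
        j = int(np.argmax(np.linalg.norm(R, axis=0)))
        picked.append(j)
        u = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(u, u @ R)   # remove the selected direction from every column
    return picked
```

In the noiseless separable case, a selection rule of this kind identifies the generating columns (under a full-rank assumption on them); the remaining nonnegative factor can then be recovered, for example, by nonnegative least squares against the selected columns.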
