Search Results for author: Damek Davis

Found 23 papers, 4 papers with code

Asymptotic normality and optimality in nonsmooth stochastic approximation

no code implementations • 16 Jan 2023 • Damek Davis, Dmitriy Drusvyatskiy, Liwei Jiang

In their seminal work, Polyak and Juditsky showed that stochastic approximation algorithms for solving smooth equations enjoy a central limit theorem.

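Polyak and Juditsky's guarantee concerns the averaged iterates of a stochastic approximation scheme. As a rough point of reference, here is a minimal sketch of Polyak–Ruppert iterate averaging on a smooth toy problem, not the nonsmooth setting of this paper; all problem data, step sizes, and iteration counts below are made up for illustration.

```python
import numpy as np

# Minimal sketch: SGD with Polyak-Ruppert iterate averaging on a toy
# least-squares problem. Illustrates the averaged iterate studied by
# Polyak and Juditsky; data, step sizes, and iteration count are illustrative.
rng = np.random.default_rng(0)
d, n = 5, 10_000
x_star = rng.normal(size=d)

x = np.zeros(d)
x_avg = np.zeros(d)
for k in range(1, n + 1):
    a = rng.normal(size=d)                 # random design vector
    b = a @ x_star + 0.1 * rng.normal()    # noisy linear measurement
    grad = (a @ x - b) * a                 # stochastic gradient of 0.5*(a @ x - b)**2
    x -= (1.0 / k**0.75) * grad            # slowly decaying step size
    x_avg += (x - x_avg) / k               # running average of the iterates

print("last iterate error:    ", np.linalg.norm(x - x_star))
print("averaged iterate error:", np.linalg.norm(x_avg - x_star))
```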

Clustering a Mixture of Gaussians with Unknown Covariance

no code implementations • 4 Oct 2021 • Damek Davis, Mateo Díaz, Kaizheng Wang

We investigate a clustering problem with data from a mixture of Gaussians that share a common but unknown, and potentially ill-conditioned, covariance matrix.

Clustering

Active manifolds, stratifications, and convergence to local minima in nonsmooth optimization

no code implementations • 26 Aug 2021 • Damek Davis, Dmitriy Drusvyatskiy, Liwei Jiang

We show that the subgradient method converges only to local minimizers when applied to generic Lipschitz continuous and subdifferentially regular functions that are definable in an o-minimal structure.
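
As a concrete picture of the algorithm being analyzed, here is a minimal subgradient-method sketch on a simple nonsmooth convex function; the objective, starting point, and step sizes are chosen purely for illustration and have nothing to do with the paper's o-minimal setting.

```python
import numpy as np

# Minimal sketch of the subgradient method on f(x) = ||x||_1, whose
# subgradient is sign(x) (any value in [-1, 1] at a zero coordinate).
# Step sizes and the starting point are illustrative only.
def subgradient_method(x0, steps=500):
    x = np.asarray(x0, dtype=float)
    for k in range(1, steps + 1):
        g = np.sign(x)                     # a subgradient of ||x||_1 at x
        x = x - (1.0 / np.sqrt(k)) * g     # diminishing step size
    return x

print(subgradient_method([3.0, -2.0, 0.5]))  # hovers near the minimizer 0
```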

Escaping strict saddle points of the Moreau envelope in nonsmooth optimization

no code implementations • 17 Jun 2021 • Damek Davis, Mateo Díaz, Dmitriy Drusvyatskiy

The main conclusion is that a variety of algorithms for nonsmooth optimization can escape strict saddle points of the Moreau envelope at a controlled rate.
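
For reference, the Moreau envelope appearing here (and in the rate result further down) is the standard construction; the display below is the usual definition, not anything specific to this paper.

```latex
% Moreau envelope and proximal map of f with parameter \lambda > 0:
f_{\lambda}(x) = \min_{y}\Big\{ f(y) + \tfrac{1}{2\lambda}\|y - x\|^{2} \Big\},
\qquad
\operatorname{prox}_{\lambda f}(x) = \operatorname*{argmin}_{y}\Big\{ f(y) + \tfrac{1}{2\lambda}\|y - x\|^{2} \Big\}.
```

For weakly convex $f$ and sufficiently small $\lambda$, the envelope is smooth with $\nabla f_{\lambda}(x) = \lambda^{-1}(x - \operatorname{prox}_{\lambda f}(x))$, so stationary points, saddle points, and gradient norms of the envelope serve as smooth surrogates for those of $f$ itself.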

Proximal methods avoid active strict saddles of weakly convex functions

no code implementations • 16 Dec 2019 • Damek Davis, Dmitriy Drusvyatskiy

We introduce a geometrically transparent strict saddle property for nonsmooth functions.

From low probability to high confidence in stochastic convex optimization

no code implementations • 31 Jul 2019 • Damek Davis, Dmitriy Drusvyatskiy, Lin Xiao, Junyu Zhang

Standard results in stochastic convex optimization bound the number of samples that an algorithm needs to generate a point with small function value in expectation.

Stochastic Optimization

Stochastic algorithms with geometric step decay converge linearly on sharp functions

1 code implementation • 22 Jul 2019 • Damek Davis, Dmitriy Drusvyatskiy, Vasileios Charisopoulos

In this work, we ask whether geometric step decay similarly improves stochastic algorithms for the class of sharp nonconvex problems.

Retrieval
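
As a hedged illustration of the schedule in question (the initial step size, decay factor, and epoch length below are made up), geometric step decay holds the step size constant within an epoch and multiplies it by a fixed factor $q < 1$ between epochs:

```python
# Sketch of a geometric step-decay schedule for a stochastic (sub)gradient
# method: the step size is constant for epoch_len iterations, then shrinks
# by the factor q. All constants here are illustrative.
def geometric_step_decay(alpha0=1.0, q=0.5, epoch_len=100, num_epochs=10):
    for epoch in range(num_epochs):
        alpha = alpha0 * q ** epoch
        for _ in range(epoch_len):
            yield alpha

steps = list(geometric_step_decay())
print(steps[0], steps[99], steps[100], steps[-1])   # 1.0 1.0 0.5 0.001953125
```

In practice the schedule is simply plugged into whatever stochastic subgradient or prox-linear loop is being run, one step size per iteration.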

Composite optimization for robust blind deconvolution

1 code implementation • 6 Jan 2019 • Vasileios Charisopoulos, Damek Davis, Mateo Díaz, Dmitriy Drusvyatskiy

The blind deconvolution problem seeks to recover a pair of vectors from a set of rank one bilinear measurements.
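
To make the measurement model concrete, the sketch below generates rank-one bilinear measurements with a fraction of gross corruptions and runs generic subgradient steps on the robust $\ell_1$ residual; this illustrates the problem setup, not the paper's composite method or step-size rule, and all constants are made up.

```python
import numpy as np

# Blind deconvolution measurement model: b_i = <l_i, w*> * <r_i, x*>, with a
# fraction of the entries replaced by gross outliers. Fit (w, x) by subgradient
# steps on the robust l1 residual. Dimensions and constants are illustrative.
rng = np.random.default_rng(1)
d1, d2, m = 10, 10, 400
w_star, x_star = rng.normal(size=d1), rng.normal(size=d2)
L, R = rng.normal(size=(m, d1)), rng.normal(size=(m, d2))
b = (L @ w_star) * (R @ x_star)
mask = rng.random(m) < 0.1
b[mask] = 10 * rng.normal(size=mask.sum())        # gross corruptions

def l1_loss(w, x):
    return np.abs((L @ w) * (R @ x) - b).mean()

w, x = rng.normal(size=d1), rng.normal(size=d2)
for k in range(1, 2001):
    r = (L @ w) * (R @ x) - b
    s = np.sign(r) / m
    gw = L.T @ (s * (R @ x))                      # subgradient of the loss in w
    gx = R.T @ (s * (L @ w))                      # subgradient of the loss in x
    step = 0.01 / np.sqrt(k)
    w, x = w - step * gw, x - step * gx

print("robust loss after the run:", l1_loss(w, x))
```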

Graphical Convergence of Subgradients in Nonconvex Optimization and Learning

no code implementations • 17 Oct 2018 • Damek Davis, Dmitriy Drusvyatskiy

We investigate the stochastic optimization problem of minimizing population risk, where the loss defining the risk is assumed to be weakly convex.

Regression, Stochastic Optimization

Stochastic model-based minimization under high-order growth

no code implementations • 1 Jul 2018 • Damek Davis, Dmitriy Drusvyatskiy, Kellie J. MacPhee

Given a nonsmooth, nonconvex minimization problem, we consider algorithms that iteratively sample and minimize stochastic convex models of the objective function.


Stochastic subgradient method converges on tame functions

1 code implementation • 20 Apr 2018 • Damek Davis, Dmitriy Drusvyatskiy, Sham Kakade, Jason D. Lee

This work considers the question: what convergence guarantees does the stochastic subgradient method have in the absence of smoothness and convexity?

Stochastic model-based minimization of weakly convex functions

no code implementations • 17 Mar 2018 • Damek Davis, Dmitriy Drusvyatskiy

We consider a family of algorithms that successively sample and minimize simple stochastic models of the objective function.
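
The template these papers analyze admits a compact description: at the current iterate, sample a convex model of the objective and minimize it together with a proximal penalty. The display below is that generic update; the stochastic subgradient and stochastic proximal point methods are the two standard instances.

```latex
% Generic stochastic model-based update with proximal parameter \beta_k > 0:
x_{k+1} = \operatorname*{argmin}_{y} \Big\{ f_{x_k}(y, \xi_k) + \frac{1}{2\beta_k}\|y - x_k\|^{2} \Big\}.
% The linear model f_x(y, \xi) = f(x, \xi) + \langle g(x, \xi), y - x \rangle recovers the
% stochastic subgradient method; the full model f_x(y, \xi) = f(y, \xi) gives the
% stochastic proximal point method.
```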

Stochastic subgradient method converges at the rate $O(k^{-1/4})$ on weakly convex functions

2 code implementations • 8 Feb 2018 • Damek Davis, Dmitriy Drusvyatskiy

We prove that the proximal stochastic subgradient method, applied to a weakly convex problem, drives the gradient of the Moreau envelope to zero at the rate $O(k^{-1/4})$.

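As a small, hedged illustration of the setting (weakly convex and nonsmooth), the sketch below runs the stochastic subgradient method on robust phase retrieval, $f(x) = \frac{1}{m}\sum_i |(a_i^\top x)^2 - b_i|$; the warm start, step-size constant, and problem sizes are chosen only so the toy run behaves sensibly and are not taken from the paper.

```python
import numpy as np

# Stochastic subgradient method on a weakly convex problem:
# robust phase retrieval, f(x) = (1/m) * sum_i |(a_i @ x)**2 - b_i|.
# Warm start, step sizes, and dimensions are illustrative only.
rng = np.random.default_rng(2)
d, m = 20, 2000
x_star = rng.normal(size=d)
A = rng.normal(size=(m, d))
b = (A @ x_star) ** 2

x = x_star + 0.3 * rng.normal(size=d)          # warm start near the signal
for k in range(1, 5001):
    i = rng.integers(m)
    r = (A[i] @ x) ** 2 - b[i]
    g = 2.0 * np.sign(r) * (A[i] @ x) * A[i]   # stochastic subgradient of |.|
    x -= (0.05 / np.sqrt(k)) * g               # diminishing step size

print("relative error:", np.linalg.norm(x - x_star) / np.linalg.norm(x_star))
```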

Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems

no code implementations • 12 Jul 2017 • Damek Davis, Benjamin Grimmer

In this paper, we introduce a stochastic projected subgradient method for weakly convex (i.e., uniformly prox-regular) nonsmooth, nonconvex functions, a wide class that includes the additive and convex composite classes.

The Sound of APALM Clapping: Faster Nonsmooth Nonconvex Optimization with Stochastic Asynchronous PALM

no code implementations • NeurIPS 2016 • Damek Davis, Brent Edmunds, Madeleine Udell

We introduce the Stochastic Asynchronous Proximal Alternating Linearized Minimization (SAPALM) method, a block coordinate stochastic proximal-gradient method for solving nonconvex, nonsmooth optimization problems.
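
SAPALM itself is asynchronous; as a hedged point of reference, its synchronous building block is a block coordinate proximal-gradient step, sketched below on a two-block $\ell_1$-regularized quadratic. The coupling matrix, blocks, and step size are illustrative, and this is not the paper's implementation.

```python
import numpy as np

# Synchronous sketch of the block coordinate proximal-gradient update that
# SAPALM runs asynchronously: pick a random coordinate block, take a gradient
# step on the smooth part restricted to that block, then apply the prox of the
# nonsmooth regularizer (here lam * ||.||_1). All constants are illustrative.
def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(3)
n = 50
Q = rng.normal(size=(2 * n, 2 * n))
Q = Q.T @ Q / (2 * n)                            # smooth quadratic coupling
c = rng.normal(size=2 * n)
lam = 0.1
step = 0.5 / np.linalg.norm(Q, 2)                # below 1/L for the smooth part

x = np.zeros(2 * n)
blocks = [np.arange(n), np.arange(n, 2 * n)]
for it in range(2000):
    blk = blocks[rng.integers(2)]                # sample a coordinate block
    grad_blk = (Q @ x + c)[blk]                  # block gradient of 0.5*x'Qx + c'x
    x[blk] = soft_threshold(x[blk] - step * grad_blk, step * lam)
```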

A SMART Stochastic Algorithm for Nonconvex Optimization with Applications to Robust Machine Learning

no code implementations • 4 Oct 2016 • Aleksandr Aravkin, Damek Davis

In this paper, we show how to transform any optimization problem that arises from fitting a machine learning model into one that (1) detects and removes contaminated data from the training set while (2) simultaneously fitting the trimmed model on the uncontaminated data that remains.

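As a rough illustration of the trimming idea (a simple alternating heuristic for exposition, not the paper's SMART algorithm), one can alternate between fitting on the samples that currently look clean and re-selecting which samples those are:

```python
import numpy as np

# Toy trimmed fitting for robust linear regression: alternate between
# (1) selecting the h samples with the smallest residuals and (2) refitting
# least squares on that subset. A heuristic sketch for exposition only;
# dimensions, corruption level, and trimming fraction are illustrative.
rng = np.random.default_rng(4)
n, d = 200, 5
X = rng.normal(size=(n, d))
beta_star = rng.normal(size=d)
y = X @ beta_star + 0.1 * rng.normal(size=n)
outliers = rng.random(n) < 0.2
y[outliers] += 20 * rng.normal(size=outliers.sum())   # contaminated labels

h = int(0.7 * n)                                      # samples treated as clean
beta = np.linalg.lstsq(X, y, rcond=None)[0]           # ordinary least squares start
for _ in range(20):
    resid = (X @ beta - y) ** 2
    keep = np.argsort(resid)[:h]                      # current "clean" subset
    beta = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]

print("error of the trimmed fit:", np.linalg.norm(beta - beta_star))
```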

Beating level-set methods for 3D seismic data interpolation: a primal-dual alternating approach

no code implementations • 9 Jul 2016 • Rajiv Kumar, Oscar López, Damek Davis, Aleksandr Y. Aravkin, Felix J. Herrmann

Acquisition cost is a crucial bottleneck for seismic workflows, and low-rank formulations for data interpolation allow practitioners to 'fill in' data volumes from critically subsampled data acquired in the field.

Multi-View Feature Engineering and Learning

no code implementations • CVPR 2015 • Jingming Dong, Nikolaos Karianakis, Damek Davis, Joshua Hernandez, Jonathan Balzer, Stefano Soatto

We frame the problem of local representation of imaging data as the computation of minimal sufficient statistics that are invariant to nuisance variability induced by viewpoint and illumination.

Feature Engineering

An $O(n\log(n))$ Algorithm for Projecting Onto the Ordered Weighted $\ell_1$ Norm Ball

no code implementations • 5 May 2015 • Damek Davis

The ordered weighted $\ell_1$ (OWL) norm is a newly developed generalization of the Octagonal Shrinkage and Clustering Algorithm for Regression (OSCAR) norm.

Clustering, Regression
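
For context, the OWL norm pairs a nonincreasing, nonnegative weight vector with the sorted magnitudes of its argument. The snippet below only evaluates the norm; the $O(n\log(n))$ projection onto the OWL ball is the paper's contribution and is not reproduced here.

```python
import numpy as np

# Ordered weighted l1 (OWL) norm: sort |x| in decreasing order and take the
# inner product with a nonincreasing, nonnegative weight vector w.
# OSCAR corresponds to the particular choice w_i = lambda1 + lambda2 * (n - i).
def owl_norm(x, w):
    x_sorted = np.sort(np.abs(x))[::-1]     # magnitudes in decreasing order
    return float(np.dot(w, x_sorted))

x = np.array([0.5, -3.0, 2.0, 0.0])
w = np.array([4.0, 3.0, 2.0, 1.0])          # illustrative weights
print(owl_norm(x, w))                        # 4*3 + 3*2 + 2*0.5 + 1*0 = 19.0
```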

Asymmetric Sparse Kernel Approximations for Large-scale Visual Search

no code implementations • CVPR 2014 • Damek Davis, Jonathan Balzer, Stefano Soatto

We introduce an asymmetric sparse approximate embedding optimized for fast kernel comparison operations arising in large-scale visual search.

Image Retrieval, Retrieval

On the Design and Analysis of Multiple View Descriptors

no code implementations • 23 Nov 2013 • Jingming Dong, Jonathan Balzer, Damek Davis, Joshua Hernandez, Stefano Soatto

We propose an extension of popular descriptors based on gradient orientation histograms (HOG, computed in a single image) to multiple views.

