Search Results for author: Jingwei Liang

Found 9 papers, 5 papers with code

Screening for Sparse Online Learning

1 code implementation • 18 Jan 2021 • Jingwei Liang, Clarice Poon

In the realm of deterministic optimization, the sequences generated by iterative algorithms (such as proximal gradient descent) exhibit "finite activity identification", namely, they can identify the low-complexity structure in a finite number of iterations.
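
As a hedged illustration of this property (a minimal sketch, not the screening rule proposed in the paper), the following Python snippet runs plain proximal gradient descent (ISTA) on a synthetic LASSO problem and prints the support of the iterates; the problem sizes, regularization weight lam and step size are illustrative assumptions.

# Minimal sketch: ISTA on a synthetic LASSO problem. After finitely many
# iterations the set of nonzero coordinates of x stops changing, which is the
# "finite activity identification" referred to above.
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 50, 20, 0.5
A = rng.standard_normal((n, p))
x_true = np.zeros(p)
x_true[:3] = [1.0, -2.0, 1.5]                      # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(n)

step = 1.0 / np.linalg.norm(A, 2) ** 2             # 1/L, with L the gradient Lipschitz constant
x = np.zeros(p)
for k in range(300):
    z = x - step * (A.T @ (A @ x - b))             # forward (gradient) step
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # backward (soft-threshold) step
    if k % 50 == 0:
        print(k, np.flatnonzero(x))                # the printed support settles after finitely many steps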

TFPnP: Tuning-free Plug-and-Play Proximal Algorithm with Applications to Inverse Imaging Problems

1 code implementation • 18 Nov 2020 • Kaixuan Wei, Angelica Aviles-Rivero, Jingwei Liang, Ying Fu, Hua Huang, Carola-Bibiane Schönlieb

In this work, we present a class of tuning-free PnP proximal algorithms that automatically determine internal parameters such as the denoising strength, the termination time, and other optimization-specific parameters.
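
To make the role of those parameters concrete, here is a generic plug-and-play proximal-gradient loop (a sketch only; denoiser and policy are hypothetical placeholders, not the paper's learned policy network), showing where the denoising strength, step size and stopping decision enter the iteration.

# Generic PnP proximal-gradient loop: the denoiser stands in for the proximal
# operator, and `policy` supplies the quantities a tuning-free method would
# have to choose automatically (step size, denoising strength, when to stop).
import numpy as np

def pnp_proximal_gradient(x0, grad_data_fit, denoiser, policy, max_iter=50):
    x = x0
    for k in range(max_iter):
        step, sigma, stop = policy(x, k)                  # per-iteration parameters
        x = denoiser(x - step * grad_data_fit(x), sigma)  # forward step, then "denoising prox"
        if stop:
            break
    return x

# Toy usage with stand-in components (crude shrinkage "denoiser", fixed policy).
y = np.ones(16)                                           # fake measurements
x_hat = pnp_proximal_gradient(
    x0=np.zeros(16),
    grad_data_fit=lambda x: x - y,                        # gradient of 0.5 * ||x - y||^2
    denoiser=lambda z, sigma: z / (1.0 + sigma),          # placeholder denoiser
    policy=lambda x, k: (0.5, 0.1, k >= 30),              # fixed step, sigma, stop rule
)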

Denoising • Retrieval

SPRING: A fast stochastic proximal alternating method for non-smooth non-convex optimization

no code implementations • 27 Feb 2020 • Derek Driggs, Junqi Tang, Jingwei Liang, Mike Davies, Carola-Bibiane Schönlieb

We introduce SPRING, a novel stochastic proximal alternating linearized minimization algorithm for solving a class of non-smooth and non-convex optimization problems.
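
Schematically (generic notation, not quoted from the paper), for an objective of the form $\Phi(x, y) = f(x) + g(y) + H(x, y)$ with $f, g$ possibly non-smooth and non-convex, a PALM-type alternating step with the partial gradients of $H$ replaced by stochastic estimators $\widetilde{\nabla}$ reads

$$x_{k+1} = \operatorname{prox}_{\gamma_k f}\big(x_k - \gamma_k \widetilde{\nabla}_x H(x_k, y_k)\big), \qquad y_{k+1} = \operatorname{prox}_{\delta_k g}\big(y_k - \delta_k \widetilde{\nabla}_y H(x_{k+1}, y_k)\big),$$

where variance-reduced estimators (e.g. SAGA- or SVRG-style) can be used in place of plain stochastic gradients.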

Image Deconvolution • Stochastic Optimization • Optimization and Control • 90C26

Tuning-free Plug-and-Play Proximal Algorithm for Inverse Imaging Problems

1 code implementation • ICML 2020 • Kaixuan Wei, Angelica Aviles-Rivero, Jingwei Liang, Ying Fu, Carola-Bibiane Schönlieb, Hua Huang

Moreover, we discuss the practical considerations of the plugged denoisers, which together with our learned policy yield state-of-the-art results.

Denoising • Retrieval

Best Pair Formulation & Accelerated Scheme for Non-convex Principal Component Pursuit

no code implementations • 25 May 2019 • Aritra Dutta, Filip Hanzely, Jingwei Liang, Peter Richtárik

The best pair problem aims to find a pair of points, one from each of two disjoint sets, that minimizes the distance between the sets.
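
In symbols (a standard way to state the problem rather than a quote from the paper), given two nonempty disjoint sets $\mathcal{A}$ and $\mathcal{B}$, the best pair problem is

$$\min_{x \in \mathcal{A},\; y \in \mathcal{B}} \; \|x - y\|,$$

and when one of the sets is a single point it reduces to the classical best approximation problem.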

Local Convergence Properties of SAGA/Prox-SVRG and Acceleration

1 code implementation • ICML 2018 • Clarice Poon, Jingwei Liang, Carola Schoenlieb

In this paper, we present a local convergence analysis for a class of stochastic optimisation methods, the proximal variance-reduced stochastic gradient methods, focusing mainly on SAGA (Defazio et al., 2014) and Prox-SVRG (Xiao & Zhang, 2014).
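
For context, written in standard notation rather than the paper's, a proximal SAGA step for $\min_x \tfrac{1}{n}\sum_{i=1}^{n} f_i(x) + g(x)$ samples an index $i_k$ and updates

$$x_{k+1} = \operatorname{prox}_{\gamma g}\Big(x_k - \gamma\big(\nabla f_{i_k}(x_k) - \nabla f_{i_k}(\varphi_k^{i_k}) + \tfrac{1}{n}\sum_{j=1}^{n} \nabla f_j(\varphi_k^{j})\big)\Big),$$

where the $\varphi_k^{j}$ are stored past iterates; Prox-SVRG follows the same template with the stored points replaced by a periodically refreshed snapshot.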

A Multi-step Inertial Forward-Backward Splitting Method for Non-convex Optimization

no code implementations • NeurIPS 2016 • Jingwei Liang, Jalal Fadili, Gabriel Peyré

In this paper, we propose a multi-step inertial Forward--Backward splitting algorithm for minimizing the sum of two functions, neither of which need be convex: one is proper and lower semi-continuous while the other is differentiable with a Lipschitz continuous gradient.
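
Schematically (generic notation, not quoted from the paper), an $s$-step inertial forward-backward step for $\min_x f(x) + g(x)$, with $f$ the smooth term and $g$ the proper lower semi-continuous term, extrapolates along the last $s$ displacements before the usual update:

$$y_k = x_k + \sum_{i=1}^{s} a_{i,k}\,(x_{k+1-i} - x_{k-i}), \qquad x_{k+1} = \operatorname{prox}_{\gamma_k g}\big(y_k - \gamma_k \nabla f(y_k)\big),$$

and with $s = 1$ this recovers the familiar single-step inertial forward-backward scheme.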

Local Linear Convergence of Forward--Backward under Partial Smoothness

no code implementations • NeurIPS 2014 • Jingwei Liang, Jalal Fadili, Gabriel Peyré

In this paper, we consider the Forward--Backward proximal splitting algorithm for minimizing the sum of two proper closed convex functions, one of which has a Lipschitz continuous gradient and the other of which is partly smooth relative to an active manifold $\mathcal{M}$.
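
For reference, with $f$ denoting the smooth term and $g$ the partly smooth term, the Forward--Backward (proximal gradient) iteration for $\min_x f(x) + g(x)$ is

$$x_{k+1} = \operatorname{prox}_{\gamma_k g}\big(x_k - \gamma_k \nabla f(x_k)\big), \qquad \operatorname{prox}_{\gamma g}(z) = \operatorname*{arg\,min}_{w} \; g(w) + \tfrac{1}{2\gamma}\|w - z\|^2.$$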
