1 code implementation • 10 Feb 2022 • Bingxin Zhou, Yuanhong Jiang, Yu Guang Wang, Jingwei Liang, Junbin Gao, Shirui Pan, Xiaoqun Zhang
The performance of graph representation learning is affected by the quality of the input graph.
1 code implementation • 18 Jan 2021 • Jingwei Liang, Clarice Poon
In the realm of deterministic optimization, the sequences generated by iterative algorithms (such as proximal gradient descent) exhibit "finite activity identification": they identify the low-complexity structure of the solution in a finite number of iterations.
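A minimal sketch of the phenomenon described above (not the paper's algorithm, which is stochastic): proximal gradient descent (ISTA) on a small LASSO problem, $\min_x \tfrac12\|Ax-b\|^2 + \lambda\|x\|_1$. The support (set of nonzero entries) of the iterates stabilizes after finitely many iterations, long before the iterates themselves converge; all problem data below is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.5, 0.0, -2.0, 0.0, 0.0])  # sparse ground truth
b = A @ x_true
lam = 0.5

def soft_threshold(v, t):
    # proximal operator of t*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
x = np.zeros(5)
supports = []
for k in range(200):
    grad = A.T @ (A @ x - b)
    x = soft_threshold(x - step * grad, step * lam)
    supports.append(tuple(np.flatnonzero(x)))

# the support stops changing after finitely many iterations
print(supports[-1])
```

Tracking the support at each iteration makes the "activity identification" visible: after an initial transient, every recorded support is identical.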
1 code implementation • 18 Nov 2020 • Kaixuan Wei, Angelica Aviles-Rivero, Jingwei Liang, Ying Fu, Hua Huang, Carola-Bibiane Schönlieb
In this work, we present a class of tuning-free PnP proximal algorithms that can determine parameters such as denoising strength, termination time, and other optimization-specific parameters automatically.
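A toy illustration of the plug-and-play (PnP) idea the work above builds on, not its tuning-free method: in a proximal algorithm, the proximal operator of the regularizer is replaced by an off-the-shelf denoiser. Here a hand-rolled 3-tap moving average stands in for a learned denoiser, and all parameters are fixed by hand (precisely what the paper automates).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
clean = np.sin(np.linspace(0, 4 * np.pi, n))   # ground-truth signal
y = clean + 0.3 * rng.standard_normal(n)       # noisy observation

def denoise(x):
    # 3-tap moving average acting as the "plugged-in" denoiser
    padded = np.pad(x, 1, mode="edge")
    return (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0

gamma = 0.5                      # step size (illustrative choice)
x = y.copy()
for _ in range(20):
    grad = x - y                 # gradient of the data term 0.5*||x - y||^2
    x = denoise(x - gamma * grad)  # denoiser replaces the proximal step

err_noisy = np.linalg.norm(y - clean)
err_pnp = np.linalg.norm(x - clean)
print(err_noisy, err_pnp)
```

Even with this crude denoiser the reconstruction error drops below that of the raw observation; the paper's contribution is choosing quantities such as the denoising strength and the stopping time automatically rather than by hand.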
no code implementations • 27 Feb 2020 • Derek Driggs, Junqi Tang, Jingwei Liang, Mike Davies, Carola-Bibiane Schönlieb
We introduce SPRING, a novel stochastic proximal alternating linearized minimization algorithm for solving a class of non-smooth and non-convex optimization problems.
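SPRING randomizes the gradient steps of proximal alternating linearized minimization (PALM). The sketch below shows only the deterministic PALM template it builds on, applied to a small non-negative factorization problem $\min_{U,V \ge 0} \tfrac12\|M - UV\|_F^2$, which is non-smooth (indicator of the non-negative orthant) and jointly non-convex in $(U, V)$; replacing the full gradients with variance-reduced stochastic estimates gives the flavour of SPRING. Step sizes are the standard per-block $1/L$ choices; the data is illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
M = np.abs(rng.standard_normal((10, 10)))
r = 3
U = np.abs(rng.standard_normal((10, r)))
V = np.abs(rng.standard_normal((r, 10)))

def obj(U, V):
    return 0.5 * np.linalg.norm(M - U @ V) ** 2

f0 = obj(U, V)
for _ in range(300):
    # block U: gradient step with step 1/L_U, then prox (projection onto >= 0)
    L_U = np.linalg.norm(V @ V.T, 2) + 1e-12
    U = np.maximum(U - (1.0 / L_U) * (U @ V - M) @ V.T, 0.0)
    # block V: same, using the freshly updated U
    L_V = np.linalg.norm(U.T @ U, 2) + 1e-12
    V = np.maximum(V - (1.0 / L_V) * U.T @ (U @ V - M), 0.0)

print(obj(U, V))
```

Each block update is a forward–backward step with step size at most the inverse of that block's Lipschitz constant, so the objective is monotonically non-increasing even though the joint problem is non-convex.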
Image Deconvolution · Stochastic Optimization · Optimization and Control (90C26)
1 code implementation • ICML 2020 • Kaixuan Wei, Angelica Aviles-Rivero, Jingwei Liang, Ying Fu, Carola-Bibiane Schönlieb, Hua Huang
Moreover, we discuss the practical considerations of the plugged denoisers, which together with our learned policy yield state-of-the-art results.
no code implementations • 25 May 2019 • Aritra Dutta, Filip Hanzely, Jingwei Liang, Peter Richtárik
The best pair problem aims to find a pair of points, one in each of two disjoint sets, that minimizes the distance between the sets.
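A hedged illustration of the best pair problem: for closed convex sets, alternating projections converge to a pair of mutually nearest points. Here the two disjoint sets are Euclidean balls (an assumption made for the sketch; the paper treats the problem in far greater generality and with different algorithms).

```python
import numpy as np

def project_ball(p, center, radius):
    # Euclidean projection of p onto the ball B(center, radius)
    d = p - center
    n = np.linalg.norm(d)
    return p if n <= radius else center + radius * d / n

cU, rU = np.array([0.0, 0.0]), 1.0   # set U: unit ball at the origin
cV, rV = np.array([5.0, 0.0]), 1.0   # set V: unit ball at (5, 0)

u = np.array([0.9, 0.3])             # arbitrary starting point in U
for _ in range(100):
    v = project_ball(u, cV, rV)      # closest point in V to u
    u = project_ball(v, cU, rU)      # closest point in U to v

print(np.linalg.norm(u - v))         # distance between the sets: 5 - 1 - 1 = 3
```

The iterates settle on the segment joining the two centers, so the final pair realizes the distance between the sets.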
1 code implementation • ICML 2018 • Clarice Poon, Jingwei Liang, Carola Schoenlieb
In this paper, we present a local convergence analysis for a class of stochastic optimisation methods, the proximal variance-reduced stochastic gradient methods, focusing mainly on SAGA (Defazio et al., 2014) and Prox-SVRG (Xiao & Zhang, 2014).
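A hedged sketch of the proximal SAGA scheme analysed above, on $\min_x \frac1n \sum_i f_i(x) + \lambda\|x\|_1$ with $f_i(x) = \tfrac12 (a_i^\top x - b_i)^2$. The step size is a standard SAGA choice and the data is illustrative; see Defazio et al. (2014) for the method and its guarantees.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 50, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
lam = 0.05

def prox_l1(v, t):
    # proximal operator of t*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

L = max(np.sum(A * A, axis=1))          # max Lipschitz constant of the f_i
gamma = 1.0 / (3.0 * L)                 # a standard SAGA step size

x = np.zeros(d)
table = A * (A @ x - b)[:, None]        # stored per-sample gradients
avg = table.mean(axis=0)
for _ in range(5000):
    j = rng.integers(n)
    g_new = A[j] * (A[j] @ x - b[j])
    v = g_new - table[j] + avg          # variance-reduced gradient estimate
    x = prox_l1(x - gamma * v, gamma * lam)
    avg += (g_new - table[j]) / n       # keep the running mean consistent
    table[j] = g_new
```

Because the variance of the estimate `v` vanishes at the solution, the iterates converge with a fixed step size, unlike plain proximal stochastic gradient descent.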
no code implementations • NeurIPS 2016 • Jingwei Liang, Jalal Fadili, Gabriel Peyré
In this paper, we propose a multi-step inertial Forward--Backward splitting algorithm for minimizing the sum of two not-necessarily-convex functions, one of which is proper and lower semi-continuous while the other is differentiable with a Lipschitz continuous gradient.
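A minimal sketch of the multi-step inertial idea: the forward (gradient) step is taken at an extrapolated point built from the last two increments. The inertial weights below are illustrative choices, not the paper's schedule, and for simplicity the smooth part is a convex quadratic with $g = \lambda\|\cdot\|_1$, although the paper allows non-convex functions.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 8))
b = rng.standard_normal(30)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2    # Lipschitz constant of grad f
gamma = 1.0 / L

def grad_f(x):
    return A.T @ (A @ x - b)

def prox_g(v, t):                # proximal map of t*lam*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

a1, a2 = 0.3, 0.1                # inertial weights (illustrative)
x_prev2 = x_prev = x = np.zeros(8)
for _ in range(500):
    # extrapolate using the last TWO increments ("multi-step" inertia)
    y = x + a1 * (x - x_prev) + a2 * (x_prev - x_prev2)
    x_prev2, x_prev = x_prev, x
    x = prox_g(y - gamma * grad_f(y), gamma)

print(0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
```

With both weights set to zero this reduces to plain Forward--Backward splitting; with `a2 = 0` it reduces to the familiar single-step inertial (FISTA-like) extrapolation.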
no code implementations • NeurIPS 2014 • Jingwei Liang, Jalal Fadili, Gabriel Peyré
In this paper, we consider the Forward--Backward proximal splitting algorithm to minimize the sum of two proper closed convex functions, one of which has a Lipschitz continuous gradient while the other is partly smooth relative to an active manifold $\mathcal{M}$.
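For reference, the Forward--Backward iteration in question applies, for a smooth term $f$ with $L$-Lipschitz gradient and a non-smooth term $g$ (the paper's own notation may differ),

$$x_{k+1} = \mathrm{prox}_{\gamma g}\big(x_k - \gamma \nabla f(x_k)\big), \qquad \gamma \in (0,\, 2/L),$$

and partial smoothness of $g$ relative to $\mathcal{M}$ is what allows the iterates to land on, and remain on, the manifold after finitely many steps.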