no code implementations • NeurIPS 2020 • Quoc Tran-Dinh, Deyi Liu, Lam M. Nguyen
We develop a novel single-loop variance-reduced algorithm to solve a class of stochastic nonconvex-convex minimax problems with a nonconvex-linear objective function, a structure that arises in applications such as machine learning and robust optimization.
no code implementations • 20 Aug 2020 • Deyi Liu, Lam M. Nguyen, Quoc Tran-Dinh
In this note we propose a new variant of the hybrid variance-reduced proximal gradient method in [7] to solve a common stochastic composite nonconvex optimization problem under standard assumptions.
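For intuition, the hybrid estimator behind this line of work mixes a fresh stochastic gradient with a SARAH-style recursive correction before taking a proximal step. Below is a minimal NumPy sketch of one plausible such iteration; the least-squares-plus-ℓ1 toy problem and the constants `lam`, `eta`, and `beta` are illustrative assumptions, not the note's exact setting or parameter choices.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def hybrid_vr_prox_grad(A, b, lam=0.1, eta=0.05, beta=0.5, iters=200, seed=0):
    """Sketch of a hybrid variance-reduced proximal gradient loop on
        min_x  E_i[ 0.5 * (a_i^T x - b_i)^2 ] + lam * ||x||_1.
    The gradient estimator blends a fresh stochastic gradient with a
    SARAH-style recursive correction, then applies a proximal step."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x_prev = np.zeros(d)
    x = np.zeros(d)
    i0 = rng.integers(n)
    v = A[i0] * (A[i0] @ x - b[i0])                    # initial estimator
    for _ in range(iters):
        i, j = rng.integers(n), rng.integers(n)        # two independent samples
        g_new = A[i] * (A[i] @ x - b[i])               # plain stochastic gradient
        sarah = v + A[j] * (A[j] @ x - b[j]) - A[j] * (A[j] @ x_prev - b[j])
        v = beta * g_new + (1.0 - beta) * sarah        # hybrid estimator
        x_prev, x = x, soft_threshold(x - eta * v, eta * lam)
    return x
```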
2 code implementations • ICLR 2021 • Zhenlin Xu, Deyi Liu, Junlin Yang, Colin Raffel, Marc Niethammer
In this work, we show that the robustness of neural networks can be greatly improved through the use of random convolutions as data augmentation.
Ranked #117 on Domain Generalization on PACS
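A minimal PyTorch sketch of random-convolution augmentation in this spirit is below; the depthwise kernel sampling, the normalization, and the blending probability `mix_prob` are illustrative assumptions rather than the paper's exact RandConv configuration.

```python
import torch
import torch.nn.functional as F

def random_conv_augment(images, kernel_size=3, mix_prob=0.5):
    """Filter a batch of images (N, C, H, W) with a freshly sampled random
    convolution kernel, preserving the spatial shape; optionally blend the
    result back with the clean input to vary augmentation strength."""
    n, c, h, w = images.shape
    # one random kernel applied depthwise to every channel
    weight = torch.randn(c, 1, kernel_size, kernel_size)
    weight = weight / weight.abs().sum(dim=(1, 2, 3), keepdim=True)
    augmented = F.conv2d(images, weight, padding=kernel_size // 2, groups=c)
    if torch.rand(1).item() < mix_prob:
        alpha = torch.rand(1).item()
        augmented = alpha * images + (1.0 - alpha) * augmented
    return augmented

# usage: batch = random_conv_augment(torch.rand(8, 3, 32, 32))
```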
no code implementations • NeurIPS 2020 • Quoc Tran-Dinh, Deyi Liu, Lam M. Nguyen
This problem class poses several computational challenges due to the nonsmoothness, nonconvexity, nonlinearity, and non-separability of its objective functions.
no code implementations • 3 Mar 2020 • Quoc Tran-Dinh, Deyi Liu
We develop a novel unified randomized block-coordinate primal-dual algorithm to solve a class of nonsmooth constrained convex optimization problems, which covers different existing variants and model settings from the literature.
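As a rough illustration of the ingredients involved, here is a generic randomized block-coordinate primal-dual (Chambolle-Pock-style) sketch for an ℓ1 objective under an affine constraint; the specific problem, the step sizes `tau` and `sigma`, and the single-block sampling rule are assumptions for illustration, not the unified algorithm proposed in the paper.

```python
import numpy as np

def rand_block_primal_dual(K, b, lam=0.1, tau=0.01, sigma=0.01,
                           n_blocks=4, iters=500, seed=0):
    """Generic randomized block-coordinate primal-dual sketch for
        min_x  lam * ||x||_1   subject to   K x = b.
    Each iteration updates one randomly chosen primal block via a proximal
    step, then takes a dual ascent step on the affine constraint."""
    rng = np.random.default_rng(seed)
    m, d = K.shape
    blocks = np.array_split(np.arange(d), n_blocks)
    x = np.zeros(d)
    y = np.zeros(m)
    for _ in range(iters):
        idx = blocks[rng.integers(n_blocks)]       # pick one primal block
        x_old = x.copy()
        z = x[idx] - tau * (K[:, idx].T @ y)       # partial gradient step
        x[idx] = np.sign(z) * np.maximum(np.abs(z) - tau * lam, 0.0)
        x_bar = 2.0 * x - x_old                    # extrapolation
        y = y + sigma * (K @ x_bar - b)            # dual step for the constraint
    return x, y
```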
1 code implementation • 17 Feb 2020 • Deyi Liu, Volkan Cevher, Quoc Tran-Dinh
We demonstrate how to scalably solve a class of constrained self-concordant minimization problems using only linear minimization oracles (LMOs) over the constraint set.
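To illustrate what an LMO-based method looks like, below is a plain Frank-Wolfe sketch on a simple self-concordant objective over the unit simplex; the portfolio-style objective, the simplex constraint, and the open-loop step size are illustrative choices, and the sketch does not reproduce the paper's actual algorithm.

```python
import numpy as np

def frank_wolfe_simplex(A, iters=100):
    """Plain Frank-Wolfe on the self-concordant objective
        f(x) = -sum_i log(a_i^T x)   over the unit simplex,
    using only a linear minimization oracle (LMO) over the constraint set.
    For the simplex, the LMO simply returns the best vertex e_j."""
    n, d = A.shape
    x = np.full(d, 1.0 / d)                 # start at the simplex center
    for k in range(iters):
        grad = -A.T @ (1.0 / (A @ x))       # gradient of -sum_i log(a_i^T x)
        j = int(np.argmin(grad))            # LMO: vertex minimizing <grad, s>
        s = np.zeros(d); s[j] = 1.0
        gamma = 2.0 / (k + 2.0)             # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x

# usage: weights = frank_wolfe_simplex(np.random.rand(50, 10) + 0.1)
```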