no code implementations • 15 Apr 2024 • Nachuan Xiao, Kuangyu Ding, Xiaoyin Hu, Kim-Chuan Toh
Preliminary numerical experiments on deep learning tasks illustrate that our proposed framework yields efficient variants of Lagrangian-based methods with convergence guarantees for nonconvex nonsmooth constrained optimization problems.
no code implementations • 8 Feb 2024 • Ling Liang, Kim-Chuan Toh, Jia-Jie Zhu
The Halpern iteration for solving monotone inclusion problems has gained increasing interest in recent years due to its simple form and appealing convergence properties.
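For context, the classical Halpern iteration anchors each step back toward the starting point with a vanishing weight. A minimal generic sketch for a nonexpansive operator T, using the standard anchoring weights 1/(k+2) (an illustration of the classical scheme, not the paper's specific method):

```python
import numpy as np

def halpern(T, x0, num_iters=100):
    """Halpern iteration: x_{k+1} = a_k * x0 + (1 - a_k) * T(x_k),
    with the standard anchoring weights a_k = 1 / (k + 2)."""
    x = x0.copy()
    for k in range(num_iters):
        a = 1.0 / (k + 2)
        x = a * x0 + (1 - a) * T(x)
    return x
```

For a contraction such as T(x) = 0.5 * x, the iterates are driven toward the unique fixed point at the origin at the scheme's characteristic O(1/k) rate.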
no code implementations • 21 Dec 2023 • Anh Duc Nguyen, Tuan Dung Nguyen, Quang Minh Nguyen, Hoang H. Nguyen, Lam M. Nguyen, Kim-Chuan Toh
This paper studies the Partial Optimal Transport (POT) problem between two unbalanced measures with at most $n$ supports and its applications in various AI tasks such as color transfer or domain adaptation.
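The partial and unbalanced variants studied in this line of work relax the marginal constraints of the balanced entropic optimal transport problem, which is classically solved by Sinkhorn scaling. A minimal balanced Sinkhorn sketch for reference (generic background, not the paper's algorithm):

```python
import numpy as np

def sinkhorn(C, a, b, reg=0.1, num_iters=200):
    """Balanced entropic OT via Sinkhorn scaling iterations.
    C: cost matrix; a, b: source/target marginals summing to one.
    Partial/unbalanced OT relaxes the constraints on a and b."""
    K = np.exp(-C / reg)          # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(num_iters):
        v = b / (K.T @ u)         # match column marginals
        u = a / (K @ v)           # match row marginals
    return u[:, None] * K * v[None, :]  # transport plan
```

The returned plan's row sums equal a exactly (the last update enforces them), while the column sums converge to b as the iterations proceed.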
no code implementations • 13 Oct 2023 • Kuangyu Ding, Nachuan Xiao, Kim-Chuan Toh
As a practical application of our proposed framework, we propose a novel Adam-family method named Adam with Decoupled Weight Decay (AdamD), and establish its convergence properties under mild conditions.
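The paper's AdamD updates are not reproduced here, but the underlying idea of decoupling weight decay from the adaptive gradient step (popularized by AdamW) can be sketched in a single update, with the decay applied directly to the weights rather than folded into the gradient moments (a generic illustration; all hyperparameter values below are conventional defaults, not the paper's):

```python
import numpy as np

def adam_decoupled_step(w, grad, m, v, t, lr=1e-3, beta1=0.9,
                        beta2=0.999, eps=1e-8, weight_decay=1e-2):
    """One Adam step with decoupled weight decay: the decay term
    -lr * weight_decay * w acts on the weights directly and never
    enters the first/second moment estimates m and v."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)          # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps) - lr * weight_decay * w
    return w, m, v
```

Keeping the decay out of the moments is what distinguishes this family from plain Adam applied to an L2-regularized loss.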
no code implementations • 19 Jul 2023 • Nachuan Xiao, Xiaoyin Hu, Kim-Chuan Toh
In this paper, we investigate the convergence properties of the stochastic gradient descent (SGD) method and its variants, especially in training neural networks built from nonsmooth activation functions.
no code implementations • 26 Jun 2023 • Kuangyu Ding, Jingyang Li, Kim-Chuan Toh
Experimental results on representative benchmarks demonstrate the effectiveness and robustness of MSBPG in training neural networks.
no code implementations • 6 May 2023 • Nachuan Xiao, Xiaoyin Hu, Xin Liu, Kim-Chuan Toh
In this paper, we present a comprehensive study on the convergence properties of Adam-family methods for nonsmooth optimization, especially in the training of nonsmooth neural networks.
no code implementations • 13 Sep 2022 • Ngoc Hoang Anh Mai, Victor Magron, Jean-Bernard Lasserre, Kim-Chuan Toh
We consider polynomial optimization problems (POP) on a semialgebraic set contained in the nonnegative orthant (every POP on a compact set can be put in this format by a simple translation of the origin).
no code implementations • 29 Apr 2022 • Ching-pei Lee, Ling Liang, Tianyun Tang, Kim-Chuan Toh
This work proposes a rapid algorithm, BM-Global, for nuclear-norm-regularized convex and low-rank matrix optimization problems.
1 code implementation • 28 May 2021 • Heng Yang, Ling Liang, Luca Carlone, Kim-Chuan Toh
In particular, we first design a globally convergent inexact projected gradient method (iPGM) for solving the SDP that serves as the backbone of our framework.
no code implementations • 22 Oct 2020 • Yangjing Zhang, Kim-Chuan Toh, Defeng Sun
We consider the problem of learning a graph under the Laplacian constraint with a nonconvex penalty: the minimax concave penalty (MCP).
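The MCP itself has a simple closed form: it grows like the l1 penalty near zero but flattens to a constant once the argument exceeds a threshold, which reduces the bias of l1 on large entries. A direct implementation of the standard definition:

```python
import numpy as np

def mcp(t, lam, gamma):
    """Minimax concave penalty (MCP):
    p(t) = lam*|t| - t^2/(2*gamma)   if |t| <= gamma*lam,
    p(t) = gamma*lam^2/2             otherwise.
    Concave in |t|; constant beyond the threshold gamma*lam."""
    t = np.abs(t)
    return np.where(t <= gamma * lam,
                    lam * t - t ** 2 / (2 * gamma),
                    0.5 * gamma * lam ** 2)
```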
no code implementations • 17 Apr 2020 • Meixia Lin, Defeng Sun, Kim-Chuan Toh, Chengjing Wang
The sparsity and clustering structure of the concentration matrix is enforced to reduce model complexity and describe inherent regularities.
no code implementations • 26 Feb 2020 • Meixia Lin, Defeng Sun, Kim-Chuan Toh
We prove that the least squares estimator is computable via solving a constrained convex quadratic programming (QP) problem with $(n+1)d$ variables and at least $n(n-1)$ linear inequality constraints, where $n$ is the number of data points.
no code implementations • 27 Mar 2019 • Peipei Tang, Chengjing Wang, Defeng Sun, Kim-Chuan Toh
In this paper, we consider high-dimensional nonconvex square-root-loss regression problems and introduce a proximal majorization-minimization (PMM) algorithm for these problems.
no code implementations • 1 Feb 2019 • Meixia Lin, Defeng Sun, Kim-Chuan Toh, Yancheng Yuan
In addition, we derive the corresponding HS-Jacobian of the proximal mapping and analyze its structure, which plays an essential role in the efficient computation of the PPA subproblem via applying a semismooth Newton method on its dual.
no code implementations • 4 Oct 2018 • Defeng Sun, Kim-Chuan Toh, Yancheng Yuan
The perfect recovery properties of the convex clustering model with uniformly weighted regularization over all pairwise differences have been proved by Zhu et al. (2014) and Panahi et al. (2017).
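The uniformly weighted convex clustering model minimizes a least-squares fit plus a sum-of-norms penalty on all pairwise differences of the cluster representatives. A minimal statement of the objective (an illustration of the standard formulation; no solver included):

```python
import numpy as np
from itertools import combinations

def convex_clustering_objective(X, A, gamma):
    """Uniformly weighted convex clustering objective:
    0.5 * sum_i ||x_i - a_i||^2 + gamma * sum_{i<j} ||x_i - x_j||,
    where the rows of A are the data points and the rows of X are
    the cluster representatives being optimized."""
    fit = 0.5 * np.sum((X - A) ** 2)
    reg = sum(np.linalg.norm(X[i] - X[j])
              for i, j in combinations(range(X.shape[0]), 2))
    return fit + gamma * reg
```

As gamma grows, the unsquared norms pull representatives to coincide exactly, which is what produces the clustering.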
no code implementations • 12 Sep 2018 • Lei Yang, Jia Li, Defeng Sun, Kim-Chuan Toh
When the support points of the barycenter are pre-specified, this problem can be modeled as a linear programming (LP) problem whose size can be extremely large.
no code implementations • 22 Aug 2018 • Meixia Lin, Yong-Jin Liu, Defeng Sun, Kim-Chuan Toh
Based on the new formulation, we derive an efficient procedure for its computation.
no code implementations • ICML 2018 • Yancheng Yuan, Defeng Sun, Kim-Chuan Toh
Clustering is perhaps the most fundamental problem in unsupervised learning, and it remains an active area of machine learning research because of its importance in many applications.
no code implementations • 24 Sep 2016 • Ethan X. Fang, Han Liu, Kim-Chuan Toh, Wen-Xin Zhou
This paper studies the matrix completion problem under arbitrary sampling schemes.
no code implementations • CVPR 2016 • Zhuwen Li, Shuoguang Yang, Loong-Fah Cheong, Kim-Chuan Toh
Estimating the number of clusters remains a difficult model selection problem.
no code implementations • 6 Sep 2013 • Yu-Xiang Wang, Choon Meng Lee, Loong-Fah Cheong, Kim-Chuan Toh
Low-rank matrix completion is a problem of immense practical importance.
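A standard baseline for the nuclear-norm-regularized matrix completion problems appearing in the entries above is Soft-Impute, which alternates between filling in the missing entries with the current estimate and soft-thresholding the singular values (a generic sketch of that classical scheme, not the specific algorithms of these papers):

```python
import numpy as np

def soft_impute(M, mask, lam=0.1, num_iters=200):
    """Soft-Impute for matrix completion: alternate between filling
    unobserved entries (mask == 0) with the current estimate and
    applying singular-value soft-thresholding, the proximal map of
    the nuclear norm."""
    X = np.zeros_like(M)
    for _ in range(num_iters):
        Y = mask * M + (1 - mask) * X              # impute missing entries
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - lam, 0.0)) @ Vt    # shrink singular values
    return X
```

On a low-rank matrix with a few missing entries and a small shrinkage parameter, the iteration recovers the matrix up to the soft-thresholding bias.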