no code implementations • 22 Nov 2023 • Chuan He, Zhaosong Lu
In this paper we consider a nonconvex unconstrained optimization problem minimizing a twice differentiable objective function with Hölder continuous Hessian.
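For reference, the Hölder continuity condition on the Hessian referred to here is the standard one (the exponent $\nu$ and constant $H_\nu$ are generic symbols, not taken from the paper):

```latex
% Hoelder continuous Hessian: for some \nu \in (0,1] and H_\nu > 0,
\|\nabla^2 f(x) - \nabla^2 f(y)\| \le H_\nu \,\|x - y\|^{\nu}
\quad \text{for all } x, y \in \mathbb{R}^n.
```

The case $\nu = 1$ recovers the usual Lipschitz continuous Hessian.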
no code implementations • 10 Jan 2023 • Chuan He, Heng Huang, Zhaosong Lu
In this paper we consider finding an approximate second-order stationary point (SOSP) of general nonconvex conic optimization that minimizes a twice differentiable function subject to nonlinear equality constraints and a convex conic constraint.
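In symbols, the problem class described here is (the names $f$, $c$, and $\mathcal{K}$ are generic, not the paper's notation):

```latex
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{s.t.} \quad c(x) = 0, \;\; x \in \mathcal{K},
```

with $f$ twice differentiable, $c$ a smooth map encoding the nonlinear equality constraints, and $\mathcal{K}$ a closed convex cone.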
no code implementations • 9 Jan 2023 • Chuan He, Zhaosong Lu, Ting Kei Pong
In particular, we first propose a new Newton-CG method for finding an approximate SOSP of unconstrained optimization and show that it enjoys a substantially better complexity than the Newton-CG method [56].
no code implementations • 5 Jan 2023 • Zhaosong Lu, Sanyou Mei
In this paper we study a class of constrained minimax problems.
no code implementations • 4 Jan 2023 • Zhaosong Lu, Sanyou Mei
Under suitable assumptions, operation complexities of $O(\varepsilon^{-4}\log\varepsilon^{-1})$ and $O(\varepsilon^{-7}\log\varepsilon^{-1})$, measured by their fundamental operations, are established for the proposed penalty methods for finding an $\varepsilon$-KKT solution of the unconstrained and constrained bilevel optimization problems, respectively.
no code implementations • 12 Jul 2022 • Chuan He, Zhaosong Lu
In this paper we consider finding an approximate second-order stationary point (SOSP) of nonconvex conic optimization that minimizes a twice differentiable function over the intersection of an affine subspace and a convex cone.
no code implementations • 2 Jun 2022 • Zhaosong Lu, Sanyou Mei
In particular, we first consider unconstrained convex optimization with LLCG and propose accelerated proximal gradient (APG) methods for solving it.
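As a rough illustration of the APG template, here is a minimal FISTA-style sketch with backtracking, not the paper's exact method. Assuming LLCG stands for a locally Lipschitz continuous gradient, a backtracking estimate of a local Lipschitz constant stands in for the global constant such objectives lack; all names are generic.

```python
import numpy as np

def apg(f, grad_f, prox_g, x0, L0=1.0, eta=2.0, iters=200):
    """FISTA-style accelerated proximal gradient with backtracking
    for min f(x) + g(x), f smooth (locally Lipschitz gradient),
    g proximable. A generic sketch, not the paper's exact scheme."""
    x, y, t, L = x0.copy(), x0.copy(), 1.0, L0
    for _ in range(iters):
        g = grad_f(y)
        # Backtracking: grow L until the quadratic upper bound holds at y.
        while True:
            x_new = prox_g(y - g / L, 1.0 / L)
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        # Nesterov momentum step.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

For instance, with $g = \lambda\|\cdot\|_1$ one would pass `prox_g = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - s * lam, 0.0)`.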
no code implementations • 2 Jun 2022 • Zhaosong Lu, Sanyou Mei
In particular, we first propose a primal-dual extrapolation (PDE) method for solving a structured strongly MI problem, obtained by modifying the classical forward-backward splitting method with a point and operator extrapolation technique whose parameters are adaptively updated by a backtracking line search scheme.
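A rough sketch of the ingredients named in this sentence, with fixed extrapolation parameters standing in for the adaptive backtracking scheme (all names are generic assumptions; this is not the paper's exact update):

```python
import numpy as np

def pde_sketch(F, prox_g, z0, step=0.1, alpha=0.5, beta=0.5, iters=500):
    """Forward-backward splitting with point and operator extrapolation
    for a monotone inclusion 0 in F(z) + dg(z). Illustrative only: the
    actual PDE method updates alpha, beta, step adaptively via a
    backtracking line search, which is omitted here."""
    z_prev, z = z0.copy(), z0.copy()
    Fz_prev = F(z0)
    for _ in range(iters):
        y = z + alpha * (z - z_prev)        # point extrapolation
        Fz = F(z)
        G = Fz + beta * (Fz - Fz_prev)      # operator extrapolation
        z_prev, Fz_prev = z, Fz
        z = prox_g(y - step * G, step)      # backward (proximal) step
    return z
```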
no code implementations • 10 Oct 2016 • Feihu Huang, Songcan Chen, Zhaosong Lu
Specifically, the first class, called the nonconvex stochastic variance reduced gradient ADMM (SVRG-ADMM), uses a multi-stage scheme to progressively reduce the variance of stochastic gradients.
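A minimal sketch of that multi-stage variance reduction scheme, with plain gradient steps standing in for the ADMM subproblem updates that SVRG-ADMM actually uses (the names and the stage length `m` are generic assumptions):

```python
import numpy as np

def svrg_multistage(grad_i, full_grad, x0, n, step=0.01, stages=20, m=100, rng=None):
    """Multi-stage SVRG loop: each stage recomputes a full gradient at a
    snapshot, then takes m variance-reduced stochastic steps.

    grad_i(x, i): gradient of the i-th component function;
    full_grad(x): full gradient; n: number of components."""
    if rng is None:
        rng = np.random.default_rng(0)
    x = x0.copy()
    for _ in range(stages):
        snapshot = x.copy()
        mu = full_grad(snapshot)            # one full gradient per stage
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced estimator (unbiased given the snapshot);
            # SVRG-ADMM plugs this g into its ADMM updates instead.
            g = grad_i(x, i) - grad_i(snapshot, i) + mu
            x = x - step * g
    return x
```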
no code implementations • 1 Jul 2016 • Zhaosong Lu
The CSC minimization is the cornerstone of the path-following interior point methods for solving a broad class of convex optimization problems.
no code implementations • 24 Nov 2015 • Zhaosong Lu, Xiaojun Chen
In this paper we propose some generalized CG (GCG) methods for solving the $\ell_1$-regularized (possibly not strongly) convex QP that terminate at an optimal solution in a finite number of iterations.
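In symbols, the problem class is (generic names; $Q \succeq 0$ is possibly singular, which is why the QP may fail to be strongly convex):

```latex
\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\, x^\top Q x + c^\top x + \lambda \|x\|_1 .
```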
no code implementations • 23 Nov 2015 • Zhaosong Lu, Xiaorui Li
Moreover, for a class of partial regularizers, any global minimizer of these models is a sparsest solution to the linear system.
no code implementations • 29 Sep 2015 • Zhaosong Lu
For this problem, it is known that any accumulation point of the classical projected gradient (PG) method with a constant stepsize $1/L$ satisfies the $L$-stationarity optimality condition that was introduced in [3].
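A minimal sketch of that classical PG iteration with constant stepsize $1/L$ (generic names; `project` denotes the projection onto the feasible set of the problem, which the excerpt does not spell out):

```python
import numpy as np

def projected_gradient(grad_f, project, x0, L, iters=1000):
    """Projected gradient with constant stepsize 1/L. Per the text,
    accumulation points of these iterates satisfy the L-stationarity
    condition of [3] for the problem under consideration."""
    x = x0.copy()
    for _ in range(iters):
        x = project(x - grad_f(x) / L)      # gradient step, then project
    return x
```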
no code implementations • NeurIPS 2014 • Qihang Lin, Zhaosong Lu, Lin Xiao
We develop an accelerated randomized proximal coordinate gradient (APCG) method for solving a broad class of composite convex optimization problems.
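APCG augments randomized proximal coordinate gradient steps with Nesterov-type momentum sequences. A minimal sketch of the non-accelerated coordinate step it builds on, for $\min_x f(x) + \sum_i \Psi_i(x_i)$ (the momentum bookkeeping is omitted, and all names are generic assumptions):

```python
import numpy as np

def rpcg(grad_i, prox_i, L, x0, iters=5000, rng=None):
    """Randomized proximal coordinate gradient: the core step that APCG
    accelerates with Nesterov-type momentum (omitted in this sketch).

    grad_i(x, i): i-th partial derivative of f;
    prox_i(v, step, i): prox operator of Psi_i;
    L[i]: coordinate-wise Lipschitz constant of grad f."""
    if rng is None:
        rng = np.random.default_rng(0)
    x = x0.copy()
    n = x.size
    for _ in range(iters):
        i = rng.integers(n)                 # pick a random coordinate
        x[i] = prox_i(x[i] - grad_i(x, i) / L[i], 1.0 / L[i], i)
    return x
```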
no code implementations • 9 Sep 2014 • Xiaojun Chen, Zhaosong Lu, Ting Kei Pong
We consider a class of constrained optimization problems with a possibly nonconvex non-Lipschitz objective and a convex feasible set being the intersection of a polyhedron and a possibly degenerate ellipsoid.
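In symbols, the feasible set described is (generic names; $M \succeq 0$ may be singular, making the ellipsoid degenerate):

```latex
\mathcal{F} = \{\, x \in \mathbb{R}^n : A x \le b, \;\; x^\top M x \le r \,\}.
```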
1 code implementation • 4 Apr 2014 • Zheng Wang, Ming-Jun Lai, Zhaosong Lu, Wei Fan, Hasan Davulcu, Jieping Ye
Numerical results show that our proposed algorithm is more efficient than competing algorithms while achieving similar or better prediction performance.
no code implementations • 5 Jan 2014 • Zhaosong Lu, Yong Zhang
In particular, we first introduce a class of first-order stationary points for them, and show that the first-order stationary points introduced in [11] for an SPQN regularized vector minimization problem are equivalent to those of an SPQN regularized matrix minimization reformulation.
no code implementations • 25 Jun 2013 • Zhaosong Lu, Lin Xiao
When the problem under consideration is convex, we show that the expected objective values generated by RNBPG converge to the optimal value of the problem.
no code implementations • 21 May 2013 • Zhaosong Lu, Lin Xiao
In this paper we analyze the randomized block-coordinate descent (RBCD) methods proposed in [8, 11] for minimizing the sum of a smooth convex function and a block-separable convex function.
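In symbols, the composite model analyzed is (generic notation, with $x$ partitioned into blocks $x_1, \dots, x_N$):

```latex
\min_{x \in \mathbb{R}^n} \; F(x) = f(x) + \Psi(x),
\qquad \Psi(x) = \sum_{i=1}^{N} \Psi_i(x_i),
```

with $f$ smooth convex and each $\Psi_i$ convex, so that $\Psi$ is block-separable.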
4 code implementations • 18 Mar 2013 • Pinghua Gong, Chang-Shui Zhang, Zhaosong Lu, Jianhua Huang, Jieping Ye
A commonly used approach is the Multi-Stage (MS) convex relaxation (or DC programming), which relaxes the original nonconvex problem to a sequence of convex problems.
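One common instantiation of the MS scheme is reweighted $\ell_1$: each stage solves a convex weighted-$\ell_1$ problem whose weights are derived from the previous stage's solution. A minimal sketch under that assumption (illustrative of the MS relaxation described here, not of the algorithm the paper itself proposes; all names are generic):

```python
import numpy as np

def multistage_l1(solve_weighted_l1, dpenalty, x0, lam, stages=5):
    """Multi-Stage convex relaxation for a separable nonconvex penalty
    lam * sum_j p(|x_j|): each stage solves the convex surrogate
        min_x  loss(x) + sum_j w_j |x_j|
    with weights w_j = lam * p'(|x_j^{prev}|).

    solve_weighted_l1(w): solver for the convex weighted-l1 subproblem;
    dpenalty(t): derivative p'(t) of the nonconvex penalty, t >= 0."""
    x = x0.copy()
    for _ in range(stages):
        w = lam * dpenalty(np.abs(x))   # weights from previous stage
        x = solve_weighted_l1(w)        # convex relaxation at this stage
    return x
```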
no code implementations • 10 Oct 2012 • Zhaosong Lu
In this paper we study a broad class of structured nonlinear programming (SNLP) problems.
no code implementations • 10 Sep 2012 • Sen Yang, Zhaosong Lu, Xiaotong Shen, Peter Wonka, Jieping Ye
We expect the two brain networks for NC and MCI to share common structures but not to be identical to each other; similarly for the two brain networks for MCI and AD.
no code implementations • NeurIPS 2011 • Yong Zhang, Zhaosong Lu
In this paper we consider general rank minimization problems with rank appearing in either objective function or constraint.
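In symbols, the two problem forms considered are (generic notation, with $\mathcal{C}$ a feasible set):

```latex
\min_{X \in \mathcal{C}} \; f(X) + \lambda\,\mathrm{rank}(X)
\qquad \text{or} \qquad
\min_{X \in \mathcal{C}} \; f(X) \;\; \text{s.t.} \;\; \mathrm{rank}(X) \le r .
```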