Search Results for author: Zhaosong Lu

Found 23 papers, 2 papers with code

Newton-CG methods for nonconvex unconstrained optimization with Hölder continuous Hessian

no code implementations • 22 Nov 2023 • Chuan He, Zhaosong Lu

In this paper we consider a nonconvex unconstrained optimization problem that minimizes a twice differentiable objective function with Hölder continuous Hessian.

A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization

no code implementations • 10 Jan 2023 • Chuan He, Heng Huang, Zhaosong Lu

In this paper we consider finding an approximate second-order stationary point (SOSP) of general nonconvex conic optimization that minimizes a twice differentiable function subject to nonlinear equality constraints and also a convex conic constraint.

A Newton-CG based augmented Lagrangian method for finding a second-order stationary point of nonconvex equality constrained optimization with complexity guarantees

no code implementations • 9 Jan 2023 • Chuan He, Zhaosong Lu, Ting Kei Pong

In particular, we first propose a new Newton-CG method for finding an approximate SOSP of unconstrained optimization and show that it enjoys a substantially better complexity than the Newton-CG method [56].
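The structure described here, a Newton step computed via conjugate gradient plus explicit use of negative curvature, can be illustrated with a minimal sketch. The version below substitutes a dense eigendecomposition and a regularized direct solve for the paper's capped-CG and randomized Lanczos subroutines, so it mirrors only the logic, not the complexity guarantees; all names and tolerances are illustrative.

```python
import numpy as np

def newton_cg_sosp(f, grad, hess, x0, eps_g=1e-6, eps_H=1e-4, max_iter=200):
    """Seek a point with ||grad|| <= eps_g and lambda_min(hess) >= -eps_H,
    i.e. an approximate second-order stationary point (SOSP)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        lam, V = np.linalg.eigh(H)
        if np.linalg.norm(g) <= eps_g and lam[0] >= -eps_H:
            return x  # approximate SOSP reached
        if lam[0] < -eps_H:
            d = V[:, 0]                       # negative-curvature direction
            if g @ d > 0:
                d = -d                        # orient it for descent
        else:
            # damped Newton direction from a regularized Newton system
            d = np.linalg.solve(H + eps_H * np.eye(x.size), -g)
        t = 1.0
        for _ in range(60):                   # simple backtracking line search
            if f(x + t * d) <= f(x) + 1e-4 * t * min(g @ d, 0.0):
                break
            t *= 0.5
        x = x + t * d
    return x
```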

First-order penalty methods for bilevel optimization

no code implementations • 4 Jan 2023 • Zhaosong Lu, Sanyou Mei

Under suitable assumptions, operation complexities of $O(\varepsilon^{-4}\log\varepsilon^{-1})$ and $O(\varepsilon^{-7}\log\varepsilon^{-1})$, measured by their fundamental operations, are established for the proposed penalty methods for finding an $\varepsilon$-KKT solution of the unconstrained and constrained bilevel optimization problems, respectively.

Bilevel Optimization

A Newton-CG based barrier method for finding a second-order stationary point of nonconvex conic optimization with complexity guarantees

no code implementations • 12 Jul 2022 • Chuan He, Zhaosong Lu

In this paper we consider finding an approximate second-order stationary point (SOSP) of nonconvex conic optimization that minimizes a twice differentiable function over the intersection of an affine subspace and a convex cone.

Second-order methods

Accelerated first-order methods for convex optimization with locally Lipschitz continuous gradient

no code implementations • 2 Jun 2022 • Zhaosong Lu, Sanyou Mei

In particular, we first consider unconstrained convex optimization with a locally Lipschitz continuous gradient (LLCG) and propose accelerated proximal gradient (APG) methods for solving it.
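As a rough illustration of the APG template when the gradient is only locally Lipschitz, the sketch below runs a FISTA-style accelerated step with a backtracking estimate of the local smoothness constant; the paper's actual methods differ in their stepsize rules and termination tests, and every name here is illustrative.

```python
import numpy as np

def apg(f, grad, x0, L0=1.0, tol=1e-8, max_iter=1000):
    """Accelerated gradient sketch for unconstrained smooth convex
    minimization. With only a locally Lipschitz gradient the stepsize
    cannot be fixed in advance, so the local constant L is estimated
    by backtracking at every iteration."""
    x = y = np.asarray(x0, dtype=float)
    t, L = 1.0, L0
    for _ in range(max_iter):
        g = grad(y)
        while True:  # backtrack until the local descent inequality holds
            x_new = y - g / L
            if f(x_new) <= f(y) + g @ (x_new - y) + 0.5 * L * np.sum((x_new - y) ** 2):
                break
            L *= 2.0
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # Nesterov extrapolation
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x, t = x_new, t_new
        L /= 2.0  # let the local estimate shrink again
    return x
```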

Primal-dual extrapolation methods for monotone inclusions under local Lipschitz continuity with applications to variational inequality, conic constrained saddle point, and convex conic optimization problems

no code implementations • 2 Jun 2022 • Zhaosong Lu, Sanyou Mei

In particular, we first propose a primal-dual extrapolation (PDE) method for solving a structured strongly MI problem, obtained by modifying the classical forward-backward splitting method with a point and operator extrapolation technique whose parameters are adaptively updated by a backtracking line search scheme.
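A loose sketch of one plausible reading of such a step follows: a forward-backward iteration in which both the point and the most recent operator value are extrapolated, in the spirit of Popov/optimistic-gradient schemes. This is an assumption-laden simplification, not the paper's exact update; in particular the adaptive backtracking updates of eta and theta are omitted.

```python
import numpy as np

def pde_sketch(F, prox_G, z0, eta=0.1, theta=1.0, tol=1e-8, max_iter=1000):
    """Forward-backward step with point and operator extrapolation for a
    monotone inclusion 0 in F(z) + dG(z). `F` is the single-valued operator
    and `prox_G(v, eta)` the resolvent of eta*dG; both signatures are
    assumptions made for illustration."""
    z_prev = np.asarray(z0, dtype=float)
    Fz_prev = F(z_prev)
    z = prox_G(z_prev - eta * Fz_prev, eta)   # plain forward-backward start
    for _ in range(max_iter):
        Fz = F(z)
        z_ext = z + theta * (z - z_prev)            # point extrapolation
        F_ext = Fz + theta * (Fz - Fz_prev)         # operator extrapolation
        z_new = prox_G(z_ext - eta * F_ext, eta)    # forward-backward step
        if np.linalg.norm(z_new - z) <= tol:
            return z_new
        z_prev, Fz_prev, z = z, Fz, z_new
    return z
```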

Stochastic Alternating Direction Method of Multipliers with Variance Reduction for Nonconvex Optimization

no code implementations • 10 Oct 2016 • Feihu Huang, Songcan Chen, Zhaosong Lu

Specifically, the first class, called nonconvex stochastic variance reduced gradient ADMM (SVRG-ADMM), uses a multi-stage scheme to progressively reduce the variance of the stochastic gradients.
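The multi-stage variance-reduction device here is the standard SVRG estimator: a full gradient is computed at a snapshot point, and each inner step corrects a single stochastic gradient with the snapshot information. A minimal sketch of just that estimator, with the ADMM splitting and dual updates omitted and illustrative names, follows.

```python
import numpy as np

def svrg_epoch(grads, x0, eta=0.01, epoch_len=100, rng=None):
    """One SVRG stage: `grads` is a list of per-sample gradient callables.
    The variance-reduced estimator v is unbiased and its variance shrinks
    as the iterates approach the snapshot; SVRG-ADMM plugs this estimator
    into the primal update of ADMM."""
    rng = rng or np.random.default_rng(0)
    n = len(grads)
    x_snap = np.asarray(x0, dtype=float)
    mu = sum(g(x_snap) for g in grads) / n     # full gradient at the snapshot
    x = x_snap.copy()
    for _ in range(epoch_len):
        i = rng.integers(n)
        v = grads[i](x) - grads[i](x_snap) + mu  # variance-reduced gradient
        x -= eta * v
    return x
```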

Randomized block proximal damped Newton method for composite self-concordant minimization

no code implementations • 1 Jul 2016 • Zhaosong Lu

Composite self-concordant (CSC) minimization is the cornerstone of the path-following interior point methods for solving a broad class of convex optimization problems.

Generalized Conjugate Gradient Methods for $\ell_1$ Regularized Convex Quadratic Programming with Finite Convergence

no code implementations • 24 Nov 2015 • Zhaosong Lu, Xiaojun Chen

In this paper we propose some generalized CG (GCG) methods for solving the $\ell_1$-regularized (possibly not strongly) convex QP that terminate at an optimal solution in a finite number of iterations.
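For background, the classical CG method already terminates finitely on a smooth strictly convex QP; the paper's GCG methods extend this finite-termination property to the $\ell_1$-regularized problem. A textbook CG sketch (not the paper's generalization) is shown below.

```python
import numpy as np

def conjugate_gradient(Q, c, x0=None, tol=1e-10):
    """Classical CG for min 0.5*x'Qx + c'x with Q symmetric positive
    definite, i.e. for solving Qx = -c; it terminates in at most n
    iterations in exact arithmetic."""
    n = len(c)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = Q @ x + c          # gradient of the quadratic
    p = -r
    for _ in range(n):
        if np.linalg.norm(r) <= tol:
            break
        Qp = Q @ p
        alpha = (r @ r) / (p @ Qp)
        x = x + alpha * p
        r_new = r + alpha * Qp
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p
        r = r_new
    return x
```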

Sparse Recovery via Partial Regularization: Models, Theory and Algorithms

no code implementations • 23 Nov 2015 • Zhaosong Lu, Xiaorui Li

Moreover, for a class of partial regularizers, any global minimizer of these models is a sparsest solution to the linear system.

Optimization over Sparse Symmetric Sets via a Nonmonotone Projected Gradient Method

no code implementations • 29 Sep 2015 • Zhaosong Lu

For this problem, it is known that any accumulation point of the classical projected gradient (PG) method with a constant stepsize $1/L$ satisfies the $L$-stationarity optimality condition that was introduced in [3].
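A minimal sketch of this classical PG iteration on the simplest sparse symmetric set, $\{x : \|x\|_0 \le s\}$, is given below; the projection there is hard thresholding. The paper's nonmonotone variant with variable stepsizes is not reproduced, and the names are illustrative.

```python
import numpy as np

def projected_gradient_sparse(grad, x0, s, L, max_iter=500):
    """Projected gradient with constant stepsize 1/L over {x: ||x||_0 <= s};
    accumulation points of this scheme satisfy the L-stationarity condition
    discussed in the paper."""
    def project_sparse(z):
        # Euclidean projection: keep the s entries of largest magnitude
        out = z.copy()
        out[np.argsort(np.abs(z))[:-s]] = 0.0
        return out
    x = project_sparse(np.asarray(x0, dtype=float))
    for _ in range(max_iter):
        x_new = project_sparse(x - grad(x) / L)
        if np.allclose(x_new, x):
            break
        x = x_new
    return x
```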

An Accelerated Proximal Coordinate Gradient Method

no code implementations • NeurIPS 2014 • Qihang Lin, Zhaosong Lu, Lin Xiao

We develop an accelerated randomized proximal coordinate gradient (APCG) method for solving a broad class of composite convex optimization problems.
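The basic (non-accelerated) randomized proximal coordinate gradient step underlying APCG can be sketched as follows; APCG layers Nesterov-type acceleration sequences on top of this step, which the simplified sketch omits. The callback signatures are assumptions made for illustration.

```python
import numpy as np

def rpcg(grad_i, prox_i, x0, L_coords, max_iter=1000, rng=None):
    """Randomized proximal coordinate gradient for min f(x) + sum_i psi_i(x_i).
    `grad_i(x, i)` returns the i-th partial derivative of f and
    `prox_i(i, v, t)` the proximal map of t*psi_i; each coordinate uses
    its own smoothness constant L_coords[i]."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        i = rng.integers(len(x))
        # proximal coordinate step with coordinate-wise stepsize 1/L_i
        x[i] = prox_i(i, x[i] - grad_i(x, i) / L_coords[i], 1.0 / L_coords[i])
    return x
```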

Penalty methods for a class of non-Lipschitz optimization problems

no code implementations • 9 Sep 2014 • Xiaojun Chen, Zhaosong Lu, Ting Kei Pong

We consider a class of constrained optimization problems with a possibly nonconvex non-Lipschitz objective and a convex feasible set being the intersection of a polyhedron and a possibly degenerate ellipsoid.

Orthogonal Rank-One Matrix Pursuit for Low Rank Matrix Completion

1 code implementation • 4 Apr 2014 • Zheng Wang, Ming-Jun Lai, Zhaosong Lu, Wei Fan, Hasan Davulcu, Jieping Ye

Numerical results show that our proposed algorithm is more efficient than competing algorithms while achieving similar or better prediction performance.

Low-Rank Matrix Completion
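The pursuit step of OR1MP admits a compact sketch: the top singular pair of the current residual on the observed entries supplies a new rank-one basis matrix, and all combination weights are then refit by least squares. The version below is a dense, illustrative rendering (the paper also gives an economic variant that updates only two weights per step).

```python
import numpy as np

def or1mp(M_obs, mask, rank):
    """Orthogonal rank-one matrix pursuit sketch for matrix completion.
    `M_obs` holds observed values (zeros elsewhere) and `mask` is the 0/1
    observation pattern; the returned X has rank at most `rank`."""
    X = np.zeros_like(M_obs, dtype=float)
    basis = []
    for _ in range(rank):
        R = (M_obs - X) * mask                    # residual on observed entries
        U, S, Vt = np.linalg.svd(R, full_matrices=False)
        basis.append(np.outer(U[:, 0], Vt[0]))    # new rank-one basis matrix
        # least-squares refit of all weights on the observed entries
        A = np.stack([(B * mask).ravel() for B in basis], axis=1)
        theta, *_ = np.linalg.lstsq(A, (M_obs * mask).ravel(), rcond=None)
        X = sum(t * B for t, B in zip(theta, basis))
    return X
```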

Schatten-$p$ Quasi-Norm Regularized Matrix Optimization via Iterative Reweighted Singular Value Minimization

no code implementations • 5 Jan 2014 • Zhaosong Lu, Yong Zhang

In particular, we first introduce a class of first-order stationary points for them, and show that the first-order stationary points introduced in [11] for an SPQN regularized vector minimization problem are equivalent to those of an SPQN regularized matrix minimization reformulation.

A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming

no code implementations • 25 Jun 2013 • Zhaosong Lu, Lin Xiao

When the problem under consideration is convex, we show that the expected objective values generated by RNBPG converge to the optimal value of the problem.

On the Complexity Analysis of Randomized Block-Coordinate Descent Methods

no code implementations • 21 May 2013 • Zhaosong Lu, Lin Xiao

In this paper we analyze the randomized block-coordinate descent (RBCD) methods proposed in [8, 11] for minimizing the sum of a smooth convex function and a block-separable convex function.

A General Iterative Shrinkage and Thresholding Algorithm for Non-convex Regularized Optimization Problems

4 code implementations • 18 Mar 2013 • Pinghua Gong, Chang-Shui Zhang, Zhaosong Lu, Jianhua Huang, Jieping Ye

A commonly used approach is the Multi-Stage (MS) convex relaxation (or DC programming), which relaxes the original non-convex problem to a sequence of convex problems.

Sparse Learning
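A GIST-style iteration alternates a gradient step with the (often closed-form) proximal operator of the nonconvex regularizer, backtracking until a sufficient-decrease test holds. The sketch below instantiates this for the $\ell_0$ penalty, whose proximal map is hard thresholding, and replaces the paper's nonmonotone, BB-initialized line search with plain monotone backtracking; names and constants are illustrative.

```python
import numpy as np

def gist_l0(f, grad, lam, x0, L0=1.0, eta=2.0, max_iter=500):
    """Iterative shrinkage-and-thresholding sketch for
    min f(x) + lam*||x||_0, with backtracking on the stepsize 1/L."""
    def prox_l0(z, t):
        # prox of t*lam*||.||_0: keep z_i only if 0.5*z_i^2 > t*lam
        out = z.copy()
        out[z ** 2 <= 2.0 * t * lam] = 0.0
        return out
    def F(x):  # full (nonconvex) objective
        return f(x) + lam * np.count_nonzero(x)
    x, L = np.asarray(x0, dtype=float), L0
    for _ in range(max_iter):
        g = grad(x)
        while True:  # backtrack until sufficient decrease holds
            x_new = prox_l0(x - g / L, 1.0 / L)
            if F(x_new) <= F(x) - 1e-5 * L * np.sum((x_new - x) ** 2) or L > 1e12:
                break
            L *= eta
        if np.allclose(x_new, x):
            return x_new
        x, L = x_new, max(L / eta, L0)
    return x
```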

Sequential Convex Programming Methods for A Class of Structured Nonlinear Programming

no code implementations • 10 Oct 2012 • Zhaosong Lu

In this paper we study a broad class of structured nonlinear programming (SNLP) problems.

Fused Multiple Graphical Lasso

no code implementations • 10 Sep 2012 • Sen Yang, Zhaosong Lu, Xiaotong Shen, Peter Wonka, Jieping Ye

We expect the two brain networks for NC (normal controls) and MCI (mild cognitive impairment) to share common structures but not to be identical to each other; similarly for the two brain networks for MCI and AD (Alzheimer's disease).

Penalty Decomposition Methods for Rank Minimization

no code implementations • NeurIPS 2011 • Yong Zhang, Zhaosong Lu

In this paper we consider general rank minimization problems with the rank appearing in either the objective function or the constraint.

Matrix Completion
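A penalty decomposition scheme of this kind can be sketched by introducing a copy Y of the decision matrix, penalizing $\|X - Y\|_F^2$, and alternating an X-update with a closed-form Y-update given by the truncated SVD, while the penalty parameter grows. The sketch below uses a plain gradient step for the X-block; the paper's exact block updates and stopping tests are not reproduced, and all names are illustrative.

```python
import numpy as np

def penalty_decomposition(grad_f, X0, r, rho=1.0, rho_growth=2.0,
                          outer=20, inner=50, step=0.01):
    """Penalty decomposition sketch for min f(X) s.t. rank(X) <= r, via
    min_{X,Y} f(X) + (rho/2)*||X - Y||_F^2 s.t. rank(Y) <= r."""
    X = np.asarray(X0, dtype=float)
    Y = X.copy()
    for _ in range(outer):
        for _ in range(inner):
            X = X - step * (grad_f(X) + rho * (X - Y))   # X-block: gradient step
            U, S, Vt = np.linalg.svd(X, full_matrices=False)
            S[r:] = 0.0                                  # rank-r projection
            Y = (U * S) @ Vt                             # Y-block: truncated SVD
        rho *= rho_growth                                # tighten the penalty
    return X
```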
