Search Results for author: Ya-xiang Yuan

Found 8 papers, 2 papers with code

LancBiO: dynamic Lanczos-aided bilevel optimization via Krylov subspace

1 code implementation · 4 Apr 2024 · Bin Gao, Yan Yang, Ya-xiang Yuan

As a result, the constructed subspace is able to dynamically and incrementally approximate the Hessian-inverse-vector product with less effort, and thus yields a favorable estimate of the hyper-gradient.

Bilevel Optimization
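
The central step the abstract describes, approximating a Hessian-inverse-vector product inside a Krylov subspace, can be illustrated with a plain conjugate-gradient solve, since CG iterates live in the same Krylov subspace that the Lanczos process builds. A minimal sketch, assuming a symmetric positive-definite Hessian-vector-product oracle; the names hvp, v, tol, and max_iter are illustrative, not taken from the paper:

    import numpy as np

    def cg_inverse_hvp(hvp, v, tol=1e-8, max_iter=50):
        # Approximate H^{-1} v by conjugate gradient; every iterate lies in
        # the Krylov subspace span{v, Hv, H^2 v, ...}, the same subspace the
        # Lanczos process builds incrementally.
        x = np.zeros_like(v)
        r = v.copy()                 # residual r = v - H x  (x starts at 0)
        p = r.copy()
        rs = r @ r
        for _ in range(max_iter):
            Hp = hvp(p)              # Hessian-vector product oracle
            alpha = rs / (p @ Hp)
            x += alpha * p
            r -= alpha * Hp
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x

In bilevel optimization this product enters the hyper-gradient through the inverse of the lower-level Hessian; solving it only approximately, in a subspace that grows across iterations, is the effort-saving idea the abstract alludes to.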

Linear convergence of forward-backward accelerated algorithms without knowledge of the modulus of strong convexity

no code implementations · 16 Jun 2023 · Bowen Li, Bin Shi, Ya-xiang Yuan

A significant milestone in modern gradient-based optimization was achieved with the development of Nesterov's accelerated gradient descent (NAG) method.
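
For context, the classical NAG scheme for a $\mu$-strongly convex, $L$-smooth objective takes, with step size $s \le 1/L$,

$$ x_{k+1} = y_k - s\,\nabla f(y_k), \qquad y_{k+1} = x_{k+1} + \frac{1-\sqrt{\mu s}}{1+\sqrt{\mu s}}\,\bigl(x_{k+1} - x_k\bigr). $$

The momentum coefficient depends explicitly on the modulus $\mu$, which is exactly the knowledge this paper dispenses with while still establishing linear convergence for forward-backward accelerated schemes.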

On Underdamped Nesterov's Acceleration

no code implementations · 28 Apr 2023 · Shuo Chen, Bin Shi, Ya-xiang Yuan

In this paper, based on the high-resolution differential-equation framework, we construct new Lyapunov functions for the underdamped case, motivated by the power of the time $t^{\gamma}$, or of the iteration $k^{\gamma}$, appearing in the mixed term.
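
Schematically, such a Lyapunov function pairs a $t^{\gamma}$-weighted objective gap with a mixed position-velocity term; the template below is illustrative of the construction rather than the paper's exact choice of weights:

$$ \mathcal{E}(t) = t^{\gamma}\bigl(f(X(t)) - f(x^{\star})\bigr) + \frac{1}{2}\,\bigl\| \lambda\,\bigl(X(t) - x^{\star}\bigr) + t\,\dot{X}(t) \bigr\|^{2}. $$

In discrete time the weight $t^{\gamma}$ is replaced by $k^{\gamma}$, and showing that $\mathcal{E}$ is nonincreasing along trajectories yields the convergence rate.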

Linear Convergence of ISTA and FISTA

no code implementations · 13 Dec 2022 · Bowen Li, Bin Shi, Ya-xiang Yuan

Specifically, assuming that the smooth part is strongly convex is reasonable for the least-squares model, even though the image matrix is probably ill-conditioned.
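
The model in question is $\ell_1$-regularized least squares, $\min_x \tfrac{1}{2}\|Ax-b\|^2 + \lambda\|x\|_1$, whose smooth part is strongly convex whenever $A$ has full column rank, however ill-conditioned. A minimal ISTA sketch; the names A, b, lam, and n_iter are illustrative placeholders:

    import numpy as np

    def ista(A, b, lam, n_iter=500):
        # ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
        step = 1.0 / L
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - b)           # forward (gradient) step on the smooth part
            z = x - step * grad
            x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-thresholding prox
        return x

FISTA adds a momentum extrapolation between iterations; on the abstract's reading, strong convexity of the smooth part is what drives linear convergence, with ill-conditioning affecting only the rate constant.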

Proximal Subgradient Norm Minimization of ISTA and FISTA

no code implementations · 3 Nov 2022 · Bowen Li, Bin Shi, Ya-xiang Yuan

We apply a tighter inequality discovered in the well-constructed Lyapunov function and obtain proximal subgradient norm minimization via the phase-space representation, for both the gradient-correction and the implicit-velocity schemes.
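
The quantity being minimized can be read as the composite gradient mapping, a standard proxy for the gradient in proximal methods (this is the usual definition, assumed here rather than quoted from the paper): for step size $s$ and composite objective $f + g$ with $g$ nonsmooth,

$$ G_s(x) \;=\; \frac{x - \operatorname{prox}_{s g}\bigl(x - s\,\nabla f(x)\bigr)}{s}, $$

which reduces to $\nabla f(x)$ when the nonsmooth part $g$ vanishes.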

Gradient Norm Minimization of Nesterov Acceleration: $o(1/k^3)$

no code implementations · 19 Sep 2022 · Shuo Chen, Bin Shi, Ya-xiang Yuan

In the history of first-order algorithms, Nesterov's accelerated gradient descent (NAG) is one of the milestones.
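
Read schematically, the title's rate concerns the decay of the squared gradient norm along the NAG iterates for an $L$-smooth convex $f$:

$$ \min_{0 \le i \le k} \|\nabla f(x_i)\|^{2} \;=\; o\!\left(\frac{1}{k^{3}}\right), $$

sharpening the $O(1/k^{3})$ bounds previously derived from the high-resolution differential-equation framework.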


Parallelizable Algorithms for Optimization Problems with Orthogonality Constraints

1 code implementation · 9 Oct 2018 · Bin Gao, Xin Liu, Ya-xiang Yuan

Numerical experiments in serial illustrate that the novel updating rule for the Lagrangian multipliers significantly accelerates the convergence of PLAM and makes it comparable with existing feasible solvers for optimization problems with orthogonality constraints. Moreover, the performance of PCAL does not rely heavily on the choice of the penalty parameter.

Optimization and Control · MSC: 15A18, 65F15, 65K05, 90C06
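
The underlying problem class and the augmented Lagrangian on which PLAM and PCAL are built can be stated as follows (a standard formulation; the specific multiplier and penalty updates are the paper's contribution):

$$ \min_{X \in \mathbb{R}^{n \times p}} f(X) \quad \text{s.t.} \quad X^{\top} X = I_p, \qquad \mathcal{L}_{\beta}(X, \Lambda) = f(X) - \bigl\langle \Lambda,\, X^{\top} X - I_p \bigr\rangle + \frac{\beta}{2}\,\bigl\| X^{\top} X - I_p \bigr\|_{\mathrm{F}}^{2}. $$

Avoiding an explicit orthonormalization step at each iteration is what makes the updates parallelizable, and PCAL's reported insensitivity to the penalty parameter $\beta$ removes the main tuning burden.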
