Search Results for author: Zhiqiang Xu

Found 6 papers, 0 papers with code

A Comprehensively Tight Analysis of Gradient Descent for PCA

no code implementations · NeurIPS 2021 · Zhiqiang Xu, Ping Li

We further give the first worst-case analysis achieving a convergence rate of $O(\frac{1}{\epsilon}\log\frac{1}{\epsilon})$.
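The object of study is gradient descent (ascent) on the Rayleigh quotient for the leading principal component. A minimal sketch of that iteration, assuming a fixed step size and normalization back to the unit sphere (this is a generic scheme, not the paper's exact analysis setting):

```python
import numpy as np

def gd_pca(A, eta=0.1, iters=500, seed=0):
    """Gradient ascent on the Rayleigh quotient w^T A w for the top
    eigenvector of a symmetric matrix A, with renormalization."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(A.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(iters):
        w = w + eta * (A @ w)   # gradient step (constant 2 folded into eta)
        w /= np.linalg.norm(w)  # retract back to the unit sphere
    return w
```

Note that with normalization this iteration is power iteration on $I + \eta A$, so it converges to the same leading eigenvector whenever the eigengap is positive.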

Towards Better Generalization of Adaptive Gradient Methods

no code implementations · NeurIPS 2020 · Yingxue Zhou, Belhal Karimi, Jinxing Yu, Zhiqiang Xu, Ping Li

Adaptive gradient methods such as AdaGrad, RMSprop, and Adam have been the optimizers of choice for deep learning due to their fast training speed.
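For reference, the Adam update that these methods build on tracks exponential moving averages of the gradient and its square, with bias correction. A minimal single-step sketch (standard Adam, not the paper's proposed variant):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: first/second moment estimates plus bias correction.
    t is the 1-based step counter."""
    m = b1 * m + (1 - b1) * grad          # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - b1 ** t)             # bias correction
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

AdaGrad and RMSprop differ only in the second-moment term: AdaGrad accumulates squared gradients without decay, RMSprop uses the decayed average but no first-moment correction.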

Towards Practical Alternating Least-Squares for CCA

no code implementations · NeurIPS 2019 · Zhiqiang Xu, Ping Li

To promote the practical use of ALS for CCA, we propose truly alternating least-squares.
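The ALS baseline being improved here alternates two least-squares regressions: fix one side's projection, regress the other view onto it, and renormalize. A minimal sketch for the top canonical pair (a textbook ALS formulation, not the paper's "truly alternating" variant):

```python
import numpy as np

def als_cca(X, Y, iters=100, seed=0):
    """Alternating least-squares for the top canonical correlation pair
    of data matrices X (n x dx) and Y (n x dy)."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(Y.shape[1])
    for _ in range(iters):
        # regress Y's current projection onto X, then normalize in the
        # X-covariance metric (||X u|| = 1), and symmetrically for v
        u, *_ = np.linalg.lstsq(X, Y @ v, rcond=None)
        u /= np.linalg.norm(X @ u)
        v, *_ = np.linalg.lstsq(Y, X @ u, rcond=None)
        v /= np.linalg.norm(Y @ v)
    return u, v
```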

Gradient Descent Meets Shift-and-Invert Preconditioning for Eigenvector Computation

no code implementations · NeurIPS 2018 · Zhiqiang Xu

Shift-and-invert preconditioning, as a classic acceleration technique for the leading eigenvector computation, has received much attention again recently, owing to fast least-squares solvers for efficiently approximating matrix inversions in power iterations.
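The classic scheme referred to is power iteration on $(\sigma I - A)^{-1}$: a shift $\sigma$ just above the top eigenvalue turns a small eigengap of $A$ into a large one for the inverted operator. A minimal sketch using an exact solve where the paper's setting would use an approximate least-squares solver:

```python
import numpy as np

def shift_invert_power(A, sigma, iters=50, seed=0):
    """Power iteration on (sigma*I - A)^{-1} for the top eigenvector of
    symmetric A. sigma must strictly upper-bound the largest eigenvalue."""
    n = A.shape[0]
    M = sigma * np.eye(n) - A
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(n)
    w /= np.linalg.norm(w)
    for _ in range(iters):
        w = np.linalg.solve(M, w)   # in practice: approximate least-squares solve
        w /= np.linalg.norm(w)
    return w
```

The acceleration comes from the spectrum of $(\sigma I - A)^{-1}$: eigenvalues near $\sigma$ are mapped to very large values, so the relevant eigengap ratio improves dramatically.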

Stochastic Variance Reduced Riemannian Eigensolver

no code implementations · 26 May 2016 · Zhiqiang Xu, Yiping Ke

We generalize stochastic variance reduction to Riemannian manifolds and apply it to the non-convex eigen-decomposition problem.
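The idea combines SVRG-style variance reduction (stochastic gradient minus its snapshot value plus the full snapshot gradient) with sphere-constrained optimization (project onto the tangent space, retract by normalization). A simplified sketch for the top eigenvector of a sample covariance, omitting the vector-transport details a faithful Riemannian SVRG would need:

```python
import numpy as np

def riemannian_svrg_eig(X, epochs=30, m=100, eta=0.02, seed=0):
    """SVRG-style ascent on the sphere for the top eigenvector of
    C = X^T X / n, where X is an (n x d) data matrix."""
    n, d = X.shape
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        w_snap = w.copy()
        full = X.T @ (X @ w_snap) / n          # full gradient C w at the snapshot
        for _ in range(m):
            i = rng.integers(n)
            gi = X[i] * (X[i] @ w)             # stochastic gradient at w
            gi_snap = X[i] * (X[i] @ w_snap)   # same sample at the snapshot
            g = gi - gi_snap + full            # variance-reduced estimate of C w
            g = g - (w @ g) * w                # project onto the tangent space at w
            w = w + eta * g                    # ascent step on w^T C w
            w /= np.linalg.norm(w)             # retraction back to the sphere
    return w
```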

The performance of orthogonal multi-matching pursuit under RIP

no code implementations · 19 Oct 2012 · Zhiqiang Xu

In particular, for $M=s^a$ with $a\in [0, 1/2]$, OMMP(M) can recover slowly-decaying $s$-sparse signal within $O(s^{1-a})$ iterations.
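OMMP(M) generalizes orthogonal matching pursuit by selecting the $M$ atoms most correlated with the residual per iteration instead of one, then refitting by least-squares on the enlarged support; picking $M$ atoms per round is what shrinks the iteration count from $O(s)$ toward $O(s^{1-a})$. A minimal sketch (generic OMMP, not tied to the paper's RIP conditions):

```python
import numpy as np

def ommp(A, y, s, M=2, tol=1e-10):
    """OMMP(M): add the M columns of A most correlated with the residual
    each iteration, then refit by least-squares on the support.
    M = 1 recovers plain OMP; the support may grow to M*s atoms here."""
    support = []
    r = y.astype(float).copy()
    x = np.zeros(A.shape[1])
    for _ in range(s):                       # iteration budget
        if np.linalg.norm(r) <= tol:
            break
        corr = np.abs(A.T @ r)
        corr[support] = -1.0                 # exclude already-chosen atoms
        picks = np.argsort(corr)[-M:]
        support.extend(int(i) for i in picks)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef
        x = np.zeros(A.shape[1])
        x[support] = coef
    return x
```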
