Search Results for author: Huanran Lu

Found 3 papers, 0 papers with code

Tighten after Relax: Minimax-Optimal Sparse PCA in Polynomial Time

no code implementations · NeurIPS 2014 · Zhaoran Wang, Huanran Lu, Han Liu

In this paper, we propose a two-stage sparse PCA procedure that attains the optimal principal subspace estimator in polynomial time.

Nonconvex Statistical Optimization: Minimax-Optimal Sparse PCA in Polynomial Time

no code implementations · 22 Aug 2014 · Zhaoran Wang, Huanran Lu, Han Liu

To optimally estimate sparse principal subspaces, we propose a two-stage computational framework named "tighten after relax": within the "relax" stage, we approximately solve a convex relaxation of sparse PCA with early stopping to obtain a desired initial estimator; for the "tighten" stage, we propose a novel algorithm called sparse orthogonal iteration pursuit (SOAP), which iteratively refines the initial estimator by directly solving the underlying nonconvex problem.
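The abstract describes the two stages only in outline. Below is a minimal NumPy sketch of what a "tighten"-style refinement in the spirit of SOAP might look like, assuming the relax stage has already produced a column-orthonormal initial estimator `U0`, along with a sample covariance `Sigma_hat`, a target subspace dimension implied by the shape of `U0`, and a sparsity level `s` (assumed at least the subspace dimension). The function names, the single row-truncation step per iteration, and the fixed iteration count are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def truncate_rows(V, s):
    """Keep the s rows of V with the largest Euclidean norms; zero out the rest."""
    norms = np.linalg.norm(V, axis=1)
    keep = np.argsort(norms)[-s:]      # indices of the s largest row norms
    out = np.zeros_like(V)
    out[keep] = V[keep]
    return out

def sparse_orthogonal_iteration(Sigma_hat, U0, s, n_iter=100):
    """Hypothetical sketch of a 'tighten'-stage refinement: orthogonal (power)
    iteration on the sample covariance, interleaved with row truncation to
    enforce row sparsity of the estimated principal subspace basis."""
    U = U0
    for _ in range(n_iter):
        V = Sigma_hat @ U              # multiply the current basis by the sample covariance
        V = truncate_rows(V, s)        # hard-threshold to the s most energetic rows
        U, _ = np.linalg.qr(V)         # re-orthonormalize the basis (thin QR)
    return U
```

In this sketch the sparsity pattern is re-selected at every iteration, which mirrors the idea of directly attacking the nonconvex problem rather than relying on a single convex relaxation.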

A Direct Estimation of High Dimensional Stationary Vector Autoregressions

no code implementations · 1 Jul 2013 · Fang Han, Huanran Lu, Han Liu

In addition, we provide thorough experiments on both synthetic and real-world equity data to demonstrate the empirical advantages of our method over lasso-type estimators in both parameter estimation and forecasting.

Time Series · Time Series Analysis · +1
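For context on the comparison mentioned in the snippet, here is a minimal sketch of the kind of lasso-type VAR(1) estimator the paper is contrasted with: a row-by-row lasso regression of X_t on X_{t-1}. The function name `lasso_var1`, the regularization weight `alpha`, and the use of scikit-learn's `Lasso` are illustrative choices, not details taken from the paper; the paper's own "direct" estimator is a different procedure.

```python
import numpy as np
from sklearn.linear_model import Lasso

def lasso_var1(X, alpha=0.1):
    """Illustrative lasso-type baseline for a VAR(1) model X_t = A X_{t-1} + noise:
    each row of the transition matrix A is estimated by a separate lasso regression
    of one coordinate of X_t on the full lagged vector X_{t-1}."""
    T, d = X.shape
    Z, Y = X[:-1], X[1:]               # lagged predictors and one-step-ahead responses
    A = np.zeros((d, d))
    for j in range(d):
        model = Lasso(alpha=alpha, fit_intercept=False).fit(Z, Y[:, j])
        A[j] = model.coef_
    return A

# One-step-ahead forecast from the fitted transition matrix:
# x_next_hat = lasso_var1(X) @ X[-1]
```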
