no code implementations • 8 Apr 2023 • Chanwoo Lee, Miaoyan Wang
We find that high-dimensional latent variable tensors are approximately of log-rank, which explains the pervasiveness of low-rank tensors in applications.
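A quick numerical illustration of why latent variable structure forces low effective rank. This is a sketch under assumptions of ours, not the paper's construction: we pick a smooth logistic link and uniform latent variables, form the induced matrix, and check that its numerical rank is far below its dimension.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 200

# Latent variables for rows and columns (assumption: uniform on [0, 1]).
x = rng.uniform(size=d)
y = rng.uniform(size=d)

# Entries generated by a smooth link of the latent variables
# (hypothetical choice: logistic function of x_i + y_j).
M = 1.0 / (1.0 + np.exp(-np.add.outer(x, y)))

# Numerical (epsilon-)rank: count singular values above a small
# relative threshold.
s = np.linalg.svd(M, compute_uv=False)
eps_rank = int(np.sum(s > 1e-6 * s[0]))
print(eps_rank)  # far smaller than d = 200
```

Because the link function is analytic, the singular values decay rapidly, so the epsilon-rank grows only slowly with the accuracy demanded, consistent with the log-rank phenomenon described above.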
no code implementations • 7 Mar 2023 • Chanwoo Lee
We address the problem of sufficient dimension reduction for feature matrices, which often arises in sensor network localization, brain neuroimaging, and electroencephalography analysis.
no code implementations • 8 Nov 2021 • Chanwoo Lee, Miaoyan Wang
A phase transition phenomenon is revealed with respect to the smoothness threshold needed for optimal recovery.
no code implementations • 4 May 2021 • Chanwoo Lee, Lexin Li, Hao Helen Zhang, Miaoyan Wang
Trace regression is a widely used method for modeling the effects of matrix predictors and has shown great success in matrix learning.
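A minimal sketch of the trace regression model this entry refers to: each response is the trace inner product of a matrix predictor with an unknown coefficient matrix. The dimensions, the rank-2 coefficient, and the plain vectorized least-squares fit are illustrative choices of ours; the paper's interest lies in structured estimators, which this sketch does not implement.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, r = 500, 10, 2

# Low-rank coefficient matrix (assumption: rank 2 for illustration).
B = rng.normal(size=(d, r)) @ rng.normal(size=(r, d))

# Matrix predictors and noiseless responses y_i = <X_i, B> = tr(X_i^T B).
Xs = rng.normal(size=(n, d, d))
y = np.einsum('nij,ij->n', Xs, B)

# Naive estimator: vectorize the predictors and solve least squares,
# ignoring the low-rank structure (n >= d^2, so recovery is exact here).
B_hat = np.linalg.lstsq(Xs.reshape(n, -1), y, rcond=None)[0].reshape(d, d)
```

With noise or with n below d^2, this unstructured estimator breaks down, which is exactly where low-rank-aware trace regression methods come in.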
no code implementations • NeurIPS 2021 • Chanwoo Lee, Miaoyan Wang
A nonparametric approach to tensor completion is developed based on a new model that we coin sign representable tensors.
no code implementations • ICML 2020 • Chanwoo Lee, Miaoyan Wang
Higher-order tensors arise frequently in applications such as neuroimaging, recommendation systems, social network analysis, and psychological studies.
no code implementations • 21 Oct 2019 • Jiaxin Hu, Chanwoo Lee, Miaoyan Wang
Here, we develop a tensor decomposition method that incorporates multiple feature matrices as side information.
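To illustrate the idea of incorporating a feature matrix as side information, here is a toy alternating-least-squares sketch for a CP decomposition in which the mode-1 factor is constrained to the column span of a feature matrix X. All sizes, the noiseless setup, and the plain ALS updates are assumptions of ours for illustration, not the paper's algorithm.

```python
import numpy as np

def khatri_rao(B, C):
    """Column-wise Kronecker product of B (d1 x r) and C (d2 x r)."""
    return np.einsum('ir,jr->ijr', B, C).reshape(-1, B.shape[1])

rng = np.random.default_rng(1)
d, p, r = 20, 5, 3

# Side-information feature matrix X for mode 1; the true mode-1 factor
# is assumed to lie in the column span of X (illustrative setup).
X = rng.normal(size=(d, p))
A = X @ rng.normal(size=(p, r))
B = rng.normal(size=(d, r))
C = rng.normal(size=(d, r))
T = np.einsum('ir,jr,kr->ijk', A, B, C)  # noiseless rank-r tensor

# ALS sketch: estimate W (so A_h = X @ W) plus the free factors B_h, C_h.
A_h = X @ rng.normal(size=(p, r))
B_h = rng.normal(size=(d, r))
C_h = rng.normal(size=(d, r))
for _ in range(200):
    # Mode-1 unfolding satisfies T1 = A @ khatri_rao(B, C).T, so regress
    # the unconstrained factor onto X to enforce the side information.
    T1 = T.reshape(d, -1)
    target = T1 @ np.linalg.pinv(khatri_rao(B_h, C_h).T)
    W_h = np.linalg.lstsq(X, target, rcond=None)[0]
    A_h = X @ W_h
    # Standard ALS updates for the unconstrained modes 2 and 3.
    T2 = T.transpose(1, 0, 2).reshape(d, -1)
    B_h = T2 @ np.linalg.pinv(khatri_rao(A_h, C_h).T)
    T3 = T.transpose(2, 0, 1).reshape(d, -1)
    C_h = T3 @ np.linalg.pinv(khatri_rao(A_h, B_h).T)

T_hat = np.einsum('ir,jr,kr->ijk', A_h, B_h, C_h)
rel_err = np.linalg.norm(T_hat - T) / np.linalg.norm(T)
```

Constraining the mode-1 factor to span(X) reduces the number of free parameters from d*r to p*r for that mode, which is the basic statistical benefit of side information in this kind of decomposition.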